---
language:
- nep
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:1046
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: jangedoo/all-MiniLM-L6-v2-nepali
widget:
- source_sentence: राहदानीको लागि कागजात सत्यापनमा कस्तो मनोनयनपत्र चाहिन्छ?
  sentences:
  - सिम्यान्स अभिलेख किताबको लागि निवेदन फाराम अनुसूची-२क बमोजिमको ढाँचामा आधारित हुन्छ।
  - कुटनीतिक वा विशेष राहदानीको लागि कागजात सत्यापनमा सम्बन्धित पदमा नियुक्तिको मनोनयनपत्रको प्रमाणित प्रतिलिपि चाहिन्छ।
  - राहदानी रद्द गर्न महानिर्देशकले स्वीकृति दिन्छ।
- source_sentence: राहदानी वितरणमा त्रुटि सच्याउन कति समय लाग्छ?
  sentences:
  - राहदानी नियमावली, २०७७ मा अभिलेखको गोपनीयताको उल्लङ्घनको जाँचको नतिजाको अपीलको नतिजाको कार्यान्वयनको अभिलेख बाह्र वर्षसम्म राखिन्छ।
  - राहदानी वितरणमा त्रुटि सच्याउन सामान्यतः सात कार्यदिन लाग्छ, तर प्रक्रिया जटिल भएमा बढी समय लाग्न सक्छ।
  - राहदानीको लागि निवेदनमा जाँच गर्ने अधिकारीको नाम, सही, पद, र मिति उल्लेख गर्नुपर्छ।
- source_sentence: राहदानीको लागि निवेदनमा कस्तो आवेदन स्रोत उल्लेख गर्नुपर्छ?
  sentences:
  - राहदानीको लागि निवेदनमा आवेदन स्रोत (विभाग, जिल्ला, वा नियोग) उल्लेख गर्नुपर्छ।
  - राहदानी बुझाउने प्रक्रियामा त्रुटि सच्याउन सामान्यतः सात कार्यदिन लाग्छ, तर प्रक्रिया जटिल भएमा बढी समय लाग्न सक्छ।
  - राहदानीको लिए अनलाइन निवेदनमा निकटतम व्यक्तिसँगको सम्बन्ध (Relationship) उल्लेख गर्नुपर्छ।
- source_sentence: विशेष राहदानी कसलाई जारी गरिन्छ?
  sentences:
  - राहदानी रद्द गर्न बाहक वा सम्बन्धित निकायको लिखित निवेदन चाहिन्छ।
  - राहदानी नियमावली, २०७७ मा अभिलेखको गोपनीयताको उल्लङ्घनको जाँचको नतिजाको अपीलको लागि जाँच गर्ने अधिकारीको नाम, सही, पद, र मिति उल्लेख गर्नुपर्छ।
  - विशेष राहदानी नगरपालिकाका प्रमुख, सहसचिव, जिल्ला न्यायाधीश, प्रदेश लोकसेवा आयोगका सदस्य, लगायतका पदाधिकारीलाई जारी गरिन्छ।
- source_sentence: कुटनीतिक राहदानीको लागि निवेदनमा कस्तो ठेगाना विवरण चाहिन्छ?
  sentences:
  - कुटनीतिक राहदानीको लागि निवेदनमा जिल्ला, गाउँ/नगरपालिका, वडा नम्बर, गाउँ/सडक, र घर नम्बरको ठेगाना विवरण चाहिन्छ।
  - राहदानीको लागि कागजात धुल्याउने प्रक्रिया महानिर्देशकको स्वीकृतिमा हुन्छ।
  - राहदानीको विद्युतीय अभिलेख अनुसूची-७ बमोजिमको ढाँचामा आधारित हुन्छ।
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: sentenceTransformer_nepali_embedding
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 384
      type: dim_384
    metrics:
    - type: cosine_accuracy@1
      value: 0.41025641025641024
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.6581196581196581
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.7350427350427351
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.8461538461538461
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.41025641025641024
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.21937321937321935
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.14700854700854699
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.0846153846153846
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.41025641025641024
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.6581196581196581
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.7350427350427351
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.8461538461538461
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.6218282635615644
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.5504409171075837
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.5571750406212126
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 256
      type: dim_256
    metrics:
    - type: cosine_accuracy@1
      value: 0.42735042735042733
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.6410256410256411
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.717948717948718
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.8290598290598291
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.42735042735042733
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.21367521367521364
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.14358974358974358
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.08290598290598289
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.42735042735042733
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.6410256410256411
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.717948717948718
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.8290598290598291
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.6159996592171239
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.5487959571292905
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.5563599760664051
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 128
      type: dim_128
    metrics:
    - type: cosine_accuracy@1
      value: 0.39316239316239315
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5811965811965812
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6752136752136753
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.8034188034188035
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.39316239316239315
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.19373219373219372
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.135042735042735
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.08034188034188033
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.39316239316239315
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.5811965811965812
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.6752136752136753
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.8034188034188035
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5799237272193319
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.5100054266720935
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.5176470843483384
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 64
      type: dim_64
    metrics:
    - type: cosine_accuracy@1
      value: 0.38461538461538464
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5811965811965812
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6410256410256411
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.7606837606837606
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.38461538461538464
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.1937321937321937
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.12820512820512817
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.07606837606837605
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.38461538461538464
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.5811965811965812
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.6410256410256411
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.7606837606837606
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.565217766093051
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.5036663953330621
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.5140223584530523
      name: Cosine Map@100
---
# sentenceTransformer_nepali_embedding
This is a sentence-transformers model finetuned from jangedoo/all-MiniLM-L6-v2-nepali on the json dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Usage

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("ritesh-07/fine_tuned_model_02")

# Run inference
sentences = [
    'कुटनीतिक राहदानीको लागि निवेदनमा कस्तो ठेगाना विवरण चाहिन्छ?',
    'कुटनीतिक राहदानीको लागि निवेदनमा जिल्ला, गाउँ/नगरपालिका, वडा नम्बर, गाउँ/सडक, र घर नम्बरको ठेगाना विवरण चाहिन्छ।',
    'राहदानीको लागि कागजात धुल्याउने प्रक्रिया महानिर्देशकको स्वीकृतिमा हुन्छ।',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
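Because the model was trained with MatryoshkaLoss (and evaluated at 384, 256, 128, and 64 dimensions), its embeddings can be truncated to a prefix and re-normalized with only a modest quality drop. A minimal sketch of that trick in plain Python — the `truncate_and_normalize` and `cosine` helpers and the toy vectors are illustrative, not part of the sentence-transformers API:

```python
import math

def truncate_and_normalize(vec, dim):
    """Keep only the first `dim` components and rescale to unit length,
    so cosine similarity reduces to a plain dot product."""
    head = vec[:dim]
    norm = math.sqrt(sum(x * x for x in head))
    return [x / norm for x in head]

def cosine(a, b):
    """Dot product of two equal-length unit-norm vectors."""
    return sum(x * y for x, y in zip(a, b))

# Toy stand-ins for two 384-dim embeddings from model.encode(...).
emb_a = [1.0 if i % 2 == 0 else 0.5 for i in range(384)]
emb_b = [0.5 if i % 2 == 0 else 1.0 for i in range(384)]

a64 = truncate_and_normalize(emb_a, 64)
b64 = truncate_and_normalize(emb_b, 64)
print(len(a64), round(cosine(a64, b64), 4))  # 64 0.8
```

Recent sentence-transformers releases can do this for you: if your version supports it, passing a `truncate_dim` argument to the `SentenceTransformer` constructor makes `encode` return the truncated embeddings directly.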
## Citation

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss

```bibtex
@misc{kusupati2024matryoshka,
    title = {Matryoshka Representation Learning},
    author = {Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year = {2024},
    eprint = {2205.13147},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG}
}
```
#### MultipleNegativesRankingLoss

```bibtex
@misc{henderson2017efficient,
    title = {Efficient Natural Language Response Suggestion for Smart Reply},
    author = {Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year = {2017},
    eprint = {1705.00652},
    archivePrefix = {arXiv},
    primaryClass = {cs.CL}
}
```
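MultipleNegativesRankingLoss treats each (query, positive) pair in a batch as a classification problem: for row *i* of the similarity matrix, column *i* is the positive and every other column in the batch acts as a negative, with cross-entropy applied over the scores. A minimal pure-Python sketch of that computation — the `mnr_loss` helper is illustrative only, and the real loss additionally multiplies similarities by a scale factor:

```python
import math

def mnr_loss(sim_matrix):
    """In-batch cross-entropy: for row i, column i is the positive pair
    and every other column serves as a negative."""
    total = 0.0
    for i, row in enumerate(sim_matrix):
        log_denom = math.log(sum(math.exp(s) for s in row))
        total += log_denom - row[i]  # -log softmax of the positive
    return total / len(sim_matrix)

# Well-separated batch: positives score high, negatives low -> near-zero loss.
good = [[9.0, 0.0, 0.0],
        [0.0, 9.0, 0.0],
        [0.0, 0.0, 9.0]]
# Uninformative batch: all scores equal -> loss equals log(batch_size).
flat = [[1.0, 1.0, 1.0]] * 3
print(round(mnr_loss(good), 4), round(mnr_loss(flat), 4))
```

This is why larger batches tend to help with this loss: each extra pair in the batch contributes one more in-batch negative per query.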