Korean sentence embedding repository. You can download the pre-trained models and run inference right away; it also provides an environment in which you can train models yourself.
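The pre-trained encoders map a sentence to a fixed-size vector, and semantic similarity between two sentences is then scored with cosine similarity between their vectors. A minimal NumPy sketch of that scoring step (the toy 4-dimensional vectors below stand in for the roughly 768-dimensional embeddings a RoBERTa-based encoder would produce):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dim "embeddings"; a real model emits much higher-dimensional vectors.
emb_a = np.array([0.1, 0.3, -0.2, 0.8])
emb_b = np.array([0.1, 0.25, -0.1, 0.9])   # close to emb_a: similar sentences
emb_c = np.array([-0.7, 0.1, 0.6, -0.2])   # far from emb_a: dissimilar sentences

print(cosine_similarity(emb_a, emb_b))  # close to 1.0
print(cosine_similarity(emb_a, emb_c))  # much lower
```

In practice the vectors would come from the pre-trained model (for example via the Hugging Face `transformers` `AutoModel`/`AutoTokenizer` classes); only the scoring step is shown here.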
@misc{park2021klue,
  title={{KLUE}: Korean Language Understanding Evaluation},
  author={Sungjoon Park and Jihyung Moon and Sungdong Kim and Won Ik Cho and Jiyoon Han and Jangwon Park and Chisung Song and Junseong Kim and Yongsook Song and Taehwan Oh and Joohong Lee and Juhyun Oh and Sungwon Lyu and Younghoon Jeong and Inkwon Lee and Sangwoo Seo and Dongjun Lee and Hyunwoo Kim and Myeonghwa Lee and Seongbo Jang and Seungwon Do and Sunkyoung Kim and Kyungtae Lim and Jongwon Lee and Kyumin Park and Jamin Shin and Seonghyun Kim and Lucy Park and Alice Oh and Jung-Woo Ha and Kyunghyun Cho},
  year={2021},
  eprint={2105.09680},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
@inproceedings{gao2021simcse,
  title={{SimCSE}: Simple Contrastive Learning of Sentence Embeddings},
  author={Gao, Tianyu and Yao, Xingcheng and Chen, Danqi},
  booktitle={Empirical Methods in Natural Language Processing (EMNLP)},
  year={2021}
}
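The unsupervised SimCSE objective cited above feeds each sentence through the encoder twice; because the dropout masks differ between the two passes, the two resulting embeddings form a positive pair, while the other sentences in the batch serve as in-batch negatives under an InfoNCE (contrastive cross-entropy) loss. A NumPy sketch of that loss, with random vectors simulating the two dropout views (the temperature of 0.05 follows the SimCSE paper):

```python
import numpy as np

def simcse_infonce_loss(z1: np.ndarray, z2: np.ndarray, temp: float = 0.05) -> float:
    """InfoNCE loss for a batch of positive pairs (z1[i], z2[i]).

    z1, z2: (batch, dim) embeddings of the same sentences under two
    different dropout masks; pairs (i, j) with i != j act as negatives.
    """
    # L2-normalize so dot products are cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temp  # (batch, batch) scaled similarity matrix
    # Cross-entropy with the diagonal (each row's true pair) as the target.
    log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

rng = np.random.default_rng(0)
base = rng.normal(size=(8, 16))
view1 = base + 0.01 * rng.normal(size=base.shape)  # simulated dropout view 1
view2 = base + 0.01 * rng.normal(size=base.shape)  # simulated dropout view 2
loss = simcse_infonce_loss(view1, view2)
print(loss)  # small: each row's highest similarity is its own pair
```

Because the two views of each sentence are nearly identical, the diagonal dominates every row of the similarity matrix and the loss stays close to zero; misaligned pairs would drive it up.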
@article{ham2020kornli,
  title={{KorNLI} and {KorSTS}: New Benchmark Datasets for Korean Natural Language Understanding},
  author={Ham, Jiyeon and Choe, Yo Joong and Park, Kyubyong and Choi, Ilji and Soh, Hyungjoon},
  journal={arXiv preprint arXiv:2004.03289},
  year={2020}
}
@inproceedings{reimers-2019-sentence-bert,
  title={{Sentence-BERT}: Sentence Embeddings using Siamese {BERT}-Networks},
  author={Reimers, Nils and Gurevych, Iryna},
  booktitle={Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing},
  month={11},
  year={2019},
  publisher={Association for Computational Linguistics},
  url={http://arxiv.org/abs/1908.10084}
}
@inproceedings{chuang2022diffcse,
  title={{DiffCSE}: Difference-based Contrastive Learning for Sentence Embeddings},
  author={Chuang, Yung-Sung and Dangovski, Rumen and Luo, Hongyin and Zhang, Yang and Chang, Shiyu and Soljacic, Marin and Li, Shang-Wen and Yih, Wen-tau and Kim, Yoon and Glass, James},
  booktitle={Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL)},
  year={2022}
}
More Information About KoSimCSE-Unsup-RoBERTa
KoSimCSE-Unsup-RoBERTa (BM-K/KoSimCSE-Unsup-RoBERTa) is hosted on huggingface.co, which offers a free online trial of the model as well as API access, callable from Node.js, Python, or plain HTTP. The model is also open source: it can be found on GitHub and installed locally, or used directly on huggingface.co for debugging and experimentation.