CINO: Pre-trained Language Models for Chinese Minority Languages (中国少数民族预训练模型)
Multilingual pre-trained language models (PLMs), such as mBERT and XLM-R, provide multilingual and cross-lingual abilities for language understanding.
We have seen rapid progress in building multilingual PLMs in recent years.
However, there is a lack of work on building PLMs for Chinese minority languages, which hinders researchers from building powerful NLP systems for them.
To address the absence of Chinese minority PLMs, the Joint Laboratory of HIT and iFLYTEK Research (HFL) proposes CINO (Chinese-miNOrity pre-trained language model), which is built on XLM-R with additional pre-training on Chinese minority-language corpora, covering Standard Chinese (zh), Tibetan (bo), Mongolian in the Uighur script (mn), Uyghur (ug), Kazakh in the Arabic script (kk), Korean (ko), Zhuang, and Cantonese (yue).
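Because CINO keeps the XLM-R architecture, the released checkpoint can be loaded with the standard XLM-RoBERTa classes in Hugging Face Transformers. The snippet below is a minimal sketch, assuming the hfl/cino-large-v2 checkpoint and a local installation of transformers and torch; it only extracts a sentence-level representation and is not an official fine-tuning recipe.

```python
# Minimal sketch: loading CINO with Hugging Face Transformers.
# Assumes the hfl/cino-large-v2 checkpoint; since CINO is built on XLM-R,
# the XLM-RoBERTa tokenizer/model classes are used here.
import torch
from transformers import XLMRobertaTokenizer, XLMRobertaModel

tokenizer = XLMRobertaTokenizer.from_pretrained("hfl/cino-large-v2")
model = XLMRobertaModel.from_pretrained("hfl/cino-large-v2")
model.eval()

# Encode one sentence (in any of the covered languages) and take the hidden
# state at the first position as a simple sentence representation.
text = "你好，世界"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
sentence_vec = outputs.last_hidden_state[:, 0]  # shape: (1, hidden_size)
print(sentence_vec.shape)
```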
The pre-trained model is released as hfl/cino-large-v2 on huggingface.co, where it can be downloaded and used directly, or queried through the hosted inference API (for example from Python, Node.js, or plain HTTP). The CINO project is also open-sourced on GitHub.
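For API access, the hosted Inference API can be called over HTTP. The sketch below is an assumption-laden illustration, not part of the original model card: it presumes the standard api-inference.huggingface.co endpoint, a user access token stored in the HF_TOKEN environment variable, and that the service resolves a feature-extraction-style pipeline for this raw checkpoint.

```python
# Hedged sketch: calling cino-large-v2 through the hosted Inference API.
# Assumes the standard endpoint URL and an access token in HF_TOKEN;
# the exact response format depends on the pipeline resolved for this model.
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/hfl/cino-large-v2"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

response = requests.post(API_URL, headers=headers, json={"inputs": "你好，世界"})
response.raise_for_status()
print(response.json())
```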