This repository contains Chinese-Mixtral-Instruct, which is further tuned with instruction data on Chinese-Mixtral, where Chinese-Mixtral is built on top of Mixtral-8x7B-v0.1.
Note: this is an instruction (chat) model, which can be used for conversation, QA, etc.
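As a sketch of how such a chat model might be used, the snippet below formats a prompt in the Mixtral-style instruction template (`[INST] ... [/INST]`), which this model is assumed to follow since it is built on Mixtral-8x7B-v0.1; the `build_prompt` helper is illustrative, not part of the repository.

```python
# Hypothetical usage sketch, assuming the Mixtral-style [INST] chat template.

def build_prompt(user_message: str, system: str = "") -> str:
    """Wrap a user message in Mixtral-style instruction tags."""
    if system:
        user_message = f"{system}\n\n{user_message}"
    return f"<s>[INST] {user_message} [/INST]"

prompt = build_prompt("请介绍一下你自己。")  # "Please introduce yourself."

# To actually generate a reply, one would load the model with transformers:
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("hfl/chinese-mixtral-instruct")
#   model = AutoModelForCausalLM.from_pretrained("hfl/chinese-mixtral-instruct")
#   out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=256)
```

In practice, `AutoTokenizer.apply_chat_template` (if the tokenizer ships a chat template) is the safer way to build prompts, since it uses the template bundled with the model rather than a hand-written one.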
@article{chinese-mixtral,
title={Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral},
author={Cui, Yiming and Yao, Xin},
journal={arXiv preprint arXiv:2403.01851},
url={https://arxiv.org/abs/2403.01851},
year={2024}
}