This model is an intermediate post-training checkpoint, taken after the Supervised Fine-Tuning (SFT) step. For best performance, we recommend using the OLMoE-Instruct version.
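If you do want to experiment with this SFT checkpoint directly, the snippet below is a minimal sketch of loading it with the Hugging Face transformers library; it assumes a recent transformers release with OLMoE support and that the repository ships a chat template, so adjust as needed.

```python
# Minimal sketch: load the SFT checkpoint and generate a reply.
# Assumes a recent transformers release with OLMoE support and a chat template in the repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0924-SFT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Explain mixture-of-experts models in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```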
@misc{muennighoff2024olmoeopenmixtureofexpertslanguage,
title={OLMoE: Open Mixture-of-Experts Language Models},
author={Niklas Muennighoff and Luca Soldaini and Dirk Groeneveld and Kyle Lo and Jacob Morrison and Sewon Min and Weijia Shi and Pete Walsh and Oyvind Tafjord and Nathan Lambert and Yuling Gu and Shane Arora and Akshita Bhagia and Dustin Schwenk and David Wadden and Alexander Wettig and Binyuan Hui and Tim Dettmers and Douwe Kiela and Ali Farhadi and Noah A. Smith and Pang Wei Koh and Amanpreet Singh and Hannaneh Hajishirzi},
year={2024},
eprint={2409.02060},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2409.02060},
}
Runs of allenai OLMoE-1B-7B-0924-SFT on huggingface.co
Total runs: 103
24-hour runs: 0
3-day runs: -118
7-day runs: -146
30-day runs: -188
More Information About the OLMoE-1B-7B-0924-SFT Model on huggingface.co
OLMoE-1B-7B-0924-SFT is an AI model hosted on huggingface.co and can be used there immediately. huggingface.co supports a free trial of the OLMoE-1B-7B-0924-SFT model and also offers paid usage. The model can additionally be called through an API from Node.js, Python, or plain HTTP; a hedged Python sketch follows.
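As one illustration of API access from Python, the sketch below uses the huggingface_hub client; it assumes the checkpoint is actually deployed behind the Hugging Face Inference API and that an access token is available, neither of which is guaranteed.

```python
# Hypothetical sketch: call the model over HTTP via the Hugging Face Inference API.
# Assumes the checkpoint is deployed there and that HF_TOKEN is set; neither is guaranteed.
import os
from huggingface_hub import InferenceClient

client = InferenceClient(model="allenai/OLMoE-1B-7B-0924-SFT", token=os.environ.get("HF_TOKEN"))
reply = client.text_generation("What is a mixture-of-experts language model?", max_new_tokens=64)
print(reply)
```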
huggingface.co also serves as an online trial and API platform for OLMoE-1B-7B-0924-SFT, integrating the model's capabilities with API services. You can try OLMoE-1B-7B-0924-SFT online for free by following the link below.
Free online trial URL for allenai OLMoE-1B-7B-0924-SFT on huggingface.co:
OLMoE-1B-7B-0924-SFT is an open-source model whose code is available on GitHub, and any user can find and install it there for free. At the same time, huggingface.co hosts the model, so users can run OLMoE-1B-7B-0924-SFT directly on huggingface.co for debugging and trial. Free API access is also supported.
Install URL for OLMoE-1B-7B-0924-SFT on huggingface.co:
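For a quick local trial after a standard install of the usual dependencies (for example, pip install transformers torch), a short sketch like the following can be used; the prompt and generation settings here are illustrative assumptions.

```python
# Quick local trial sketch after a standard install (e.g. `pip install transformers torch`).
# The prompt and generation settings are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="allenai/OLMoE-1B-7B-0924-SFT", device_map="auto")
result = generator("Write a haiku about open-source language models.", max_new_tokens=48)
print(result[0]["generated_text"])
```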