allenai / OLMoE-1B-7B-0924

Model last updated: October 19, 2024
text-generation

Introduction to OLMoE-1B-7B-0924

Model Details of OLMoE-1B-7B-0924

OLMoE Logo.

Model Summary

OLMoE-1B-7B is a Mixture-of-Experts LLM with 1B active and 7B total parameters released in September 2024 (0924). It yields state-of-the-art performance among models with a similar cost (1B) and is competitive with much larger models like Llama2-13B. OLMoE is 100% open-source.

This information and more can also be found in the OLMoE GitHub repository.

Use

Install transformers from source (until a release that includes this PR) along with torch, then run:

from transformers import OlmoeForCausalLM, AutoTokenizer
import torch

DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Load different checkpoints by passing e.g. `revision="step10000-tokens41B"`
model = OlmoeForCausalLM.from_pretrained("allenai/OLMoE-1B-7B-0924").to(DEVICE)
tokenizer = AutoTokenizer.from_pretrained("allenai/OLMoE-1B-7B-0924")
inputs = tokenizer("Bitcoin is", return_tensors="pt")
inputs = {k: v.to(DEVICE) for k, v in inputs.items()}
out = model.generate(**inputs, max_length=64)
print(tokenizer.decode(out[0]))
# > # Bitcoin is a digital currency that is created and held electronically. No one controls it. Bitcoins aren’t printed, like dollars or euros – they’re produced by people and businesses running computers all around the world, using software that solves mathematical
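
As the comment in the snippet above notes, intermediate pretraining checkpoints can be loaded by passing a revision. A minimal sketch, reusing the revision name from that comment:

# Sketch: load the intermediate pretraining checkpoint referenced above
# (branch `step10000-tokens41B`) instead of the final `main` weights.
ckpt_model = OlmoeForCausalLM.from_pretrained(
    "allenai/OLMoE-1B-7B-0924",
    revision="step10000-tokens41B",
).to(DEVICE)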

You can list all revisions/branches by installing huggingface_hub and running:

from huggingface_hub import list_repo_refs
out = list_repo_refs("allenai/OLMoE-1B-7B-0924")
branches = [b.name for b in out.branches]
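
A small sketch of picking a pretraining checkpoint from that list, assuming the branch names follow the step<N>-tokens<M>B pattern used for the important branches below:

# Sketch: keep only checkpoint branches (assumed `step<N>-tokens<M>B` naming)
# and select the one with the highest training step.
ckpt_branches = [b for b in branches if b.startswith("step")]
latest = max(ckpt_branches, key=lambda b: int(b.split("-")[0][len("step"):]))
print(latest)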

Important branches:

  • step1200000-tokens5033B : Pretraining checkpoint used for annealing. There are a few more checkpoints after this one, but we did not use them.
  • main : Checkpoint annealed from step1200000-tokens5033B for an additional 100B tokens (23,842 steps). We use this checkpoint for our adaptation ( https://huggingface.co/allenai/OLMoE-1B-7B-0924-SFT & https://huggingface.co/allenai/OLMoE-1B-7B-0924-Instruct ).
  • fp32 : FP32 version of main . The model weights were stored in FP32 during training, but we did not observe any performance drop from casting them to BF16 after training, so we upload all weights in BF16. If you want the original FP32 checkpoint for main, you can use this one; it yields slightly different results but should perform about the same on benchmarks (see the sketch after this list).
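
For example, a minimal sketch of loading the original FP32 weights from the fp32 branch (keeping full precision via torch_dtype is an assumption, not part of the original snippet):

# Sketch: load the FP32 version of `main` from the `fp32` branch.
fp32_model = OlmoeForCausalLM.from_pretrained(
    "allenai/OLMoE-1B-7B-0924",
    revision="fp32",
    torch_dtype=torch.float32,  # keep the original full-precision weights
).to(DEVICE)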

Evaluation Snapshot

Model | Active Params | MMLU | HellaSwag | ARC-Chall. | ARC-Easy | PIQA | WinoGrande

LMs with ~1B active parameters
OLMoE-1B-7B | 1.3B | 54.1 | 80.0 | 62.1 | 84.2 | 79.8 | 70.2
DCLM-1B | 1.4B | 48.5 | 75.1 | 57.6 | 79.5 | 76.6 | 68.1
TinyLlama-1B | 1.1B | 33.6 | 60.8 | 38.1 | 69.5 | 71.7 | 60.1
OLMo-1B (0724) | 1.3B | 32.1 | 67.5 | 36.4 | 53.5 | 74.0 | 62.9
Pythia-1B | 1.1B | 31.1 | 48.0 | 31.4 | 63.4 | 68.9 | 52.7

LMs with ~2-3B active parameters
Qwen1.5-3B-14B | 2.7B | 62.4 | 80.0 | 77.4 | 91.6 | 81.0 | 72.3
Gemma2-3B | 2.6B | 53.3 | 74.6 | 67.5 | 84.3 | 78.5 | 71.8
JetMoE-2B-9B | 2.2B | 49.1 | 81.7 | 61.4 | 81.9 | 80.3 | 70.7
DeepSeek-3B-16B | 2.9B | 45.5 | 80.4 | 53.4 | 82.7 | 80.1 | 73.2
StableLM-2B | 1.6B | 40.4 | 70.3 | 50.6 | 75.3 | 75.6 | 65.8
OpenMoE-3B-9B | 2.9B | 27.4 | 44.4 | 29.3 | 50.6 | 63.3 | 51.9

LMs with ~7-9B active parameters
Gemma2-9B | 9.2B | 70.6 | 87.3 | 89.5 | 95.5 | 86.1 | 78.8
Llama3.1-8B | 8.0B | 66.9 | 81.6 | 79.5 | 91.7 | 81.1 | 76.6
DCLM-7B | 6.9B | 64.4 | 82.3 | 79.8 | 92.3 | 80.1 | 77.3
Mistral-7B | 7.3B | 64.0 | 83.0 | 78.6 | 90.8 | 82.8 | 77.9
OLMo-7B (0724) | 6.9B | 54.9 | 80.5 | 68.0 | 85.7 | 79.3 | 73.2
Llama2-7B | 6.7B | 46.2 | 78.9 | 54.2 | 84.0 | 77.5 | 71.7

Citation

@misc{muennighoff2024olmoeopenmixtureofexpertslanguage,
      title={OLMoE: Open Mixture-of-Experts Language Models}, 
      author={Niklas Muennighoff and Luca Soldaini and Dirk Groeneveld and Kyle Lo and Jacob Morrison and Sewon Min and Weijia Shi and Pete Walsh and Oyvind Tafjord and Nathan Lambert and Yuling Gu and Shane Arora and Akshita Bhagia and Dustin Schwenk and David Wadden and Alexander Wettig and Binyuan Hui and Tim Dettmers and Douwe Kiela and Ali Farhadi and Noah A. Smith and Pang Wei Koh and Amanpreet Singh and Hannaneh Hajishirzi},
      year={2024},
      eprint={2409.02060},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2409.02060}, 
}

Runs of allenai/OLMoE-1B-7B-0924 on huggingface.co

  • Total runs: 23.4K
  • 24-hour run growth: -344
  • 3-day run growth: -552
  • 7-day run growth: 55
  • 30-day run growth: 1.4K

More Information About the OLMoE-1B-7B-0924 Model on huggingface.co

OLMoE-1B-7B-0924 is released under the Apache 2.0 license; for details, see:

https://choosealicense.com/licenses/apache-2.0

OLMoE-1B-7B-0924 huggingface.co

OLMoE-1B-7B-0924 is an AI model hosted on huggingface.co, where the allenai OLMoE-1B-7B-0924 model can be used instantly. huggingface.co offers a free trial of OLMoE-1B-7B-0924 as well as paid usage, and the model can be called through an API from Node.js, Python, or plain HTTP.
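
As a minimal sketch of the Python/HTTP API usage mentioned above (the Inference API endpoint pattern is standard, but hosted availability of this particular model and the placeholder token are assumptions):

import requests

# Hypothetical access token; replace with your own Hugging Face token.
API_URL = "https://api-inference.huggingface.co/models/allenai/OLMoE-1B-7B-0924"
headers = {"Authorization": "Bearer hf_xxx"}

# Send a text-generation request; the response is JSON with the generated text.
response = requests.post(API_URL, headers=headers, json={"inputs": "Bitcoin is"})
print(response.json())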

OLMoE-1B-7B-0924 huggingface.co URL

https://huggingface.co/allenai/OLMoE-1B-7B-0924

Try allenai OLMoE-1B-7B-0924 online for free

huggingface.co is an online trial and API platform that integrates OLMoE-1B-7B-0924, including API services, and offers a free online trial of the model. You can try OLMoE-1B-7B-0924 online for free via the link below.

allenai OLMoE-1B-7B-0924 free online URL on huggingface.co:

https://huggingface.co/allenai/OLMoE-1B-7B-0924

OLMoE-1B-7B-0924 install

OLMoE-1B-7B-0924 is an open-source model available on GitHub, where any user can find and install it for free. huggingface.co also hosts the model, so users can debug and try OLMoE-1B-7B-0924 directly on huggingface.co, and its API can likewise be used free of charge.

OLMoE-1B-7B-0924 install URL on huggingface.co:

https://huggingface.co/allenai/OLMoE-1B-7B-0924

URL of OLMoE-1B-7B-0924

https://huggingface.co/allenai/OLMoE-1B-7B-0924

Provider of OLMoE-1B-7B-0924 on huggingface.co

allenai
