Molmo is a family of open vision-language models developed by the Allen Institute for AI. Molmo models are trained on PixMo, a dataset of 1 million highly curated image-text pairs. Molmo achieves state-of-the-art performance among multimodal models of similar size while being fully open-source. You can find all models in the Molmo family here. Learn more about the Molmo family in our announcement blog post.
Molmo 7B-D is based on Qwen2-7B and uses OpenAI CLIP as its vision backbone. It performs comfortably between GPT-4V and GPT-4o on both academic benchmarks and human evaluation. It powers the Molmo demo at molmo.allenai.org.
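If you want to inspect these components programmatically, you can load and print the checkpoint's configuration. This is a quick inspection aid, not part of the official card; it simply dumps the hyperparameters (including the LLM and vision backbone settings) declared by the remote code:

from transformers import AutoConfig

# load the checkpoint's custom config and print it to inspect the
# language-model and vision-backbone hyperparameters
config = AutoConfig.from_pretrained('allenai/Molmo-7B-D-0924', trust_remote_code=True)
print(config)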
This checkpoint is a preview of the Molmo release. All artifacts used in creating Molmo (the PixMo dataset, training code, evaluations, and intermediate checkpoints) will be made available at a later date, furthering our commitment to open-source AI development and reproducibility. Sign up here to be the first to know when artifacts are released.
To run Molmo, first install the dependencies:

# uninstall all tensorflow packages
pip list --format=freeze | grep '^tensorflow' | cut -d= -f1 | xargs -n1 pip uninstall -y
# install the CPU-only version of tensorflow, which is used for image preprocessing
pip install einops tensorflow-cpu torchvision
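As a quick sanity check (not part of the official instructions), you can confirm that the CPU-only TensorFlow build was installed and that it sees no GPUs:

# verify the CPU-only tensorflow install; the GPU device list should be empty
python -c "import tensorflow as tf; print(tf.__version__, tf.config.list_physical_devices('GPU'))"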
Then, run inference with the following Python code:
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig
from PIL import Image
import requests

# load the processor
processor = AutoProcessor.from_pretrained(
    'allenai/Molmo-7B-D-0924',
    trust_remote_code=True,
    torch_dtype='auto',
    device_map='auto'
)

# load the model
model = AutoModelForCausalLM.from_pretrained(
    'allenai/Molmo-7B-D-0924',
    trust_remote_code=True,
    torch_dtype='auto',
    device_map='auto'
)

# process the image and text
inputs = processor.process(
    images=[Image.open(requests.get("https://picsum.photos/id/237/536/354", stream=True).raw)],
    text="Describe this image."
)

# move inputs to the correct device and make a batch of size 1
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

# generate output; maximum 200 new tokens; stop generation when <|endoftext|> is generated
output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer
)

# only get generated tokens; decode them to text
generated_tokens = output[0, inputs['input_ids'].size(1):]
generated_text = processor.tokenizer.decode(generated_tokens, skip_special_tokens=True)

# print the generated text
print(generated_text)

# >>> This image features an adorable black Labrador puppy, captured from a top-down
# >>> perspective. The puppy is sitting on a wooden deck, which is composed ...
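To make inference more efficient on a GPU, you can run generation under torch.autocast so the forward pass uses bfloat16. This is a minimal sketch that reuses the model, inputs, and processor from the example above and assumes a CUDA device is available:

import torch

# run generation in bfloat16 under autocast; assumes `model`, `inputs`,
# `processor`, and `GenerationConfig` are defined as in the example above
with torch.autocast(device_type="cuda", enabled=True, dtype=torch.bfloat16):
    output = model.generate_from_batch(
        inputs,
        GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
        tokenizer=processor.tokenizer
    )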
Evaluations

| Model | Average Score on 11 Academic Benchmarks | Human Preference Elo Rating |
|---|---|---|
| Molmo 72B | 81.2 | 1077 |
| Molmo 7B-D (this model) | 77.3 | 1056 |
| Molmo 7B-O | 74.6 | 1051 |
| MolmoE 1B | 68.6 | 1032 |
| GPT-4o | 78.5 | 1079 |
| GPT-4V | 71.1 | 1041 |
| Gemini 1.5 Pro | 78.3 | 1074 |
| Gemini 1.5 Flash | 75.1 | 1054 |
| Claude 3.5 Sonnet | 76.7 | 1069 |
| Claude 3 Opus | 66.4 | 971 |
| Claude 3 Haiku | 65.3 | 999 |
| Qwen VL2 72B | 79.4 | 1037 |
| Qwen VL2 7B | 73.7 | 1025 |
| Intern VL2 LLAMA 76B | 77.1 | 1018 |
| Intern VL2 8B | 69.4 | 953 |
| Pixtral 12B | 69.5 | 1016 |
| Phi3.5-Vision 4B | 59.7 | 982 |
| PaliGemma 3B | 50.0 | 937 |
| LLAVA OneVision 72B | 76.6 | 1051 |
| LLAVA OneVision 7B | 72.0 | 1024 |
| Cambrian-1 34B | 66.8 | 953 |
| Cambrian-1 8B | 63.4 | 952 |
| xGen-MM-Interleave 4B | 59.5 | 979 |
| LLAVA-1.5 13B | 43.9 | 960 |
| LLAVA-1.5 7B | 40.7 | 951 |
Benchmarks: AI2D test, ChartQA test, VQA v2.0 test, DocQA test, InfographicVQA test, TextVQA val, RealWorldQA, MMMU val, MathVista testmini, CountBenchQA, Flickr Count (we collected this new dataset that is significantly harder than CountBenchQA).
License and Use

This model is licensed under Apache 2.0. It is intended for research and educational use. For more information, please see our Responsible Use Guidelines.