---
license: apache-2.0
language:
- en
base_model:
- Qwen/Qwen3-8B
- google/siglip-so400m-patch14-384
pipeline_tag: image-text-to-text
library_name: transformers
tags:
- multimodal
- olmo
- molmo
- molmo2
---

<img src="molmoweb_logo.png" alt="Logo for the MolmoWeb Project" style="width: auto; height: 50px;">

# MolmoWeb-8B-Native

**Note:** This is the Molmo-native PRE-PRETRAINED checkpoint from which MolmoWeb SFT starts, and it is NOT Hugging Face/transformers-compatible. See [allenai/MolmoWeb-8B](https://huggingface.co/allenai/MolmoWeb-8B) for the HF-compatible FINETUNED checkpoint.

MolmoWeb is a family of fully open multimodal web agents. MolmoWeb agents achieve state-of-the-art results, outperforming open-weight-only models of similar scale such as Fara-7B, UI-Tars-1.5-7B, and Holo1-7B. MolmoWeb-8B also surpasses set-of-marks (SoM) agents built on much larger closed frontier models like GPT-4o. We further demonstrate consistent gains through test-time scaling via parallel rollouts with best-of-N selection, achieving 94.7% and 60.5% pass@4 (compared to 78.2% and 35.3% pass@1) on WebVoyager and Online-Mind2Web, respectively.

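The best-of-N selection idea above can be sketched roughly as follows. This is a minimal illustration, not MolmoWeb's actual interface: `best_of_n`, `rollout`, and `score` are hypothetical placeholders for an agent trajectory sampler and a verifier, and the rollouts are run sequentially here for brevity rather than in parallel.

```python
from typing import Callable, List

def best_of_n(task: str,
              rollout: Callable[[str], str],
              score: Callable[[str, str], float],
              n: int = 4) -> str:
    """Sample n independent rollouts for a task and keep the highest-scoring one."""
    candidates: List[str] = [rollout(task) for _ in range(n)]
    return max(candidates, key=lambda traj: score(task, traj))

# Toy usage with deterministic stand-ins for an agent and a verifier:
trajectories = iter(["plan-a 0.2", "plan-b 0.9", "plan-c 0.1", "plan-d 0.5"])
best = best_of_n(
    "book a flight",
    rollout=lambda task: next(trajectories),           # stand-in for one agent rollout
    score=lambda task, traj: float(traj.split()[-1]),  # stand-in for a trajectory verifier
    n=4,
)
# best is the trajectory the verifier ranked highest ("plan-b 0.9" here)
```

With a reliable selector, pass@N exceeds pass@1 because only one of the N rollouts needs to succeed; see the tech report for how MolmoWeb scores trajectories.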
**Learn more** about the MolmoWeb family in our announcement [blog post](https://allenai.org/blog/molmoweb) and [tech report](https://allenai.org/papers/molmoweb).

MolmoWeb-8B-Native is based on the [Molmo2](https://arxiv.org/abs/2601.10611) architecture, which uses [Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) as the language model and [SigLIP 2](https://huggingface.co/google/siglip-so400m-patch14-384) as the vision backbone.

Ai2 is committed to open science. The MolmoWeb datasets are available [here](https://huggingface.co/collections/allenai/molmoweb-data).
All other artifacts used in creating MolmoWeb (training code, [evaluations](https://github.com/allenai/molmoweb), intermediate checkpoints) will be made available, furthering our commitment to open-source AI development and reproducibility.

Quick links:
- 💬 [Demo](https://molmoweb.allen.ai/)
- 📂 [All Models](https://huggingface.co/collections/allenai/molmoweb)
- 📚 [All Data](https://huggingface.co/collections/allenai/molmoweb-data)
- 📃 [Paper](https://allenai.org/papers/molmoweb)
- 🎥 [Blog with Videos](https://allenai.org/blog/molmoweb)

## Usage
Please refer to our [GitHub repo](https://github.com/allenai/molmoweb/) for inference code.

## License and Use

This model is licensed under Apache 2.0. It is intended for research and educational use in accordance with Ai2’s [Responsible Use Guidelines](https://allenai.org/responsible-use).