---
license: mit
language:
- en
pretty_name: Molmo2-ER SAT
tags:
- embodied-reasoning
- molmo2
- molmo2-er
- vlm-training-data
---
# Molmo2-ER · array/SAT
Spatial Aptitude Training: 175K binary-MCQ VQA pairs over ProcTHOR indoor scenes.
This is a re-hosted, **loader-ready subset** of the upstream dataset, used to train [`allenai/Molmo2-ER-4B`](https://huggingface.co/allenai/Molmo2-ER-4B). Files mirror the upstream layout; nothing in the data has been modified.
## Upstream source
- **Original dataset:** [array/SAT](https://huggingface.co/datasets/array/SAT)
- **Paper:** *SAT: Dynamic Spatial Aptitude Training for Multimodal Language Models* ([arXiv:2412.07755](https://arxiv.org/abs/2412.07755))
- **License:** `mit` (inherits from upstream)
If you use this data, please cite the original authors:
```bibtex
@misc{ray2025satdynamicspatialaptitude,
  title={SAT: Dynamic Spatial Aptitude Training for Multimodal Language Models},
  author={Arijit Ray and Jiafei Duan and Ellis Brown and others},
  year={2025},
  eprint={2412.07755},
  archivePrefix={arXiv}
}
```
## Usage in Molmo2-ER
See the [`allenai/molmo2`](https://github.com/allenai/molmo2) repository for the data loader and training recipe. The relevant loader class for this dataset lives in `olmo/data/spatial_datasets.py`.