SmolLM baselines trained from scratch
Model created for the paper "Preferences for Idiomatic Language are Acquired Slowly – and Forgotten Quickly: A Case Study on Swedish" (TACL 2026).
@misc{kunz2026preferencesidiomaticlanguageacquired,
  title={Preferences for Idiomatic Language are Acquired Slowly -- and Forgotten Quickly: A Case Study on Swedish},
  author={Jenny Kunz},
  year={2026},
  eprint={2602.03484},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2602.03484},
}
This model has the same architecture and configuration as SmolLM2-135M, but was trained from scratch on the Swedish portion of FineWeb-2.
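Since the model reuses the SmolLM2-135M architecture, its rough parameter budget can be sanity-checked with a short sketch. The hyperparameters below (hidden size 576, 30 layers, 9 query / 3 key-value heads, SwiGLU MLP of width 1536, vocabulary of 49,152, tied embeddings) are taken from the publicly released SmolLM2-135M configuration and are assumptions on my part, not values stated on this card.

```python
# Hypothetical sketch: parameter count of a SmolLM2-135M-style Llama model.
# All hyperparameters are assumed from the public SmolLM2-135M config,
# not stated on this model card.
VOCAB, D, LAYERS = 49_152, 576, 30
HEADS, KV_HEADS, HEAD_DIM = 9, 3, 64
FFN = 1_536

embed = VOCAB * D                        # token embeddings (tied with LM head)
attn = (D * HEADS * HEAD_DIM             # q projection
        + 2 * D * KV_HEADS * HEAD_DIM    # k and v projections (grouped-query)
        + HEADS * HEAD_DIM * D)          # o projection (no biases)
mlp = 3 * D * FFN                        # gate and up (D->FFN) plus down (FFN->D)
norms = 2 * D                            # two RMSNorm weight vectors per layer

total = embed + LAYERS * (attn + mlp + norms) + D  # + final RMSNorm
print(f"{total:,}")
```

With these assumed values the count comes out to about 134.5M parameters, consistent with the "135M" model name.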
This is a research model intended for studying pre-training dynamics; I do not recommend using it for any practical purpose. It was trained on a web corpus and no alignment whatsoever has been performed, so the model will likely reflect its training data's biases and hallucinate frequently.