GPT-2 124M trained on FineWeb-Edu 10B

A 124M-parameter GPT-2 model trained on the 10B-token FineWeb-Edu dataset. Training took nearly 20 hours on two A40 GPUs.

This model has been pushed to the Hub using the PyTorchModelHubMixin integration:

  • Library: [More Information Needed]
  • Docs: [More Information Needed]
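Since the checkpoint was uploaded via PyTorchModelHubMixin, it can be reloaded with the mixin's `from_pretrained` method. A minimal sketch of the pattern is below; the `GPT2Small` class, its fields, and the repo id in the comment are assumptions, not the actual training code — substitute the real model class and this repository's id.

```python
# Hypothetical sketch of the PyTorchModelHubMixin pattern.
# GPT2Small and its layers are placeholders, not the original training code.
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin


class GPT2Small(nn.Module, PyTorchModelHubMixin):
    def __init__(self, vocab_size: int = 50257, n_embd: int = 768):
        super().__init__()
        self.wte = nn.Embedding(vocab_size, n_embd)  # token embeddings
        # ... transformer blocks omitted for brevity ...

    def forward(self, idx):
        return self.wte(idx)


# Inheriting the mixin adds from_pretrained / save_pretrained / push_to_hub:
# model = GPT2Small.from_pretrained("user/gpt2-124m-fineweb-edu")  # placeholder repo id
```

The mixin serializes weights to safetensors and a `config.json` built from the constructor arguments, which is why the Hub page shows a Safetensors model size and tensor type.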
Model details:

  • Model size: 0.1B params
  • Weights format: Safetensors
  • Tensor type: F32