# nobinomial-babylm-base_seed-42_1e-4
This model was trained from scratch on the babylm_export dataset. It achieves the following results on the evaluation set:
- Loss: 3.1396
- Accuracy: 0.4068
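Assuming the reported loss is the mean token-level cross-entropy in nats (the usual convention for causal language modeling in Transformers), it corresponds to a validation perplexity of roughly 23. A quick sanity check:

```python
import math

# Validation loss reported above (mean cross-entropy per token, in nats).
val_loss = 3.1396

# Perplexity is the exponential of the mean cross-entropy.
perplexity = math.exp(val_loss)
print(f"perplexity = {perplexity:.1f}")  # ~ 23.1
```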
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20.0
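The effective batch size and the learning-rate trajectory follow directly from the numbers above: 32 sequences per device times 8 accumulation steps gives the total train batch size of 256, and the linear scheduler ramps the learning rate from 0 to 1e-4 over the first 500 steps, then decays it linearly to 0 at the final step. A minimal sketch (the `linear_lr` helper is illustrative, not the Trainer's internal scheduler code; 33180 is the total step count from the results table below):

```python
# Effective batch size: per-device batch * gradient accumulation steps.
train_batch_size = 32
gradient_accumulation_steps = 8
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 256

def linear_lr(step, peak=1e-4, warmup=500, total=33180):
    """Linear warmup to `peak` over `warmup` steps, then linear decay to 0."""
    if step < warmup:
        return peak * step / warmup
    return peak * max(0.0, (total - step) / (total - warmup))

print(total_train_batch_size)  # 256
print(linear_lr(500))          # peak learning rate: 1e-4
print(linear_lr(33180))        # final step: 0.0
```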
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 3.8857 | 1.0 | 1659 | 3.9011 | 0.3332 |
| 3.553 | 2.0 | 3318 | 3.5920 | 0.3604 |
| 3.4021 | 3.0 | 4977 | 3.4457 | 0.3744 |
| 3.2955 | 4.0 | 6636 | 3.3591 | 0.3832 |
| 3.2272 | 5.0 | 8295 | 3.3037 | 0.3888 |
| 3.1643 | 6.0 | 9954 | 3.2646 | 0.3926 |
| 3.1206 | 7.0 | 11613 | 3.2355 | 0.3957 |
| 3.0781 | 8.0 | 13272 | 3.2125 | 0.3980 |
| 3.0467 | 9.0 | 14931 | 3.1951 | 0.3999 |
| 3.023 | 10.0 | 16590 | 3.1822 | 0.4012 |
| 2.985 | 11.0 | 18249 | 3.1736 | 0.4023 |
| 2.9666 | 12.0 | 19908 | 3.1633 | 0.4035 |
| 2.9494 | 13.0 | 21567 | 3.1590 | 0.4041 |
| 2.9322 | 14.0 | 23226 | 3.1527 | 0.4048 |
| 2.9092 | 15.0 | 24885 | 3.1487 | 0.4054 |
| 2.8954 | 16.0 | 26544 | 3.1451 | 0.4059 |
| 2.8763 | 17.0 | 28203 | 3.1436 | 0.4061 |
| 2.8644 | 18.0 | 29862 | 3.1418 | 0.4065 |
| 2.8584 | 19.0 | 31521 | 3.1403 | 0.4067 |
| 2.8417 | 20.0 | 33180 | 3.1396 | 0.4068 |
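From the table, each epoch spans 1,659 optimizer steps; at the effective batch size of 256 that implies roughly 425k training sequences per epoch. A quick back-of-the-envelope check (the sequence count is an estimate, since the final batch of an epoch may be partial):

```python
steps_per_epoch = 1659   # Step column at epoch 1.0
total_steps = 33180      # Step column at epoch 20.0
effective_batch = 256    # total_train_batch_size from the hyperparameters

epochs = total_steps // steps_per_epoch
sequences_per_epoch = steps_per_epoch * effective_batch
print(epochs)               # 20
print(sequences_per_epoch)  # 424704
```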
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0