# opt-babylm1-ntb_seed-211_5e-6
This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.9435
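
A minimal usage sketch, assuming the checkpoint loads as a causal language model (the name suggests an OPT variant) and that the repo id below is a placeholder for wherever this checkpoint is published:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; substitute the actual Hub path for this checkpoint.
model_id = "opt-babylm1-ntb_seed-211_5e-6"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Once upon a time", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```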

## Model description
More information needed

## Intended uses & limitations
More information needed

## Training and evaluation data
More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 64
- seed: 211
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- optimizer: AdamW (`adamw_torch_fused`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 20.0
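
As a rough sketch, the hyperparameters above map onto the `transformers` Trainer API as follows; the `output_dir` is a placeholder, and the effective batch size of 256 assumes a single device (32 per device × 8 accumulation steps):

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments setup matching the values listed above.
training_args = TrainingArguments(
    output_dir="opt-babylm1-ntb_seed-211_5e-6",  # placeholder path
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=8,  # 32 * 8 = 256 effective train batch size
    seed=211,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=20.0,
)
```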

### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 5.3217 | 0.4284 | 1000 | 5.2905 |
| 4.6744 | 0.8567 | 2000 | 4.6693 |
| 4.1935 | 1.2849 | 3000 | 4.2015 |
| 3.8318 | 1.7132 | 4000 | 3.8223 |
| 3.5351 | 2.1414 | 5000 | 3.5414 |
| 3.3899 | 2.5697 | 6000 | 3.4023 |
| 3.3056 | 2.9981 | 7000 | 3.3055 |
| 3.2184 | 3.4262 | 8000 | 3.2395 |
| 3.1734 | 3.8546 | 9000 | 3.1929 |
| 3.1132 | 4.2827 | 10000 | 3.1553 |
| 3.0852 | 4.7111 | 11000 | 3.1295 |
| 3.0321 | 5.1392 | 12000 | 3.1052 |
| 3.0214 | 5.5676 | 13000 | 3.0839 |
| 3.0117 | 5.9959 | 14000 | 3.0623 |
| 2.9681 | 6.4241 | 15000 | 3.0529 |
| 2.9665 | 6.8524 | 16000 | 3.0381 |
| 2.9238 | 7.2806 | 17000 | 3.0308 |
| 2.9227 | 7.7089 | 18000 | 3.0208 |
| 2.8758 | 8.1371 | 19000 | 3.0110 |
| 2.8916 | 8.5654 | 20000 | 3.0054 |
| 2.8922 | 8.9938 | 21000 | 2.9924 |
| 2.8511 | 9.4219 | 22000 | 2.9935 |
| 2.8613 | 9.8503 | 23000 | 2.9845 |
| 2.8167 | 10.2784 | 24000 | 2.9839 |
| 2.8325 | 10.7068 | 25000 | 2.9777 |
| 2.7947 | 11.1349 | 26000 | 2.9783 |
| 2.8071 | 11.5633 | 27000 | 2.9698 |
| 2.8076 | 11.9916 | 28000 | 2.9640 |
| 2.7805 | 12.4198 | 29000 | 2.9682 |
| 2.7866 | 12.8481 | 30000 | 2.9573 |
| 2.7508 | 13.2763 | 31000 | 2.9638 |
| 2.7604 | 13.7046 | 32000 | 2.9565 |
| 2.7272 | 14.1328 | 33000 | 2.9582 |
| 2.7412 | 14.5611 | 34000 | 2.9528 |
| 2.7479 | 14.9895 | 35000 | 2.9502 |
| 2.7180 | 15.4176 | 36000 | 2.9522 |
| 2.7270 | 15.8460 | 37000 | 2.9468 |
| 2.7014 | 16.2741 | 38000 | 2.9490 |
| 2.7140 | 16.7025 | 39000 | 2.9457 |
| 2.6815 | 17.1306 | 40000 | 2.9478 |
| 2.6908 | 17.5590 | 41000 | 2.9464 |
| 2.6907 | 17.9874 | 42000 | 2.9430 |
| 2.6768 | 18.4155 | 43000 | 2.9448 |
| 2.6741 | 18.8439 | 44000 | 2.9431 |
| 2.6652 | 19.2720 | 45000 | 2.9441 |
| 2.6664 | 19.7004 | 46000 | 2.9435 |
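
Validation loss drops steeply over the first ~9 epochs and then flattens, ending at 2.9435. Since this is a per-token cross-entropy in nats, it corresponds to a perplexity of exp(2.9435) ≈ 19.0. A minimal sketch for measuring the same quantity on your own text (the repo id is again a placeholder):

```python
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "opt-babylm1-ntb_seed-211_5e-6"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

text = "The children are playing in the garden."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    # Passing input_ids as labels gives the mean cross-entropy over the
    # sequence (labels are shifted internally by one token).
    loss = model(**inputs, labels=inputs["input_ids"]).loss.item()
print(f"cross-entropy: {loss:.4f}  perplexity: {math.exp(loss):.2f}")
```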

### Framework versions
- Transformers 4.54.0
- PyTorch 2.10.0+cu128
- Datasets 3.2.0
- Tokenizers 0.21.4
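
A quick sanity check that a local environment matches the pinned versions above (a sketch; the CUDA build suffix on the PyTorch wheel may differ by platform):

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions from this card; the "+cu128" suffix on torch may vary.
expected = {
    "transformers": "4.54.0",
    "torch": "2.10.0",
    "datasets": "3.2.0",
    "tokenizers": "0.21.4",
}
modules = {"transformers": transformers, "torch": torch,
           "datasets": datasets, "tokenizers": tokenizers}
for name, module in modules.items():
    installed = module.__version__.split("+")[0]
    status = "OK" if installed == expected[name] else f"expected {expected[name]}"
    print(f"{name}: {installed} ({status})")
```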