# embc25_finetuned_30000_en_es
This model is a fine-tuned version of [Kyungjin-Kim/mmc_roberta_500000_en_es](https://huggingface.co/Kyungjin-Kim/mmc_roberta_500000_en_es) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics list):
- Loss: 0.4518
- Accuracy: 0.8402
- Precision: 0.8343
- Recall: 0.8490
- F1: 0.8416
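
The card does not document how to call the model, so here is a minimal loading sketch. It assumes the checkpoint is a sequence-classification model over English-Spanish sentence pairs (consistent with the single accuracy/precision/recall/F1 values above); the example inputs and the pair format are assumptions, not confirmed by the card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "Kyungjin-Kim/embc25_finetuned_30000_en_es"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical English-Spanish pair; the expected input format is an
# assumption, since the card leaves "Intended uses" unfilled.
inputs = tokenizer("The cat sleeps.", "El gato duerme.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index
```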
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
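
For reference, the hyperparameters above map onto Transformers `TrainingArguments` roughly as follows. This is a sketch, not the original training script; `output_dir` is a placeholder, and the dataset/model wiring is omitted.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="embc25_finetuned_30000_en_es",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 16 * 4 = 64
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,  # Native AMP mixed-precision training
)
```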
### Training results
| Training Loss | Epoch  | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.4753        | 0.5926 | 500  | 0.4780          | 0.7733   | 0.7433    | 0.8350 | 0.7865 |
| 0.3803        | 1.1849 | 1000 | 0.4106          | 0.8153   | 0.7945    | 0.8507 | 0.8216 |
| 0.3456        | 1.7775 | 1500 | 0.3910          | 0.8272   | 0.8290    | 0.8243 | 0.8267 |
| 0.2660        | 2.3698 | 2000 | 0.4186          | 0.8300   | 0.8764    | 0.7683 | 0.8188 |
| 0.2813        | 2.9624 | 2500 | 0.3928          | 0.8368   | 0.8471    | 0.8220 | 0.8344 |
| 0.2037        | 3.5547 | 3000 | 0.4518          | 0.8402   | 0.8343    | 0.8490 | 0.8416 |
| 0.1333        | 4.1470 | 3500 | 0.5118          | 0.8355   | 0.8327    | 0.8397 | 0.8362 |
| 0.1429        | 4.7396 | 4000 | 0.5276          | 0.8392   | 0.8404    | 0.8373 | 0.8389 |
| 0.0954        | 5.3319 | 4500 | 0.6366          | 0.8363   | 0.8446    | 0.8243 | 0.8343 |
| 0.0996        | 5.9244 | 5000 | 0.6372          | 0.8327   | 0.8148    | 0.8610 | 0.8373 |
| 0.0698        | 6.5167 | 5500 | 0.7315          | 0.8352   | 0.8270    | 0.8477 | 0.8372 |
| 0.0545        | 7.1090 | 6000 | 0.7958          | 0.8357   | 0.8499    | 0.8153 | 0.8323 |
| 0.0520        | 7.7016 | 6500 | 0.8584          | 0.8337   | 0.8382    | 0.8270 | 0.8326 |
| 0.0403        | 8.2939 | 7000 | 0.9279          | 0.8330   | 0.8284    | 0.8400 | 0.8342 |
| 0.0370        | 8.8865 | 7500 | 0.9500          | 0.8335   | 0.8191    | 0.8560 | 0.8372 |
| 0.0268        | 9.4788 | 8000 | 1.0100          | 0.8345   | 0.8293    | 0.8423 | 0.8358 |
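
The headline metrics at the top of this card match the step-3000 row; validation loss climbs steadily after that point while training loss keeps falling, the usual sign of overfitting in later epochs. For reference, a `compute_metrics` hook like the one below would produce these four columns. The `average="binary"` choice is an assumption (single precision/recall values suggest a two-class task), not something the card confirms.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "binary" averaging is an assumption based on the single reported values.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```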
### Framework versions
- Transformers 4.48.1
- Pytorch 2.3.1
- Datasets 3.2.0
- Tokenizers 0.21.0