# populism_classifier_bsample_050
This model is a fine-tuned version of google-bert/bert-base-multilingual-uncased on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6834
- Accuracy: 0.8230
- Class-1 F1: 0.3857
- Class-1 Recall: 0.9643
- Class-1 Precision: 0.2411
- Balanced Acc: 0.8893
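As a sanity check, the class-1 scores and balanced accuracy above can all be derived from a single binary confusion matrix. The counts below are a hypothetical reconstruction that is consistent with the reported figures, not values logged from the actual run:

```python
# Compute binary-classification metrics from confusion-matrix counts.
# The counts passed in below are a reconstruction consistent with the
# reported evaluation metrics, not logged values from the training run.
def binary_metrics(tp, fp, fn, tn):
    precision = tp / (tp + fp)                   # class-1 precision
    recall = tp / (tp + fn)                      # class-1 recall (sensitivity)
    f1 = 2 * precision * recall / (precision + recall)
    specificity = tn / (tn + fp)                 # class-0 recall
    balanced_acc = (recall + specificity) / 2    # mean of the two recalls
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, f1, balanced_acc, accuracy

p, r, f1, bacc, acc = binary_metrics(tp=27, fp=85, fn=1, tn=373)
print(round(p, 4), round(r, 4), round(f1, 4), round(bacc, 4), round(acc, 4))
# -> 0.2411 0.9643 0.3857 0.8893 0.823
```

The pattern is typical of a heavily imbalanced evaluation set: recall on the positive class is near-perfect while precision is low, so balanced accuracy is a more informative summary than plain accuracy here.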
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
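The hyperparameters above can be expressed as a `transformers.TrainingArguments` configuration. This is a hedged sketch: the `output_dir` is a hypothetical path, and `fp16=True` is an assumption standing in for "Native AMP"; adapt both to your setup.

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments config mirroring the listed hyperparameters.
# output_dir is hypothetical; fp16=True assumes Native AMP on a CUDA device.
args = TrainingArguments(
    output_dir="populism_classifier_bsample_050",  # hypothetical path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # Native AMP mixed-precision training
)
```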
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Class-1 F1 | Class-1 Recall | Class-1 Precision | Balanced Acc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:--------------:|:-----------------:|:------------:|
| 0.0371        | 1.0   | 7    | 1.0019          | 0.7058   | 0.2741     | 0.9643         | 0.1598            | 0.8271       |
| 0.8511        | 2.0   | 14   | 0.5097          | 0.9095   | 0.5217     | 0.8571         | 0.375             | 0.8849       |
| 0.0297        | 3.0   | 21   | 0.7293          | 0.7984   | 0.3553    | 0.9643         | 0.2177            | 0.8762       |
| 0.0336        | 4.0   | 28   | 0.6834          | 0.8230   | 0.3857     | 0.9643         | 0.2411            | 0.8893       |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3