# phobert-v2-UIT-VSMEC-ep20
This model is a fine-tuned version of [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) on the UIT-VSMEC (Vietnamese Social Media Emotion Corpus) dataset. It achieves the following results on the evaluation set (F1, precision, and recall are reported as percentages):
- Loss: 1.7445
- Micro F1: 31.1953
- Micro Precision: 31.1953
- Micro Recall: 31.1953
- Macro F1: 6.7937
- Macro Precision: 4.4565
- Macro Recall: 14.2857
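
The card does not include usage instructions; below is a minimal inference sketch, assuming the checkpoint loads through the standard `text-classification` pipeline and that inputs are word-segmented Vietnamese, as PhoBERT expects (the example sentence and its segmentation are illustrative):

```python
# Minimal inference sketch (not from the original card): load the fine-tuned
# checkpoint as a sequence classifier via the transformers pipeline.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="datht/phobert-v2-UIT-VSMEC-ep20",
)

# PhoBERT is trained on word-segmented text, so inputs should be segmented
# the same way as the training data, e.g. "Hôm_nay" rather than "Hôm nay".
print(classifier("Hôm_nay tôi rất vui ."))
```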
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 20.0
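
These settings map onto `transformers.TrainingArguments` roughly as follows. This is a reconstruction sketch: `output_dir` and the surrounding `Trainer` wiring are placeholders, not the original training script.

```python
# Hypothetical reconstruction of the run's configuration from the
# hyperparameters listed above; output_dir is an illustrative placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="phobert-v2-UIT-VSMEC-ep20",  # placeholder path
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # effective train batch size: 64
    num_train_epochs=20.0,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
)
```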
### Training results
| Training Loss | Epoch | Step | Validation Loss | Micro F1 | Micro Precision | Micro Recall | Macro F1 | Macro Precision | Macro Recall |
|---|---|---|---|---|---|---|---|---|---|
| 1.6508 | 1.0 | 87 | 1.6280 | 38.1924 | 38.1924 | 38.1924 | 13.6689 | 11.0672 | 19.7948 |
| 1.6617 | 2.0 | 174 | 1.6283 | 41.8367 | 41.8367 | 41.8367 | 18.9235 | 16.9137 | 23.0805 |
| 1.7109 | 3.0 | 261 | 1.7474 | 32.6531 | 32.6531 | 32.6531 | 10.8742 | 8.5221 | 17.8755 |
| 1.7516 | 4.0 | 348 | 1.7299 | 33.0904 | 33.0904 | 33.0904 | 9.5104 | 10.4580 | 15.7395 |
| 1.6395 | 5.0 | 435 | 1.6963 | 36.1516 | 36.1516 | 36.1516 | 12.0999 | 10.4591 | 18.0399 |
| 1.7109 | 6.0 | 522 | 1.6895 | 36.1516 | 36.1516 | 36.1516 | 12.1304 | 10.5941 | 18.0399 |
| 1.6211 | 7.0 | 609 | 1.6557 | 39.2128 | 39.2128 | 39.2128 | 14.3600 | 10.8292 | 21.3168 |
| 1.6352 | 8.0 | 696 | 1.6453 | 36.8805 | 36.8805 | 36.8805 | 14.3207 | 11.9547 | 21.6941 |
| 1.7352 | 9.0 | 783 | 1.7116 | 30.7580 | 30.7580 | 30.7580 | 11.9946 | 11.8060 | 19.2029 |
| 1.5836 | 10.0 | 870 | 1.6459 | 37.4636 | 37.4636 | 37.4636 | 14.4364 | 11.7582 | 21.8830 |
| 1.6883 | 11.0 | 957 | 1.6196 | 38.6297 | 38.6297 | 38.6297 | 14.8475 | 11.9901 | 22.4561 |
| 1.7379 | 12.0 | 1044 | 1.6837 | 38.4840 | 38.4840 | 38.4840 | 13.8374 | 10.6984 | 20.2799 |
| 1.6898 | 13.0 | 1131 | 1.6709 | 37.7551 | 37.7551 | 37.7551 | 13.4848 | 10.5169 | 19.7508 |
| 1.8129 | 14.0 | 1218 | 1.7437 | 32.0700 | 32.0700 | 32.0700 | 8.2124 | 12.2136 | 14.9597 |
| 1.702 | 15.0 | 1305 | 1.7436 | 31.1953 | 31.1953 | 31.1953 | 6.7937 | 4.4565 | 14.2857 |
| 1.773 | 16.0 | 1392 | 1.7405 | 31.1953 | 31.1953 | 31.1953 | 6.7937 | 4.4565 | 14.2857 |
| 1.8234 | 17.0 | 1479 | 1.7439 | 31.1953 | 31.1953 | 31.1953 | 6.7937 | 4.4565 | 14.2857 |
| 1.7848 | 18.0 | 1566 | 1.7449 | 31.1953 | 31.1953 | 31.1953 | 6.7937 | 4.4565 | 14.2857 |
| 1.8039 | 19.0 | 1653 | 1.7440 | 31.1953 | 31.1953 | 31.1953 | 6.7937 | 4.4565 | 14.2857 |
| 1.777 | 19.7723 | 1720 | 1.7445 | 31.1953 | 31.1953 | 31.1953 | 6.7937 | 4.4565 | 14.2857 |
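
The micro and macro columns are plausibly produced by a `compute_metrics` callback passed to the `Trainer`; the original script is not shown, so the sketch below is one common way to compute these values with scikit-learn, scaled to the percentage range used above:

```python
# Illustrative compute_metrics callback (an assumption, not from the card),
# using scikit-learn to produce micro/macro precision, recall, and F1.
import numpy as np
from sklearn.metrics import precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    out = {}
    for avg in ("micro", "macro"):
        p, r, f1, _ = precision_recall_fscore_support(
            labels, preds, average=avg, zero_division=0
        )
        # Scale to percentages to match the reported numbers.
        out[f"{avg}_precision"] = p * 100
        out[f"{avg}_recall"] = r * 100
        out[f"{avg}_f1"] = f1 * 100
    return out
```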
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 2.15.0
- Tokenizers 0.21.1