# CeLLaTe3.0_with_vague_adapted_pubmed_bert

This model is a fine-tuned version of Mardiyyah/cellate1.0-tapt_freeze_llrd_ww_mask-LR_2e-05 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1883
- Precision: 0.7819
- Recall: 0.8316
- F1: 0.8060
- Accuracy: 0.9673
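The F1 above is the harmonic mean of precision and recall; a quick pure-Python check, using the values reported above, reproduces it:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Evaluation-set values from this model card
precision, recall = 0.7819, 0.8316
print(round(f1_score(precision, recall), 4))  # 0.806 (reported as 0.8060)
```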
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 3407
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
- mixed_precision_training: Native AMP
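With 202 steps per epoch over 20 epochs (4,040 total steps) and a warmup ratio of 0.1, the linear scheduler ramps the learning rate up over the first 404 steps, then decays it linearly to zero. A minimal sketch of that schedule (mirroring the behavior of a linear warmup/decay scheduler, not the exact Transformers implementation):

```python
def linear_lr(step: int, base_lr: float = 2e-5,
              total_steps: int = 4040, warmup_ratio: float = 0.1) -> float:
    """Linear warmup to base_lr, then linear decay to zero."""
    warmup_steps = int(total_steps * warmup_ratio)  # 404 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # ramp up
    # decay from base_lr at the end of warmup to 0 at the final step
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_lr(404))   # peak learning rate: 2e-05
print(linear_lr(4040))  # end of training: 0.0
```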
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 1.0049 | 1.0 | 202 | 0.2790 | 0.4807 | 0.4178 | 0.4470 | 0.9292 |
| 0.2072 | 2.0 | 404 | 0.1581 | 0.6347 | 0.8190 | 0.7152 | 0.9579 |
| 0.1088 | 3.0 | 606 | 0.1444 | 0.6869 | 0.7637 | 0.7233 | 0.9599 |
| 0.0717 | 4.0 | 808 | 0.1373 | 0.7083 | 0.7955 | 0.7493 | 0.9645 |
| 0.0495 | 5.0 | 1010 | 0.1630 | 0.7624 | 0.8042 | 0.7827 | 0.9658 |
| 0.036 | 6.0 | 1212 | 0.1644 | 0.7579 | 0.8312 | 0.7929 | 0.9658 |
| 0.0265 | 7.0 | 1414 | 0.1893 | 0.7819 | 0.8316 | 0.8060 | 0.9673 |
| 0.02 | 8.0 | 1616 | 0.2116 | 0.7381 | 0.8277 | 0.7803 | 0.9637 |
| 0.0158 | 9.0 | 1818 | 0.2020 | 0.7583 | 0.8246 | 0.7901 | 0.9666 |
| 0.0134 | 10.0 | 2020 | 0.2256 | 0.7283 | 0.7955 | 0.7604 | 0.9638 |
| 0.0103 | 11.0 | 2222 | 0.2269 | 0.7635 | 0.8290 | 0.7949 | 0.9663 |
| 0.0083 | 12.0 | 2424 | 0.2436 | 0.7576 | 0.8133 | 0.7845 | 0.9655 |
| 0.007 | 13.0 | 2626 | 0.2523 | 0.7593 | 0.8181 | 0.7876 | 0.9654 |
| 0.0059 | 14.0 | 2828 | 0.2478 | 0.7544 | 0.8259 | 0.7885 | 0.9657 |
| 0.005 | 15.0 | 3030 | 0.2439 | 0.7573 | 0.8107 | 0.7831 | 0.9662 |
| 0.0044 | 16.0 | 3232 | 0.2628 | 0.7586 | 0.8138 | 0.7852 | 0.9655 |
| 0.0042 | 17.0 | 3434 | 0.2458 | 0.7730 | 0.8151 | 0.7935 | 0.9668 |
| 0.0039 | 18.0 | 3636 | 0.2644 | 0.7589 | 0.8151 | 0.7860 | 0.9662 |
| 0.0029 | 19.0 | 3838 | 0.2629 | 0.7667 | 0.8081 | 0.7869 | 0.9659 |
| 0.0029 | 20.0 | 4040 | 0.2603 | 0.7596 | 0.8124 | 0.7851 | 0.9661 |
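The headline metrics correspond to the epoch-7 checkpoint, which has the best validation F1 in the table. Selecting it programmatically (F1 values copied from the table above):

```python
# (epoch, validation F1) pairs from the training-results table
history = [(1, 0.4470), (2, 0.7152), (3, 0.7233), (4, 0.7493),
           (5, 0.7827), (6, 0.7929), (7, 0.8060), (8, 0.7803),
           (9, 0.7901), (10, 0.7604), (11, 0.7949), (12, 0.7845),
           (13, 0.7876), (14, 0.7885), (15, 0.7831), (16, 0.7852),
           (17, 0.7935), (18, 0.7860), (19, 0.7869), (20, 0.7851)]

best_epoch, best_f1 = max(history, key=lambda pair: pair[1])
print(best_epoch, best_f1)  # 7 0.806
```

Validation F1 peaks at epoch 7 and drifts down afterward while training loss keeps shrinking, the usual sign of overfitting past that point.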
### Framework versions

- Transformers 4.48.2
- PyTorch 2.4.1+cu121
- Datasets 3.0.2
- Tokenizers 0.21.0