# CeLLaTe3.0_with_vague_pubmed_bert_with_gaz
This model is a fine-tuned version of microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2848
- Precision: 0.7907
- Recall: 0.8418
- F1: 0.8154
- Accuracy: 0.9648
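The card does not state the task explicitly, but the token-level precision/recall/F1/accuracy metrics indicate token classification (NER). A minimal inference sketch follows, assuming a `token-classification` head; the hub id is a placeholder, not the model's actual repository path.

```python
from transformers import pipeline

# Placeholder hub id -- replace with the actual repository path.
MODEL_ID = "your-org/CeLLaTe3.0_with_vague_pubmed_bert_with_gaz"

def load_ner(model_id: str = MODEL_ID):
    """Token-classification pipeline; 'simple' aggregation merges subword pieces
    belonging to the same predicted entity into one span."""
    return pipeline("token-classification", model=model_id,
                    aggregation_strategy="simple")

def extract_entities(results):
    """Reduce pipeline output dicts to (text, label, score) triples."""
    return [(r["word"], r["entity_group"], round(r["score"], 3)) for r in results]

# Usage (requires the model weights to be available):
# ner = load_ner()
# print(extract_entities(ner("CD4+ T cells were isolated from peripheral blood.")))
```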
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 3407
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
- mixed_precision_training: Native AMP
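The hyperparameters above can be collected as follows; this is a sketch, with keys named after the `transformers.TrainingArguments` fields they correspond to. With 403 optimizer steps per epoch over 20 epochs (8060 total steps, per the table below), the 0.1 warmup ratio covers the first 806 steps of the linear schedule.

```python
# Hyperparameters from the list above, keyed by the TrainingArguments
# fields they correspond to (a sketch, not the original training script).
HPARAMS = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 16,
    "seed": 3407,
    "lr_scheduler_type": "linear",
    "warmup_ratio": 0.1,
    "num_train_epochs": 20,
    "fp16": True,  # Native AMP mixed-precision training
}

def warmup_steps(total_steps: int, warmup_ratio: float) -> int:
    """Linear warmup: the LR ramps up over the first warmup_ratio of steps."""
    return round(total_steps * warmup_ratio)

# 403 steps/epoch * 20 epochs = 8060 total steps -> 806 warmup steps.
```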
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.7289 | 1.0 | 403 | 0.2346 | 0.4923 | 0.6770 | 0.5700 | 0.9374 |
| 0.1461 | 2.0 | 806 | 0.1511 | 0.7226 | 0.7618 | 0.7417 | 0.9591 |
| 0.0786 | 3.0 | 1209 | 0.1685 | 0.7478 | 0.7896 | 0.7681 | 0.9606 |
| 0.0444 | 4.0 | 1612 | 0.1856 | 0.7923 | 0.8182 | 0.8050 | 0.9641 |
| 0.028 | 5.0 | 2015 | 0.2203 | 0.7529 | 0.7830 | 0.7677 | 0.9604 |
| 0.017 | 6.0 | 2418 | 0.2399 | 0.7545 | 0.7962 | 0.7748 | 0.9620 |
| 0.0118 | 7.0 | 2821 | 0.2602 | 0.7611 | 0.7921 | 0.7763 | 0.9617 |
| 0.0087 | 8.0 | 3224 | 0.2681 | 0.7722 | 0.8182 | 0.7945 | 0.9621 |
| 0.0062 | 9.0 | 3627 | 0.2789 | 0.7820 | 0.8296 | 0.8051 | 0.9641 |
| 0.0047 | 10.0 | 4030 | 0.2880 | 0.7632 | 0.8272 | 0.7939 | 0.9622 |
| 0.0034 | 11.0 | 4433 | 0.3139 | 0.7468 | 0.7639 | 0.7552 | 0.9597 |
| 0.003 | 12.0 | 4836 | 0.2834 | 0.7907 | 0.8418 | 0.8154 | 0.9648 |
| 0.002 | 13.0 | 5239 | 0.2971 | 0.7816 | 0.8213 | 0.8009 | 0.9636 |
| 0.0017 | 14.0 | 5642 | 0.3275 | 0.7582 | 0.7761 | 0.7670 | 0.9607 |
| 0.0017 | 15.0 | 6045 | 0.3233 | 0.7761 | 0.8161 | 0.7956 | 0.9622 |
| 0.0014 | 16.0 | 6448 | 0.3258 | 0.7696 | 0.7955 | 0.7824 | 0.9619 |
| 0.001 | 17.0 | 6851 | 0.3229 | 0.7859 | 0.8310 | 0.8078 | 0.9636 |
| 0.0007 | 18.0 | 7254 | 0.3228 | 0.7930 | 0.8248 | 0.8086 | 0.9634 |
| 0.0007 | 19.0 | 7657 | 0.3268 | 0.7887 | 0.8150 | 0.8016 | 0.9630 |
| 0.0005 | 20.0 | 8060 | 0.3315 | 0.7828 | 0.8157 | 0.7989 | 0.9626 |
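The F1 column is the harmonic mean of precision and recall, as a quick sanity check on the best row (epoch 12, the checkpoint reported at the top of the card) confirms:

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Epoch 12: precision 0.7907, recall 0.8418 -> F1 ~ 0.8154
best_f1 = f1_score(0.7907, 0.8418)
```

(The small residual difference comes from the reported metrics being rounded to four decimals.)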
### Framework versions
- Transformers 4.48.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.2
- Tokenizers 0.21.0