# cellate2.0_epoch50-tapt_base-LR_5e-05
This model is a fine-tuned version of microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.4333
- Accuracy: 0.7092
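The model name suggests task-adaptive pretraining (TAPT), which is typically done with a masked language modeling objective, so the checkpoint can most likely be loaded with the standard `transformers` Auto classes. Below is a minimal sketch; the repository id and the masked-LM head are assumptions, not confirmed by this card.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Hypothetical repository id -- replace with the actual Hub path of this checkpoint.
model_id = "cellate2.0_epoch50-tapt_base-LR_5e-05"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)  # assumes an MLM head, consistent with TAPT

# Fill-mask demo on a biomedical-style sentence (the base model is BERT-style and uncased).
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill("The patient was treated with [MASK] for hypertension."))
```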
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 3407
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-06; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 50
- mixed_precision_training: Native AMP
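For reference, these settings map roughly onto the following `transformers` `TrainingArguments`. This is a sketch reconstructed from the list above, not the original training script; the output directory and evaluation/logging strategies are assumptions (the per-epoch strategies are inferred from the results table below).

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="cellate2.0_epoch50-tapt_base-LR_5e-05",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=3407,
    gradient_accumulation_steps=2,   # effective train batch size: 64
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-6,
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=50,
    fp16=True,                       # "Native AMP" mixed precision
    eval_strategy="epoch",           # assumed from the per-epoch results table
    logging_strategy="epoch",
)
```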
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 1.3889 | 1.0 | 4 | 1.3281 | 0.7334 |
| 1.4376 | 2.0 | 8 | 1.2308 | 0.7499 |
| 1.3338 | 3.0 | 12 | 1.2157 | 0.7346 |
| 1.3264 | 4.0 | 16 | 1.3433 | 0.7153 |
| 1.2952 | 5.0 | 20 | 1.2775 | 0.7218 |
| 1.2784 | 6.0 | 24 | 1.2850 | 0.7232 |
| 1.2179 | 7.0 | 28 | 1.2227 | 0.7433 |
| 1.1625 | 8.0 | 32 | 1.3195 | 0.7340 |
| 1.1926 | 9.0 | 36 | 1.2699 | 0.7341 |
| 1.1811 | 10.0 | 40 | 1.2405 | 0.7405 |
| 1.1412 | 11.0 | 44 | 1.2696 | 0.7353 |
| 1.1551 | 12.0 | 48 | 1.3050 | 0.7211 |
| 1.0894 | 13.0 | 52 | 1.2682 | 0.7238 |
| 1.1266 | 14.0 | 56 | 1.2505 | 0.7317 |
| 1.0564 | 15.0 | 60 | 1.3044 | 0.7369 |
| 1.0411 | 16.0 | 64 | 1.2263 | 0.7327 |
| 0.9432 | 17.0 | 68 | 1.2190 | 0.7371 |
| 0.9654 | 18.0 | 72 | 1.3135 | 0.7290 |
| 1.0505 | 19.0 | 76 | 1.3215 | 0.7347 |
| 0.9939 | 20.0 | 80 | 1.1755 | 0.7373 |
| 0.9733 | 21.0 | 84 | 1.3522 | 0.7163 |
| 0.9815 | 22.0 | 88 | 1.2843 | 0.7245 |
| 1.0134 | 23.0 | 92 | 1.3200 | 0.7224 |
| 0.947 | 24.0 | 96 | 1.3056 | 0.7132 |
| 0.9595 | 25.0 | 100 | 1.3061 | 0.7312 |
| 0.9537 | 26.0 | 104 | 1.3493 | 0.7229 |
| 0.9314 | 27.0 | 108 | 1.3301 | 0.7273 |
| 0.9157 | 28.0 | 112 | 1.2704 | 0.7294 |
| 0.873 | 29.0 | 116 | 1.3250 | 0.7182 |
| 0.9179 | 30.0 | 120 | 1.3470 | 0.7100 |
| 0.928 | 31.0 | 124 | 1.3172 | 0.7281 |
| 0.8795 | 32.0 | 128 | 1.3178 | 0.7176 |
| 0.9012 | 33.0 | 132 | 1.3029 | 0.7217 |
| 0.8892 | 34.0 | 136 | 1.3133 | 0.7246 |
| 0.8002 | 35.0 | 140 | 1.3613 | 0.7138 |
| 0.92 | 36.0 | 144 | 1.4253 | 0.7084 |
| 0.8269 | 37.0 | 148 | 1.3419 | 0.7168 |
| 0.8169 | 38.0 | 152 | 1.4301 | 0.7143 |
| 0.8665 | 39.0 | 156 | 1.3304 | 0.7202 |
| 0.8762 | 40.0 | 160 | 1.3398 | 0.7188 |
| 0.8101 | 41.0 | 164 | 1.2547 | 0.7264 |
| 0.7907 | 42.0 | 168 | 1.3545 | 0.7158 |
| 0.7832 | 43.0 | 172 | 1.3770 | 0.7096 |
| 0.8418 | 44.0 | 176 | 1.2982 | 0.7153 |
| 0.83 | 45.0 | 180 | 1.2395 | 0.7340 |
| 0.865 | 46.0 | 184 | 1.3671 | 0.7219 |
| 0.8728 | 47.0 | 188 | 1.3113 | 0.7263 |
| 0.8318 | 48.0 | 192 | 1.4179 | 0.7199 |
| 0.8659 | 49.0 | 196 | 1.3168 | 0.7286 |
| 0.8022 | 50.0 | 200 | 1.4333 | 0.7092 |
### Framework versions
- Transformers 4.48.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.2
- Tokenizers 0.21.0