# indic-bert-v2-mlm-only-dra-tam-mal-aw-classification-lora
This model is a LoRA fine-tuned version of [ai4bharat/IndicBERTv2-MLM-only](https://huggingface.co/ai4bharat/IndicBERTv2-MLM-only) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.5697
- Accuracy: 0.7286
- F1: 0.7161
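The card does not state how F1 is averaged (binary, macro, or weighted). As a point of reference, the sketch below shows plain accuracy and binary F1 computed from scratch; treat the binary-F1 choice as an assumption, and the `preds`/`labels` values as made-up illustration data.

```python
# Sketch of the reported metrics (assumption: binary classification,
# F1 taken over the positive class only).
def accuracy(preds, labels):
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def f1_binary(preds, labels):
    tp = sum(p == 1 and l == 1 for p, l in zip(preds, labels))
    fp = sum(p == 1 and l == 0 for p, l in zip(preds, labels))
    fn = sum(p == 0 and l == 1 for p, l in zip(preds, labels))
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# Toy example: 4 of 6 predictions correct.
preds  = [1, 0, 1, 1, 0, 0]
labels = [1, 0, 0, 1, 0, 1]
print(accuracy(preds, labels))   # 0.666...
print(f1_binary(preds, labels))  # 0.666... (precision = recall = 2/3)
```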
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
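The hyperparameters above imply a linear learning-rate decay from 1e-4 to 0 over training. A minimal sketch of that schedule, assuming zero warmup steps (none are listed) and the 540 total steps shown in the results table:

```python
# Linear LR schedule per the hyperparameters above: decay from
# learning_rate = 1e-4 to 0 over 540 steps (6 epochs x 90 steps).
# Assumption: no warmup, since the card lists none.
LEARNING_RATE = 1e-4
TOTAL_STEPS = 540

def lr_at(step, base_lr=LEARNING_RATE, total=TOTAL_STEPS):
    """Learning rate after `step` optimizer steps."""
    return base_lr * max(0.0, (total - step) / total)

print(lr_at(0))    # 1e-4 at the start
print(lr_at(270))  # 5e-5 at the halfway point
print(lr_at(540))  # 0.0 at the end
```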
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| 0.692 | 0.2222 | 20 | 0.6852 | 0.5998 | 0.3808 |
| 0.6858 | 0.4444 | 40 | 0.6873 | 0.5110 | 0.6503 |
| 0.6875 | 0.6667 | 60 | 0.6765 | 0.6414 | 0.5501 |
| 0.6771 | 0.8889 | 80 | 0.6787 | 0.5403 | 0.6606 |
| 0.6754 | 1.1111 | 100 | 0.6652 | 0.6601 | 0.5341 |
| 0.6687 | 1.3333 | 120 | 0.6604 | 0.6919 | 0.6625 |
| 0.6625 | 1.5556 | 140 | 0.6579 | 0.6585 | 0.6926 |
| 0.6577 | 1.7778 | 160 | 0.6494 | 0.6764 | 0.6958 |
| 0.652 | 2.0 | 180 | 0.6402 | 0.6903 | 0.6945 |
| 0.6359 | 2.2222 | 200 | 0.6366 | 0.6626 | 0.7030 |
| 0.6416 | 2.4444 | 220 | 0.6188 | 0.7148 | 0.6841 |
| 0.637 | 2.6667 | 240 | 0.6122 | 0.7123 | 0.7075 |
| 0.6287 | 2.8889 | 260 | 0.6054 | 0.7115 | 0.7141 |
| 0.6177 | 3.1111 | 280 | 0.5991 | 0.7205 | 0.7120 |
| 0.6045 | 3.3333 | 300 | 0.5937 | 0.7188 | 0.7013 |
| 0.6021 | 3.5556 | 320 | 0.5917 | 0.7172 | 0.7091 |
| 0.6022 | 3.7778 | 340 | 0.5877 | 0.7139 | 0.6829 |
| 0.614 | 4.0 | 360 | 0.5881 | 0.7164 | 0.7220 |
| 0.617 | 4.2222 | 380 | 0.5824 | 0.7196 | 0.7119 |
| 0.5753 | 4.4444 | 400 | 0.5794 | 0.7172 | 0.7120 |
| 0.5698 | 4.6667 | 420 | 0.5818 | 0.7180 | 0.7267 |
| 0.5929 | 4.8889 | 440 | 0.5747 | 0.7196 | 0.7133 |
| 0.5718 | 5.1111 | 460 | 0.5722 | 0.7278 | 0.7060 |
| 0.5878 | 5.3333 | 480 | 0.5715 | 0.7205 | 0.7115 |
| 0.5661 | 5.5556 | 500 | 0.5704 | 0.7262 | 0.7153 |
| 0.5982 | 5.7778 | 520 | 0.5695 | 0.7278 | 0.7136 |
| 0.588 | 6.0 | 540 | 0.5697 | 0.7286 | 0.7161 |
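Since the card does not describe the training data, the table itself gives a rough size estimate: 540 steps over 6 epochs is 90 steps per epoch, which at a train batch size of 64 suggests roughly 5,700-5,800 training examples, assuming one optimizer step per batch (no gradient accumulation is listed).

```python
# Back-of-the-envelope training-set size from the results table.
# Assumption: one optimizer step per batch (no gradient accumulation).
total_steps = 540
num_epochs = 6
train_batch_size = 64

steps_per_epoch = total_steps // num_epochs        # 90
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)      # 90 5760
```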
### Framework versions
- Transformers 4.45.2
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.20.3