# roberta-large-ner-ghtk-cs-add-label-600-new-data-3090-30Aug-1

This model is a fine-tuned version of Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.2413

| Label | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Tk | 1.0000 | 0.6983 | 0.8223 | 116 |
| Gày | 0.7500 | 0.8824 | 0.8108 | 34 |
| Gày trừu tượng | 0.9148 | 0.9242 | 0.9195 | 488 |
| Iờ | 0.7895 | 0.7895 | 0.7895 | 38 |
| Ã đơn | 0.8621 | 0.8621 | 0.8621 | 203 |
| Đt | 0.9245 | 0.9909 | 0.9566 | 878 |
| Đt trừu tượng | 0.8233 | 0.8798 | 0.8506 | 233 |
- Overall Precision: 0.9007
- Overall Recall: 0.9256
- Overall F1: 0.9130
- Overall Accuracy: 0.9667
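The overall precision, recall, and F1 are micro-averages over the per-label counts. A minimal pure-Python sketch (true-positive and predicted counts back-derived from the per-label precision, recall, and support reported above) reproduces them:

```python
# Per-label (true positives, predicted entities, gold entities), derived
# from the precision/recall/support values in the evaluation table above.
labels = {
    "Tk":             (81,  81,  116),
    "Gày":            (30,  40,  34),
    "Gày trừu tượng": (451, 493, 488),
    "Iờ":             (30,  38,  38),
    "Ã đơn":          (175, 203, 203),
    "Đt":             (870, 941, 878),
    "Đt trừu tượng":  (205, 249, 233),
}

tp   = sum(v[0] for v in labels.values())  # total true positives
pred = sum(v[1] for v in labels.values())  # total predicted entities
gold = sum(v[2] for v in labels.values())  # total gold entities

precision = tp / pred              # micro precision
recall    = tp / gold              # micro recall
f1        = 2 * tp / (pred + gold) # micro F1
```

Rounding to four decimals recovers the reported 0.9007 / 0.9256 / 0.9130.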
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
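With `lr_scheduler_type: linear` and no warmup (warmup settings are not listed above, so zero warmup is an assumption), the learning rate decays linearly from 2.5e-05 to 0 over the 5110 training steps (511 steps/epoch × 10 epochs, per the results table below). A small sketch of that schedule:

```python
BASE_LR = 2.5e-5
TOTAL_STEPS = 5110  # 511 steps/epoch * 10 epochs

def linear_lr(step: int) -> float:
    """Linearly decayed learning rate, assuming zero warmup steps."""
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / TOTAL_STEPS)
```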
### Training results
Per-label cells show precision / recall / F1 (support: Tk = 116, Gày = 34, Gày trừu tượng = 488, Iờ = 38, Ã đơn = 203, Đt = 878, Đt trừu tượng = 233).

| Training Loss | Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0934 | 1.0 | 511 | 0.1373 | 0.7273 / 0.8966 / 0.8031 | 0.8276 / 0.7059 / 0.7619 | 0.9374 / 0.8586 / 0.8963 | 0.5714 / 0.4211 / 0.4848 | 0.8789 / 0.8227 / 0.8499 | 0.9475 / 0.9863 / 0.9665 | 0.7661 / 0.8155 / 0.7900 | 0.8934 | 0.8975 | 0.8955 | 0.9608 |
| 0.0541 | 2.0 | 1022 | 0.1776 | 0.8447 / 0.7500 / 0.7945 | 0.6905 / 0.8529 / 0.7632 | 0.9151 / 0.9057 / 0.9104 | 0.6250 / 0.7895 / 0.6977 | 0.7944 / 0.8374 / 0.8153 | 0.9409 / 0.9784 / 0.9592 | 0.5836 / 0.9142 / 0.7124 | 0.8441 | 0.9196 | 0.8802 | 0.9559 |
| 0.0451 | 3.0 | 1533 | 0.1978 | 0.9831 / 0.5000 / 0.6629 | 0.7500 / 0.7941 / 0.7714 | 0.8512 / 0.9262 / 0.8871 | 0.8333 / 0.3947 / 0.5357 | 0.8798 / 0.7931 / 0.8342 | 0.9219 / 0.9818 / 0.9509 | 0.8509 / 0.8326 / 0.8416 | 0.8889 | 0.8889 | 0.8889 | 0.9605 |
| 0.0335 | 4.0 | 2044 | 0.1841 | 0.9891 / 0.7845 / 0.8750 | 0.7442 / 0.9412 / 0.8312 | 0.9078 / 0.9078 / 0.9078 | 0.6800 / 0.8947 / 0.7727 | 0.6988 / 0.8571 / 0.7699 | 0.9325 / 0.9909 / 0.9608 | 0.7300 / 0.9399 / 0.8218 | 0.8645 | 0.9362 | 0.8989 | 0.9604 |
| 0.0237 | 5.0 | 2555 | 0.2069 | 0.9600 / 0.8276 / 0.8889 | 0.7714 / 0.7941 / 0.7826 | 0.8836 / 0.9180 / 0.9005 | 0.6250 / 0.7895 / 0.6977 | 0.8769 / 0.8424 / 0.8593 | 0.9385 / 0.9909 / 0.9640 | 0.8985 / 0.7597 / 0.8233 | 0.9054 | 0.9141 | 0.9097 | 0.9514 |
| 0.0159 | 6.0 | 3066 | 0.1977 | 0.9457 / 0.7500 / 0.8365 | 0.7222 / 0.7647 / 0.7429 | 0.8990 / 0.9119 / 0.9054 | 0.7941 / 0.7105 / 0.7500 | 0.8564 / 0.8522 / 0.8543 | 0.9278 / 0.9954 / 0.9604 | 0.7790 / 0.9227 / 0.8448 | 0.8893 | 0.9281 | 0.9083 | 0.9664 |
| 0.0124 | 7.0 | 3577 | 0.2122 | 0.9877 / 0.6897 / 0.8122 | 0.6923 / 0.7941 / 0.7397 | 0.9220 / 0.9201 / 0.9210 | 0.7353 / 0.6579 / 0.6944 | 0.8529 / 0.8571 / 0.8550 | 0.9206 / 0.9909 / 0.9545 | 0.9151 / 0.8326 / 0.8719 | 0.9086 | 0.9141 | 0.9113 | 0.9665 |
| 0.0070 | 8.0 | 4088 | 0.2185 | 1.0000 / 0.7500 / 0.8571 | 0.7368 / 0.8235 / 0.7778 | 0.9204 / 0.9242 / 0.9223 | 0.7838 / 0.7632 / 0.7733 | 0.8317 / 0.8522 / 0.8418 | 0.9236 / 0.9909 / 0.9560 | 0.8238 / 0.8627 / 0.8428 | 0.8988 | 0.9241 | 0.9113 | 0.9660 |
| 0.0047 | 9.0 | 4599 | 0.2423 | 1.0000 / 0.7759 / 0.8738 | 0.7561 / 0.9118 / 0.8267 | 0.9150 / 0.9262 / 0.9206 | 0.7500 / 0.7895 / 0.7692 | 0.8219 / 0.8867 / 0.8531 | 0.9325 / 0.9909 / 0.9608 | 0.8016 / 0.8841 / 0.8408 | 0.8963 | 0.9342 | 0.9149 | 0.9652 |
| 0.0028 | 10.0 | 5110 | 0.2413 | 1.0000 / 0.6983 / 0.8223 | 0.7500 / 0.8824 / 0.8108 | 0.9148 / 0.9242 / 0.9195 | 0.7895 / 0.7895 / 0.7895 | 0.8621 / 0.8621 / 0.8621 | 0.9245 / 0.9909 / 0.9566 | 0.8233 / 0.8798 / 0.8506 | 0.9007 | 0.9256 | 0.9130 | 0.9667 |
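Note that overall F1 peaked at epoch 9 (0.9149), slightly above the epoch-10 checkpoint reported above (0.9130). A trivial sketch selecting the best epoch from the table (values transcribed from the Overall F1 column):

```python
# (epoch, overall F1) pairs transcribed from the training results table.
history = [(1, 0.8955), (2, 0.8802), (3, 0.8889), (4, 0.8989), (5, 0.9097),
           (6, 0.9083), (7, 0.9113), (8, 0.9113), (9, 0.9149), (10, 0.9130)]

# Pick the epoch with the highest overall F1.
best_epoch, best_f1 = max(history, key=lambda pair: pair[1])
```

With `load_best_model_at_end` and a suitable `metric_for_best_model` (not listed among the hyperparameters above), the Trainer could retain that checkpoint instead.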
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
### Base model

This model descends from FacebookAI/xlm-roberta-large, via the intermediate fine-tune Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2.