# roberta-large-ner-ghtk-cs-add-label-new-data-3090-21Aug-1
This model is a fine-tuned version of Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2 on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.2759
- Overall Precision: 0.8605
- Overall Recall: 0.9162
- Overall F1: 0.8875
- Overall Accuracy: 0.9587

Per-entity results:

| Entity | Precision | Recall | F1 | Support |
|:--|--:|--:|--:|--:|
| Tk | 0.9726 | 0.6121 | 0.7513 | 116 |
| Gày | 0.5745 | 0.8182 | 0.6750 | 33 |
| Gày trừu tượng | 0.9109 | 0.8758 | 0.8930 | 467 |
| Iờ | 0.5373 | 0.9474 | 0.6857 | 38 |
| Ã đơn | 0.7925 | 0.8442 | 0.8175 | 199 |
| Đt | 0.9152 | 0.9954 | 0.9536 | 878 |
| Đt trừu tượng | 0.7351 | 0.9206 | 0.8174 | 214 |
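The overall precision and recall are micro-averages over the per-entity counts. A minimal sketch of that aggregation, reconstructing true-positive and prediction counts from the per-entity precision, recall, and support reported above (seqeval computes the same quantities directly from the tag sequences):

```python
# Micro-averaged precision/recall/F1 from per-entity (precision, recall, support).
# Values copied from the final evaluation results above.
entities = {
    "Tk":             (0.9726027397260274, 0.6120689655172413, 116),
    "Gày":            (0.574468085106383,  0.8181818181818182, 33),
    "Gày trừu tượng": (0.910913140311804,  0.8758029978586723, 467),
    "Iờ":             (0.5373134328358209, 0.9473684210526315, 38),
    "Ã đơn":          (0.7924528301886793, 0.8442211055276382, 199),
    "Đt":             (0.9151832460732985, 0.9954441913439636, 878),
    "Đt trừu tượng":  (0.7350746268656716, 0.9205607476635514, 214),
}

tp = pred = gold = 0.0
for precision, recall, support in entities.values():
    true_pos = recall * support   # correctly predicted entities of this type
    tp += true_pos
    pred += true_pos / precision  # total predicted entities of this type
    gold += support               # total gold entities of this type

micro_p = tp / pred
micro_r = tp / gold
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
print(f"{micro_p:.4f} {micro_r:.4f} {micro_f1:.4f}")  # ≈ 0.8605 0.9162 0.8875
```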
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
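With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 2.5e-05 at step 0 down to 0 at the final step (480 total: 48 steps/epoch × 10 epochs, per the results table). A minimal sketch of that schedule, mirroring the shape of transformers' linear scheduler with zero warmup (assumed, since no warmup is reported):

```python
# Linear LR decay with optional warmup, as produced by the `linear`
# scheduler type (warmup assumed to be 0 for this run).
def linear_lr(step: int, total_steps: int, base_lr: float, warmup_steps: int = 0) -> float:
    if step < warmup_steps:
        # Ramp up linearly from 0 to base_lr during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr after warmup to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

BASE_LR, TOTAL = 2.5e-05, 480  # 48 steps/epoch x 10 epochs
print(linear_lr(0, TOTAL, BASE_LR))    # full base LR at the start
print(linear_lr(240, TOTAL, BASE_LR))  # half the base LR at the midpoint
print(linear_lr(480, TOTAL, BASE_LR))  # zero at the end
```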
### Training results

Per-entity cells show precision / recall / F1; supports are constant across epochs (Tk 116, Gày 33, Gày trừu tượng 467, Iờ 38, Ã đơn 199, Đt 878, Đt trừu tượng 214). Training loss was not logged during the run ("No log" in the Trainer output).

| Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|--:|--:|--:|:--|:--|:--|:--|:--|:--|:--|--:|--:|--:|--:|
| 1.0 | 48 | 0.1764 | 1.0000/0.5862/0.7391 | 0.6078/0.9394/0.7381 | 0.9176/0.8822/0.8996 | 0.5469/0.9211/0.6863 | 0.8646/0.8342/0.8491 | 0.9194/0.9875/0.9522 | 0.6613/0.9673/0.7856 | 0.8587 | 0.9183 | 0.8875 | 0.9605 |
| 2.0 | 96 | 0.2257 | 1.0000/0.5862/0.7391 | 0.5957/0.8485/0.7000 | 0.8494/0.9058/0.8767 | 0.4500/0.9474/0.6102 | 0.7936/0.8693/0.8297 | 0.9106/0.9977/0.9522 | 0.6478/0.9112/0.7573 | 0.8275 | 0.9249 | 0.8735 | 0.9542 |
| 3.0 | 144 | 0.1888 | 0.9775/0.7500/0.8488 | 0.5600/0.8485/0.6747 | 0.8987/0.8737/0.8860 | 0.5373/0.9474/0.6857 | 0.8390/0.8643/0.8515 | 0.9315/0.9909/0.9603 | 0.7311/0.9019/0.8075 | 0.8696 | 0.9224 | 0.8952 | 0.9611 |
| 4.0 | 192 | 0.2152 | 0.9722/0.6034/0.7447 | 0.5435/0.7576/0.6329 | 0.9040/0.8266/0.8635 | 0.5833/0.9211/0.7143 | 0.8473/0.8643/0.8557 | 0.9343/0.9886/0.9607 | 0.8017/0.8879/0.8426 | 0.8845 | 0.8977 | 0.8910 | 0.9618 |
| 5.0 | 240 | 0.2340 | 0.9659/0.7328/0.8333 | 0.5577/0.8788/0.6824 | 0.9093/0.8587/0.8833 | 0.5538/0.9474/0.6990 | 0.8143/0.8593/0.8362 | 0.9255/0.9909/0.9571 | 0.7063/0.9439/0.8080 | 0.8617 | 0.9224 | 0.8910 | 0.9584 |
| 6.0 | 288 | 0.2668 | 0.9767/0.7241/0.8317 | 0.5510/0.8182/0.6585 | 0.9341/0.8501/0.8901 | 0.5714/0.9474/0.7129 | 0.7857/0.8844/0.8322 | 0.9180/0.9943/0.9546 | 0.9064/0.8598/0.8825 | 0.8881 | 0.9136 | 0.9007 | 0.9606 |
| 7.0 | 336 | 0.2606 | 0.9333/0.7241/0.8155 | 0.5870/0.8182/0.6835 | 0.9122/0.8672/0.8891 | 0.5714/0.9474/0.7129 | 0.8187/0.7487/0.7822 | 0.9219/0.9954/0.9573 | 0.7190/0.9206/0.8074 | 0.8657 | 0.9111 | 0.8878 | 0.9569 |
| 8.0 | 384 | 0.2756 | 0.9390/0.6638/0.7778 | 0.5745/0.8182/0.6750 | 0.8847/0.9036/0.8941 | 0.5217/0.9474/0.6729 | 0.7972/0.8492/0.8224 | 0.9161/0.9954/0.9541 | 0.6897/0.9346/0.7937 | 0.8470 | 0.9280 | 0.8857 | 0.9582 |
| 9.0 | 432 | 0.2734 | 0.9726/0.6121/0.7513 | 0.5870/0.8182/0.6835 | 0.9071/0.8779/0.8923 | 0.5455/0.9474/0.6923 | 0.8038/0.8442/0.8235 | 0.9152/0.9954/0.9536 | 0.7313/0.9159/0.8133 | 0.8613 | 0.9162 | 0.8879 | 0.9587 |
| 10.0 | 480 | 0.2759 | 0.9726/0.6121/0.7513 | 0.5745/0.8182/0.6750 | 0.9109/0.8758/0.8930 | 0.5373/0.9474/0.6857 | 0.7925/0.8442/0.8175 | 0.9152/0.9954/0.9536 | 0.7351/0.9206/0.8174 | 0.8605 | 0.9162 | 0.8875 | 0.9587 |
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
### Model tree

Base model: FacebookAI/xlm-roberta-large