# roberta-large-ner-ghtk-cs-add-2label-10-new-data-3090-14Sep-1
This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.2814
- Tk: {'precision': 0.8181818181818182, 'recall': 0.6982758620689655, 'f1': 0.7534883720930233, 'number': 116}
- A: {'precision': 0.92, 'recall': 0.9605568445475638, 'f1': 0.9398410896708287, 'number': 431}
- Gày: {'precision': 0.7209302325581395, 'recall': 0.9117647058823529, 'f1': 0.8051948051948051, 'number': 34}
- Gày trừu tượng: {'precision': 0.8959183673469387, 'recall': 0.8995901639344263, 'f1': 0.8977505112474438, 'number': 488}
- Iờ: {'precision': 0.6170212765957447, 'recall': 0.7631578947368421, 'f1': 0.6823529411764706, 'number': 38}
- Ã đơn: {'precision': 0.8542713567839196, 'recall': 0.8374384236453202, 'f1': 0.845771144278607, 'number': 203}
- Đt: {'precision': 0.9354838709677419, 'recall': 0.9908883826879271, 'f1': 0.9623893805309734, 'number': 878}
- Đt trừu tượng: {'precision': 0.7683823529411765, 'recall': 0.8969957081545065, 'f1': 0.8277227722772277, 'number': 233}
- Overall Precision: 0.8866
- Overall Recall: 0.9265
- Overall F1: 0.9061
- Overall Accuracy: 0.9580
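Each per-label F1 above is the harmonic mean of the label's precision and recall, which can be cross-checked in a few lines (the numbers below are the Đt row from the list above):

```python
# Cross-check: F1 is the harmonic mean of precision and recall.
# Values taken from the "Đt" entry in the evaluation results above.
precision = 0.9354838709677419
recall = 0.9908883826879271

f1 = 2 * precision * recall / (precision + recall)
print(f1)  # matches the reported 0.9623893805309734
```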
## Model description
More information needed
## Intended uses & limitations
More information needed
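The per-label scores above are entity-level (seqeval-style): token predictions in BIO format are first grouped into spans, and a span counts as correct only if both its label and its boundaries match. A minimal sketch of that grouping step, using an illustrative tag sequence (the exact tag names in the model's config are not documented here):

```python
def bio_to_spans(tags):
    """Group a BIO tag sequence into (label, start, end) spans; end is exclusive."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-") or (tag.startswith("I-") and label != tag[2:]):
            # A new entity begins; close any span still open.
            if label is not None:
                spans.append((label, start, i))
            start, label = i, tag[2:]
        elif tag == "O":
            if label is not None:
                spans.append((label, start, i))
            start, label = None, None
    if label is not None:
        spans.append((label, start, len(tags)))
    return spans

# Illustrative only: these tag names are not taken from the model's config.
print(bio_to_spans(["B-Đt", "I-Đt", "O", "B-A"]))
# → [('Đt', 0, 2), ('A', 3, 4)]
```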
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
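The hyperparameters above map onto Hugging Face `TrainingArguments` roughly as follows. This is a hedged sketch, not the authors' actual training script; `output_dir` is hypothetical, and the Adam and scheduler settings shown are the library defaults that already match the listed values:

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the hyperparameters listed above.
args = TrainingArguments(
    output_dir="out",                  # hypothetical; not stated in the card
    learning_rate=2.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                    # transformers default
    adam_beta2=0.999,                  # transformers default
    adam_epsilon=1e-8,                 # transformers default
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```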
### Training results
| Training Loss | Epoch | Step | Validation Loss | Tk | A | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 274 | 0.1978 | {'precision': 0.78, 'recall': 0.6724137931034483, 'f1': 0.7222222222222223, 'number': 116} | {'precision': 0.9051724137931034, 'recall': 0.974477958236659, 'f1': 0.9385474860335196, 'number': 431} | {'precision': 0.6808510638297872, 'recall': 0.9411764705882353, 'f1': 0.7901234567901235, 'number': 34} | {'precision': 0.8621359223300971, 'recall': 0.9098360655737705, 'f1': 0.8853439680957128, 'number': 488} | {'precision': 0.5833333333333334, 'recall': 0.9210526315789473, 'f1': 0.7142857142857143, 'number': 38} | {'precision': 0.8813559322033898, 'recall': 0.7684729064039408, 'f1': 0.8210526315789475, 'number': 203} | {'precision': 0.9145833333333333, 'recall': 1.0, 'f1': 0.955386289445049, 'number': 878} | {'precision': 0.6537313432835821, 'recall': 0.9399141630901288, 'f1': 0.7711267605633803, 'number': 233} | 0.8510 | 0.9343 | 0.8907 | 0.9535 |
| 0.0773 | 2.0 | 548 | 0.1664 | {'precision': 0.8695652173913043, 'recall': 0.6896551724137931, 'f1': 0.7692307692307693, 'number': 116} | {'precision': 0.9157667386609071, 'recall': 0.9837587006960556, 'f1': 0.9485458612975392, 'number': 431} | {'precision': 0.6744186046511628, 'recall': 0.8529411764705882, 'f1': 0.7532467532467532, 'number': 34} | {'precision': 0.8725490196078431, 'recall': 0.9118852459016393, 'f1': 0.8917835671342685, 'number': 488} | {'precision': 0.6170212765957447, 'recall': 0.7631578947368421, 'f1': 0.6823529411764706, 'number': 38} | {'precision': 0.7772727272727272, 'recall': 0.8423645320197044, 'f1': 0.8085106382978724, 'number': 203} | {'precision': 0.9166666666666666, 'recall': 0.989749430523918, 'f1': 0.9518072289156626, 'number': 878} | {'precision': 0.7482014388489209, 'recall': 0.8927038626609443, 'f1': 0.8140900195694717, 'number': 233} | 0.8670 | 0.9314 | 0.8980 | 0.9549 |
| 0.0773 | 3.0 | 822 | 0.2053 | {'precision': 0.7087378640776699, 'recall': 0.6293103448275862, 'f1': 0.6666666666666667, 'number': 116} | {'precision': 0.9298642533936652, 'recall': 0.9535962877030162, 'f1': 0.9415807560137456, 'number': 431} | {'precision': 0.7575757575757576, 'recall': 0.7352941176470589, 'f1': 0.746268656716418, 'number': 34} | {'precision': 0.8693069306930693, 'recall': 0.8995901639344263, 'f1': 0.8841893252769385, 'number': 488} | {'precision': 0.6285714285714286, 'recall': 0.5789473684210527, 'f1': 0.6027397260273972, 'number': 38} | {'precision': 0.7837837837837838, 'recall': 0.8571428571428571, 'f1': 0.8188235294117646, 'number': 203} | {'precision': 0.9345493562231759, 'recall': 0.9920273348519362, 'f1': 0.9624309392265193, 'number': 878} | {'precision': 0.75, 'recall': 0.8884120171673819, 'f1': 0.8133595284872298, 'number': 233} | 0.8721 | 0.9178 | 0.8943 | 0.9548 |
| 0.0382 | 4.0 | 1096 | 0.1953 | {'precision': 0.7384615384615385, 'recall': 0.8275862068965517, 'f1': 0.7804878048780489, 'number': 116} | {'precision': 0.9258426966292135, 'recall': 0.9559164733178654, 'f1': 0.9406392694063928, 'number': 431} | {'precision': 0.6956521739130435, 'recall': 0.9411764705882353, 'f1': 0.7999999999999999, 'number': 34} | {'precision': 0.8742632612966601, 'recall': 0.9118852459016393, 'f1': 0.8926780341023068, 'number': 488} | {'precision': 0.5769230769230769, 'recall': 0.7894736842105263, 'f1': 0.6666666666666666, 'number': 38} | {'precision': 0.8907103825136612, 'recall': 0.8029556650246306, 'f1': 0.844559585492228, 'number': 203} | {'precision': 0.940152339499456, 'recall': 0.9840546697038725, 'f1': 0.9616026711185309, 'number': 878} | {'precision': 0.8237704918032787, 'recall': 0.8626609442060086, 'f1': 0.8427672955974843, 'number': 233} | 0.8873 | 0.9265 | 0.9064 | 0.9598 |
| 0.0382 | 5.0 | 1370 | 0.2185 | {'precision': 0.7666666666666667, 'recall': 0.7931034482758621, 'f1': 0.7796610169491527, 'number': 116} | {'precision': 0.9230769230769231, 'recall': 0.974477958236659, 'f1': 0.9480812641083521, 'number': 431} | {'precision': 0.6956521739130435, 'recall': 0.9411764705882353, 'f1': 0.7999999999999999, 'number': 34} | {'precision': 0.8928571428571429, 'recall': 0.9221311475409836, 'f1': 0.9072580645161291, 'number': 488} | {'precision': 0.5454545454545454, 'recall': 0.7894736842105263, 'f1': 0.6451612903225806, 'number': 38} | {'precision': 0.8901098901098901, 'recall': 0.7980295566502463, 'f1': 0.8415584415584415, 'number': 203} | {'precision': 0.9613259668508287, 'recall': 0.9908883826879271, 'f1': 0.975883342680875, 'number': 878} | {'precision': 0.7491039426523297, 'recall': 0.8969957081545065, 'f1': 0.8164062499999999, 'number': 233} | 0.8896 | 0.9356 | 0.9120 | 0.9602 |
| 0.0235 | 6.0 | 1644 | 0.2448 | {'precision': 0.85, 'recall': 0.7327586206896551, 'f1': 0.787037037037037, 'number': 116} | {'precision': 0.9116379310344828, 'recall': 0.9814385150812065, 'f1': 0.9452513966480447, 'number': 431} | {'precision': 0.7073170731707317, 'recall': 0.8529411764705882, 'f1': 0.7733333333333334, 'number': 34} | {'precision': 0.897119341563786, 'recall': 0.8934426229508197, 'f1': 0.8952772073921971, 'number': 488} | {'precision': 0.6078431372549019, 'recall': 0.8157894736842105, 'f1': 0.6966292134831461, 'number': 38} | {'precision': 0.8260869565217391, 'recall': 0.8423645320197044, 'f1': 0.8341463414634146, 'number': 203} | {'precision': 0.9382448537378115, 'recall': 0.9863325740318907, 'f1': 0.9616879511382567, 'number': 878} | {'precision': 0.7832699619771863, 'recall': 0.8841201716738197, 'f1': 0.8306451612903226, 'number': 233} | 0.8864 | 0.9281 | 0.9068 | 0.9570 |
| 0.0235 | 7.0 | 1918 | 0.2768 | {'precision': 0.8877551020408163, 'recall': 0.75, 'f1': 0.8130841121495328, 'number': 116} | {'precision': 0.9229074889867841, 'recall': 0.9721577726218097, 'f1': 0.9468926553672317, 'number': 431} | {'precision': 0.7560975609756098, 'recall': 0.9117647058823529, 'f1': 0.8266666666666665, 'number': 34} | {'precision': 0.8862275449101796, 'recall': 0.9098360655737705, 'f1': 0.8978766430738119, 'number': 488} | {'precision': 0.6486486486486487, 'recall': 0.631578947368421, 'f1': 0.64, 'number': 38} | {'precision': 0.9028571428571428, 'recall': 0.7783251231527094, 'f1': 0.835978835978836, 'number': 203} | {'precision': 0.9564245810055866, 'recall': 0.9749430523917996, 'f1': 0.9655950366610265, 'number': 878} | {'precision': 0.819672131147541, 'recall': 0.8583690987124464, 'f1': 0.8385744234800838, 'number': 233} | 0.9076 | 0.9166 | 0.9120 | 0.9602 |
| 0.0134 | 8.0 | 2192 | 0.2686 | {'precision': 0.801980198019802, 'recall': 0.6982758620689655, 'f1': 0.7465437788018433, 'number': 116} | {'precision': 0.9263392857142857, 'recall': 0.962877030162413, 'f1': 0.944254835039818, 'number': 431} | {'precision': 0.7619047619047619, 'recall': 0.9411764705882353, 'f1': 0.8421052631578947, 'number': 34} | {'precision': 0.9088983050847458, 'recall': 0.8790983606557377, 'f1': 0.89375, 'number': 488} | {'precision': 0.6530612244897959, 'recall': 0.8421052631578947, 'f1': 0.735632183908046, 'number': 38} | {'precision': 0.8736842105263158, 'recall': 0.8177339901477833, 'f1': 0.8447837150127226, 'number': 203} | {'precision': 0.9314775160599572, 'recall': 0.9908883826879271, 'f1': 0.9602649006622518, 'number': 878} | {'precision': 0.796812749003984, 'recall': 0.8583690987124464, 'f1': 0.8264462809917356, 'number': 233} | 0.8947 | 0.9190 | 0.9067 | 0.9591 |
| 0.0134 | 9.0 | 2466 | 0.2790 | {'precision': 0.8181818181818182, 'recall': 0.6982758620689655, 'f1': 0.7534883720930233, 'number': 116} | {'precision': 0.9301801801801802, 'recall': 0.9582366589327146, 'f1': 0.9440000000000001, 'number': 431} | {'precision': 0.7209302325581395, 'recall': 0.9117647058823529, 'f1': 0.8051948051948051, 'number': 34} | {'precision': 0.9037656903765691, 'recall': 0.8852459016393442, 'f1': 0.8944099378881988, 'number': 488} | {'precision': 0.6444444444444445, 'recall': 0.7631578947368421, 'f1': 0.6987951807228916, 'number': 38} | {'precision': 0.8535353535353535, 'recall': 0.8325123152709359, 'f1': 0.8428927680798005, 'number': 203} | {'precision': 0.9395248380129589, 'recall': 0.9908883826879271, 'f1': 0.9645232815964523, 'number': 878} | {'precision': 0.7675276752767528, 'recall': 0.8927038626609443, 'f1': 0.8253968253968255, 'number': 233} | 0.8918 | 0.9223 | 0.9068 | 0.9584 |
| 0.006 | 10.0 | 2740 | 0.2814 | {'precision': 0.8181818181818182, 'recall': 0.6982758620689655, 'f1': 0.7534883720930233, 'number': 116} | {'precision': 0.92, 'recall': 0.9605568445475638, 'f1': 0.9398410896708287, 'number': 431} | {'precision': 0.7209302325581395, 'recall': 0.9117647058823529, 'f1': 0.8051948051948051, 'number': 34} | {'precision': 0.8959183673469387, 'recall': 0.8995901639344263, 'f1': 0.8977505112474438, 'number': 488} | {'precision': 0.6170212765957447, 'recall': 0.7631578947368421, 'f1': 0.6823529411764706, 'number': 38} | {'precision': 0.8542713567839196, 'recall': 0.8374384236453202, 'f1': 0.845771144278607, 'number': 203} | {'precision': 0.9354838709677419, 'recall': 0.9908883826879271, 'f1': 0.9623893805309734, 'number': 878} | {'precision': 0.7683823529411765, 'recall': 0.8969957081545065, 'f1': 0.8277227722772277, 'number': 233} | 0.8866 | 0.9265 | 0.9061 | 0.9580 |
### Framework versions
- Transformers 4.44.0
- PyTorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1