roberta-large-ner-ghtk-cs-add-4label-20-new-data-3090-24Sep-1
This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.2822
| Label | Precision | Recall | F1 | Number |
|---|---|---|---|---|
| Tk | 0.8021 | 0.6638 | 0.7264 | 116 |
| A | 0.9369 | 0.9652 | 0.9509 | 431 |
| Gày | 0.6957 | 0.9412 | 0.8000 | 34 |
| Gày trừu tượng | 0.8992 | 0.9139 | 0.9065 | 488 |
| Iền | 0.7551 | 0.9487 | 0.8409 | 39 |
| Iờ | 0.6977 | 0.7895 | 0.7407 | 38 |
| Ã đơn | 0.8995 | 0.8374 | 0.8673 | 203 |
| Đt | 0.9166 | 0.9886 | 0.9512 | 878 |
| Đt trừu tượng | 0.7826 | 0.9270 | 0.8487 | 233 |
| Ịa chỉ cụ thể | 0.4898 | 0.5581 | 0.5217 | 43 |
- Overall Precision: 0.8789
- Overall Recall: 0.9253
- Overall F1: 0.9015
- Overall Accuracy: 0.9533
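The overall precision, recall, and F1 are micro-averages of the per-label results. As a sanity check, the sketch below reconstructs approximate true-positive and prediction counts from each label's precision, recall, and support (values copied from the final-epoch evaluation above, rounded to four decimals) and recomputes the micro scores:

```python
# Recompute the card's overall (micro-averaged) scores from the per-label
# evaluation results. Each tuple is (precision, recall, support), where
# support ("number") is the count of gold spans for that label.
per_label = {
    "Tk":             (0.8021, 0.6638, 116),
    "A":              (0.9369, 0.9652, 431),
    "Gày":            (0.6957, 0.9412, 34),
    "Gày trừu tượng": (0.8992, 0.9139, 488),
    "Iền":            (0.7551, 0.9487, 39),
    "Iờ":             (0.6977, 0.7895, 38),
    "Ã đơn":          (0.8995, 0.8374, 203),
    "Đt":             (0.9166, 0.9886, 878),
    "Đt trừu tượng":  (0.7826, 0.9270, 233),
    "Ịa chỉ cụ thể":  (0.4898, 0.5581, 43),
}

tp = pred = gold = 0
for precision, recall, support in per_label.values():
    label_tp = round(recall * support)   # recall = TP / gold spans
    tp += label_tp
    pred += round(label_tp / precision)  # precision = TP / predicted spans
    gold += support

micro_p = tp / pred
micro_r = tp / gold
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
print(round(micro_p, 4), round(micro_r, 4), round(micro_f1, 4))  # → 0.8789 0.9253 0.9015
```

The recovered micro scores match the Overall Precision, Recall, and F1 reported above.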
Model description
More information needed
Intended uses & limitations
More information needed
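The card leaves usage undocumented. Since this is a token-classification (NER) model, its raw output is one BIO tag per token; a minimal, dependency-free sketch of collapsing a BIO tag sequence into entity spans (the `LOC`/`PER` label names are illustrative, not the model's actual tag set):

```python
def bio_to_spans(tags):
    """Collapse a BIO tag sequence into (label, start, end) spans.

    `end` is exclusive. An I- tag that does not continue the previous
    entity is treated as the start of a new span (IOB2-style repair).
    """
    spans = []
    current = None  # (label, start index) of the span being built
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if current:
                spans.append((current[0], current[1], i))
            current = (tag[2:], i)
        elif tag.startswith("I-"):
            if current is None or current[0] != tag[2:]:
                if current:
                    spans.append((current[0], current[1], i))
                current = (tag[2:], i)  # repair a dangling I- tag
        else:  # "O"
            if current:
                spans.append((current[0], current[1], i))
                current = None
    if current:
        spans.append((current[0], current[1], len(tags)))
    return spans

print(bio_to_spans(["O", "B-LOC", "I-LOC", "O", "B-PER"]))  # → [('LOC', 1, 3), ('PER', 4, 5)]
```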
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
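These hyperparameters, together with the results table below, pin down the training length: 344 optimizer steps per epoch over 10 epochs gives 3440 total steps. The linear scheduler then decays the learning rate from 2.5e-05 toward zero over those steps; the sketch assumes no warmup, which this card does not specify.

```python
# Training-length arithmetic implied by the hyperparameters and results table.
steps_per_epoch = 344      # from the "Step" column of the results table
num_epochs = 10
total_steps = steps_per_epoch * num_epochs  # 3440, matching the last table row

base_lr = 2.5e-5

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay to 0 (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(total_steps, linear_lr(1720))  # halfway through training the LR has halved
```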
Training results
| Training Loss | Epoch | Step | Validation Loss | Tk | A | Gày | Gày trừu tượng | Iền | Iờ | Ã đơn | Đt | Đt trừu tượng | Ịa chỉ cụ thể | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 344 | 0.2528 | {'precision': 0.7747747747747747, 'recall': 0.7413793103448276, 'f1': 0.7577092511013217, 'number': 116} | {'precision': 0.9351851851851852, 'recall': 0.9373549883990719, 'f1': 0.936268829663963, 'number': 431} | {'precision': 0.8181818181818182, 'recall': 0.7941176470588235, 'f1': 0.8059701492537314, 'number': 34} | {'precision': 0.8682170542635659, 'recall': 0.9180327868852459, 'f1': 0.8924302788844621, 'number': 488} | {'precision': 0.6491228070175439, 'recall': 0.9487179487179487, 'f1': 0.7708333333333334, 'number': 39} | {'precision': 0.5614035087719298, 'recall': 0.8421052631578947, 'f1': 0.6736842105263158, 'number': 38} | {'precision': 0.896551724137931, 'recall': 0.7684729064039408, 'f1': 0.8275862068965517, 'number': 203} | {'precision': 0.9250535331905781, 'recall': 0.9840546697038725, 'f1': 0.9536423841059603, 'number': 878} | {'precision': 0.5699481865284974, 'recall': 0.944206008583691, 'f1': 0.7108239095315024, 'number': 233} | {'precision': 0.16666666666666666, 'recall': 0.4418604651162791, 'f1': 0.24203821656050953, 'number': 43} | 0.8149 | 0.9161 | 0.8625 | 0.9208 |
| 0.1661 | 2.0 | 688 | 0.1649 | {'precision': 0.788135593220339, 'recall': 0.8017241379310345, 'f1': 0.7948717948717949, 'number': 116} | {'precision': 0.9192139737991266, 'recall': 0.9767981438515081, 'f1': 0.9471316085489313, 'number': 431} | {'precision': 0.7209302325581395, 'recall': 0.9117647058823529, 'f1': 0.8051948051948051, 'number': 34} | {'precision': 0.9077868852459017, 'recall': 0.9077868852459017, 'f1': 0.9077868852459017, 'number': 488} | {'precision': 0.7708333333333334, 'recall': 0.9487179487179487, 'f1': 0.8505747126436781, 'number': 39} | {'precision': 0.5303030303030303, 'recall': 0.9210526315789473, 'f1': 0.673076923076923, 'number': 38} | {'precision': 0.8423645320197044, 'recall': 0.8423645320197044, 'f1': 0.8423645320197044, 'number': 203} | {'precision': 0.9090909090909091, 'recall': 0.9908883826879271, 'f1': 0.9482288828337874, 'number': 878} | {'precision': 0.7951807228915663, 'recall': 0.8497854077253219, 'f1': 0.8215767634854771, 'number': 233} | {'precision': 0.49056603773584906, 'recall': 0.6046511627906976, 'f1': 0.5416666666666665, 'number': 43} | 0.8666 | 0.9289 | 0.8966 | 0.9522 |
| 0.0801 | 3.0 | 1032 | 0.1779 | {'precision': 0.7241379310344828, 'recall': 0.5431034482758621, 'f1': 0.6206896551724138, 'number': 116} | {'precision': 0.9190371991247265, 'recall': 0.974477958236659, 'f1': 0.9459459459459458, 'number': 431} | {'precision': 0.7368421052631579, 'recall': 0.8235294117647058, 'f1': 0.7777777777777778, 'number': 34} | {'precision': 0.8376865671641791, 'recall': 0.9200819672131147, 'f1': 0.876953125, 'number': 488} | {'precision': 0.6909090909090909, 'recall': 0.9743589743589743, 'f1': 0.8085106382978723, 'number': 39} | {'precision': 0.6511627906976745, 'recall': 0.7368421052631579, 'f1': 0.6913580246913581, 'number': 38} | {'precision': 0.907608695652174, 'recall': 0.8226600985221675, 'f1': 0.8630490956072351, 'number': 203} | {'precision': 0.9197860962566845, 'recall': 0.979498861047836, 'f1': 0.948703805846663, 'number': 878} | {'precision': 0.707641196013289, 'recall': 0.9141630901287554, 'f1': 0.7977528089887641, 'number': 233} | {'precision': 0.4489795918367347, 'recall': 0.5116279069767442, 'f1': 0.47826086956521735, 'number': 43} | 0.8521 | 0.9141 | 0.8820 | 0.9492 |
| 0.0801 | 4.0 | 1376 | 0.2001 | {'precision': 0.7111111111111111, 'recall': 0.8275862068965517, 'f1': 0.7649402390438247, 'number': 116} | {'precision': 0.9134199134199135, 'recall': 0.9791183294663574, 'f1': 0.9451287793952968, 'number': 431} | {'precision': 0.7111111111111111, 'recall': 0.9411764705882353, 'f1': 0.8101265822784811, 'number': 34} | {'precision': 0.8530534351145038, 'recall': 0.9159836065573771, 'f1': 0.883399209486166, 'number': 488} | {'precision': 0.7659574468085106, 'recall': 0.9230769230769231, 'f1': 0.8372093023255814, 'number': 39} | {'precision': 0.6875, 'recall': 0.5789473684210527, 'f1': 0.6285714285714286, 'number': 38} | {'precision': 0.8549222797927462, 'recall': 0.812807881773399, 'f1': 0.8333333333333334, 'number': 203} | {'precision': 0.9284188034188035, 'recall': 0.989749430523918, 'f1': 0.958103638368247, 'number': 878} | {'precision': 0.8148148148148148, 'recall': 0.8497854077253219, 'f1': 0.8319327731092436, 'number': 233} | {'precision': 0.32558139534883723, 'recall': 0.32558139534883723, 'f1': 0.32558139534883723, 'number': 43} | 0.8650 | 0.9193 | 0.8913 | 0.9462 |
| 0.0538 | 5.0 | 1720 | 0.2157 | {'precision': 0.810126582278481, 'recall': 0.5517241379310345, 'f1': 0.6564102564102563, 'number': 116} | {'precision': 0.9213973799126638, 'recall': 0.9791183294663574, 'f1': 0.9493813273340832, 'number': 431} | {'precision': 0.6458333333333334, 'recall': 0.9117647058823529, 'f1': 0.7560975609756098, 'number': 34} | {'precision': 0.8955823293172691, 'recall': 0.9139344262295082, 'f1': 0.9046653144016227, 'number': 488} | {'precision': 0.7708333333333334, 'recall': 0.9487179487179487, 'f1': 0.8505747126436781, 'number': 39} | {'precision': 0.6, 'recall': 0.9473684210526315, 'f1': 0.7346938775510204, 'number': 38} | {'precision': 0.8944444444444445, 'recall': 0.7931034482758621, 'f1': 0.8407310704960836, 'number': 203} | {'precision': 0.89375, 'recall': 0.9772209567198178, 'f1': 0.9336235038084875, 'number': 878} | {'precision': 0.8178137651821862, 'recall': 0.8669527896995708, 'f1': 0.8416666666666666, 'number': 233} | {'precision': 0.4473684210526316, 'recall': 0.3953488372093023, 'f1': 0.41975308641975306, 'number': 43} | 0.8693 | 0.9085 | 0.8885 | 0.9479 |
| 0.0329 | 6.0 | 2064 | 0.2184 | {'precision': 0.8118811881188119, 'recall': 0.7068965517241379, 'f1': 0.7557603686635945, 'number': 116} | {'precision': 0.9351230425055929, 'recall': 0.9698375870069605, 'f1': 0.9521640091116174, 'number': 431} | {'precision': 0.7272727272727273, 'recall': 0.9411764705882353, 'f1': 0.8205128205128205, 'number': 34} | {'precision': 0.9090909090909091, 'recall': 0.8811475409836066, 'f1': 0.8949011446409989, 'number': 488} | {'precision': 0.7115384615384616, 'recall': 0.9487179487179487, 'f1': 0.8131868131868132, 'number': 39} | {'precision': 0.5396825396825397, 'recall': 0.8947368421052632, 'f1': 0.6732673267326732, 'number': 38} | {'precision': 0.8492462311557789, 'recall': 0.8325123152709359, 'f1': 0.8407960199004975, 'number': 203} | {'precision': 0.9197860962566845, 'recall': 0.979498861047836, 'f1': 0.948703805846663, 'number': 878} | {'precision': 0.8108108108108109, 'recall': 0.9012875536480687, 'f1': 0.8536585365853658, 'number': 233} | {'precision': 0.3898305084745763, 'recall': 0.5348837209302325, 'f1': 0.4509803921568628, 'number': 43} | 0.8720 | 0.9169 | 0.8939 | 0.9529 |
| 0.0329 | 7.0 | 2408 | 0.2351 | {'precision': 0.7738095238095238, 'recall': 0.5603448275862069, 'f1': 0.65, 'number': 116} | {'precision': 0.9336283185840708, 'recall': 0.9791183294663574, 'f1': 0.9558323895809738, 'number': 431} | {'precision': 0.6956521739130435, 'recall': 0.9411764705882353, 'f1': 0.7999999999999999, 'number': 34} | {'precision': 0.8942115768463074, 'recall': 0.9180327868852459, 'f1': 0.9059656218402427, 'number': 488} | {'precision': 0.8043478260869565, 'recall': 0.9487179487179487, 'f1': 0.8705882352941177, 'number': 39} | {'precision': 0.6666666666666666, 'recall': 0.8947368421052632, 'f1': 0.7640449438202247, 'number': 38} | {'precision': 0.8956043956043956, 'recall': 0.8029556650246306, 'f1': 0.8467532467532467, 'number': 203} | {'precision': 0.8966942148760331, 'recall': 0.9886104783599089, 'f1': 0.9404117009750812, 'number': 878} | {'precision': 0.823076923076923, 'recall': 0.9184549356223176, 'f1': 0.8681541582150101, 'number': 233} | {'precision': 0.47058823529411764, 'recall': 0.5581395348837209, 'f1': 0.5106382978723404, 'number': 43} | 0.8735 | 0.9217 | 0.8970 | 0.9545 |
| 0.0188 | 8.0 | 2752 | 0.2681 | {'precision': 0.8173076923076923, 'recall': 0.7327586206896551, 'f1': 0.7727272727272727, 'number': 116} | {'precision': 0.9352678571428571, 'recall': 0.9721577726218097, 'f1': 0.9533560864618885, 'number': 431} | {'precision': 0.7272727272727273, 'recall': 0.9411764705882353, 'f1': 0.8205128205128205, 'number': 34} | {'precision': 0.8983739837398373, 'recall': 0.9057377049180327, 'f1': 0.9020408163265305, 'number': 488} | {'precision': 0.7872340425531915, 'recall': 0.9487179487179487, 'f1': 0.8604651162790696, 'number': 39} | {'precision': 0.6122448979591837, 'recall': 0.7894736842105263, 'f1': 0.6896551724137931, 'number': 38} | {'precision': 0.8961748633879781, 'recall': 0.8078817733990148, 'f1': 0.849740932642487, 'number': 203} | {'precision': 0.9460352422907489, 'recall': 0.9783599088838268, 'f1': 0.961926091825308, 'number': 878} | {'precision': 0.8185328185328186, 'recall': 0.9098712446351931, 'f1': 0.8617886178861789, 'number': 233} | {'precision': 0.3793103448275862, 'recall': 0.5116279069767442, 'f1': 0.43564356435643564, 'number': 43} | 0.8881 | 0.9197 | 0.9036 | 0.9533 |
| 0.0106 | 9.0 | 3096 | 0.2713 | {'precision': 0.7978723404255319, 'recall': 0.646551724137931, 'f1': 0.7142857142857143, 'number': 116} | {'precision': 0.9352678571428571, 'recall': 0.9721577726218097, 'f1': 0.9533560864618885, 'number': 431} | {'precision': 0.7111111111111111, 'recall': 0.9411764705882353, 'f1': 0.8101265822784811, 'number': 34} | {'precision': 0.9046653144016227, 'recall': 0.9139344262295082, 'f1': 0.9092762487257898, 'number': 488} | {'precision': 0.7708333333333334, 'recall': 0.9487179487179487, 'f1': 0.8505747126436781, 'number': 39} | {'precision': 0.64, 'recall': 0.8421052631578947, 'f1': 0.7272727272727272, 'number': 38} | {'precision': 0.8854166666666666, 'recall': 0.8374384236453202, 'f1': 0.8607594936708861, 'number': 203} | {'precision': 0.9127234490010515, 'recall': 0.9886104783599089, 'f1': 0.9491525423728814, 'number': 878} | {'precision': 0.8412698412698413, 'recall': 0.9098712446351931, 'f1': 0.8742268041237113, 'number': 233} | {'precision': 0.5, 'recall': 0.5116279069767442, 'f1': 0.5057471264367817, 'number': 43} | 0.8838 | 0.9241 | 0.9035 | 0.9529 |
| 0.0106 | 10.0 | 3440 | 0.2822 | {'precision': 0.8020833333333334, 'recall': 0.6637931034482759, 'f1': 0.7264150943396228, 'number': 116} | {'precision': 0.9369369369369369, 'recall': 0.9651972157772621, 'f1': 0.9508571428571428, 'number': 431} | {'precision': 0.6956521739130435, 'recall': 0.9411764705882353, 'f1': 0.7999999999999999, 'number': 34} | {'precision': 0.8991935483870968, 'recall': 0.9139344262295082, 'f1': 0.9065040650406504, 'number': 488} | {'precision': 0.7551020408163265, 'recall': 0.9487179487179487, 'f1': 0.8409090909090908, 'number': 39} | {'precision': 0.6976744186046512, 'recall': 0.7894736842105263, 'f1': 0.7407407407407408, 'number': 38} | {'precision': 0.8994708994708994, 'recall': 0.8374384236453202, 'f1': 0.8673469387755102, 'number': 203} | {'precision': 0.9165786694825766, 'recall': 0.9886104783599089, 'f1': 0.9512328767123288, 'number': 878} | {'precision': 0.782608695652174, 'recall': 0.927038626609442, 'f1': 0.8487229862475442, 'number': 233} | {'precision': 0.4897959183673469, 'recall': 0.5581395348837209, 'f1': 0.5217391304347826, 'number': 43} | 0.8789 | 0.9253 | 0.9015 | 0.9533 |
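Note that overall F1 peaks at epoch 8 (0.9036), slightly above the final epoch (0.9015), while validation loss rises steadily after epoch 2 — a hint that an earlier checkpoint may generalize better. A quick sketch over the per-epoch numbers transcribed from the table above:

```python
# Overall F1 per epoch, copied from the "Overall F1" column of the results table.
overall_f1 = {
    1: 0.8625, 2: 0.8966, 3: 0.8820, 4: 0.8913, 5: 0.8885,
    6: 0.8939, 7: 0.8970, 8: 0.9036, 9: 0.9035, 10: 0.9015,
}
best_epoch = max(overall_f1, key=overall_f1.get)
print(best_epoch, overall_f1[best_epoch])  # → 8 0.9036
```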
Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1