# roberta-large-ner-ghtk-cs-add-2label-50-new-data-3090-14Sep-1
This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.2865

Per-entity results (support = number of gold entities):

| Entity | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Tk | 0.7570 | 0.6983 | 0.7265 | 116 |
| A | 0.9559 | 0.9559 | 0.9559 | 431 |
| Gày | 0.7209 | 0.9118 | 0.8052 | 34 |
| Gày trừu tượng | 0.9004 | 0.9078 | 0.9041 | 488 |
| Iờ | 0.6136 | 0.7105 | 0.6585 | 38 |
| Ã đơn | 0.8860 | 0.8424 | 0.8636 | 203 |
| Đt | 0.9454 | 0.9852 | 0.9649 | 878 |
| Đt trừu tượng | 0.7897 | 0.8541 | 0.8206 | 233 |

- Overall Precision: 0.8999
- Overall Recall: 0.9207
- Overall F1: 0.9102
- Overall Accuracy: 0.9616
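As a sanity check, the overall numbers follow from the per-entity results: micro-averaged recall is total true positives divided by total gold entities, and F1 is the harmonic mean of overall precision and recall. A minimal sketch in plain Python, using the per-entity recalls and supports reported above:

```python
# Per-entity (recall, support) pairs copied from the final evaluation above.
per_entity = {
    "Tk":             (0.6982758620689655, 116),
    "A":              (0.9559164733178654, 431),
    "Gày":            (0.9117647058823529, 34),
    "Gày trừu tượng": (0.9077868852459017, 488),
    "Iờ":             (0.7105263157894737, 38),
    "Ã đơn":          (0.8423645320197044, 203),
    "Đt":             (0.9851936218678815, 878),
    "Đt trừu tượng":  (0.8540772532188842, 233),
}

def micro_recall(entries):
    """Micro-averaged recall: total true positives / total gold entities.
    recall_i * support_i recovers the true-positive count for entity i."""
    true_positives = sum(r * n for r, n in entries.values())
    total_support = sum(n for _, n in entries.values())
    return true_positives / total_support

def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

overall_recall = micro_recall(per_entity)   # ≈ 0.9207, as reported
overall_f1 = f1(0.8999, overall_recall)     # ≈ 0.9102, as reported
```

The same micro-averaging (over entity spans, not tokens) is what `seqeval`-style NER evaluation reports as the "overall" metrics.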
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
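The `linear` scheduler decays the learning rate from its initial value to zero over the whole run; with 396 steps per epoch and 10 epochs (per the results table), that is 3,960 optimizer steps. A small sketch of the schedule, assuming no warmup since none is listed above:

```python
def linear_lr(step, base_lr=2.5e-5, total_steps=3960):
    """Learning rate at a given optimizer step under a linear decay
    schedule with no warmup: base_lr at step 0, 0.0 at total_steps.
    total_steps = 396 steps/epoch * 10 epochs, from the results table."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

For example, halfway through training (step 1,980) the learning rate has fallen to half its initial value, 1.25e-05.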
### Training results
Per-entity columns list precision / recall / F1. Supports per entity: Tk 116, A 431, Gày 34, Gày trừu tượng 488, Iờ 38, Ã đơn 203, Đt 878, Đt trừu tượng 233.

| Training Loss | Epoch | Step | Validation Loss | Tk (P/R/F1) | A (P/R/F1) | Gày (P/R/F1) | Gày trừu tượng (P/R/F1) | Iờ (P/R/F1) | Ã đơn (P/R/F1) | Đt (P/R/F1) | Đt trừu tượng (P/R/F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 396 | 0.1707 | 0.7684 / 0.6293 / 0.6919 | 0.9211 / 0.9745 / 0.9470 | 0.7500 / 0.8824 / 0.8108 | 0.8651 / 0.9201 / 0.8918 | 0.4828 / 0.7368 / 0.5833 | 0.8722 / 0.7734 / 0.8198 | 0.9315 / 0.9909 / 0.9603 | 0.6914 / 0.9614 / 0.8043 | 0.8638 | 0.9298 | 0.8956 | 0.9557 |
| 0.0853 | 2.0 | 792 | 0.1964 | 0.8132 / 0.6379 / 0.7150 | 0.9728 / 0.9118 / 0.9413 | 0.6829 / 0.8235 / 0.7467 | 0.9621 / 0.8320 / 0.8923 | 0.5818 / 0.8421 / 0.6882 | 0.7613 / 0.9113 / 0.8296 | 0.9274 / 0.9897 / 0.9576 | 0.8246 / 0.8069 / 0.8156 | 0.8984 | 0.8984 | 0.8984 | 0.9563 |
| 0.0429 | 3.0 | 1188 | 0.2005 | 0.7604 / 0.6293 / 0.6887 | 0.9214 / 0.9791 / 0.9494 | 0.7381 / 0.9118 / 0.8158 | 0.9216 / 0.8914 / 0.9063 | 0.8462 / 0.2895 / 0.4314 | 0.8729 / 0.7783 / 0.8229 | 0.9331 / 0.9852 / 0.9584 | 0.7396 / 0.9142 / 0.8177 | 0.8914 | 0.9120 | 0.9016 | 0.9582 |
| 0.0343 | 4.0 | 1584 | 0.1970 | 0.7391 / 0.7328 / 0.7359 | 0.9344 / 0.9582 / 0.9462 | 0.6522 / 0.8824 / 0.7500 | 0.8702 / 0.9201 / 0.8944 | 0.6304 / 0.7632 / 0.6905 | 0.7822 / 0.8670 / 0.8224 | 0.9464 / 0.9863 / 0.9660 | 0.7491 / 0.8970 / 0.8164 | 0.8735 | 0.9323 | 0.9019 | 0.9578 |
| 0.0343 | 5.0 | 1980 | 0.2266 | 0.7570 / 0.6983 / 0.7265 | 0.9622 / 0.9443 / 0.9532 | 0.7045 / 0.9118 / 0.7949 | 0.9067 / 0.9160 / 0.9113 | 0.6458 / 0.8158 / 0.7209 | 0.7860 / 0.8325 / 0.8086 | 0.9466 / 0.9897 / 0.9677 | 0.8286 / 0.8712 / 0.8494 | 0.8977 | 0.9244 | 0.9109 | 0.9592 |
| 0.0238 | 6.0 | 2376 | 0.2266 | 0.7500 / 0.7500 / 0.7500 | 0.9427 / 0.9536 / 0.9481 | 0.7143 / 0.8824 / 0.7895 | 0.9182 / 0.8975 / 0.9078 | 0.6364 / 0.7368 / 0.6829 | 0.9048 / 0.8424 / 0.8724 | 0.9558 / 0.9852 / 0.9703 | 0.7649 / 0.8798 / 0.8184 | 0.9023 | 0.9232 | 0.9126 | 0.9624 |
| 0.014 | 7.0 | 2772 | 0.2611 | 0.7340 / 0.5948 / 0.6571 | 0.9732 / 0.9281 / 0.9501 | 0.7273 / 0.9412 / 0.8205 | 0.9047 / 0.9139 / 0.9093 | 0.6444 / 0.7632 / 0.6988 | 0.8947 / 0.8374 / 0.8651 | 0.9192 / 0.9852 / 0.9511 | 0.7829 / 0.8670 / 0.8228 | 0.8938 | 0.9141 | 0.9038 | 0.9607 |
| 0.0094 | 8.0 | 3168 | 0.2767 | 0.7423 / 0.6207 / 0.6761 | 0.9564 / 0.9675 / 0.9619 | 0.6818 / 0.8824 / 0.7692 | 0.9014 / 0.8996 / 0.9005 | 0.6444 / 0.7632 / 0.6988 | 0.8769 / 0.8424 / 0.8593 | 0.9382 / 0.9852 / 0.9611 | 0.7509 / 0.8798 / 0.8103 | 0.8916 | 0.9203 | 0.9057 | 0.9608 |
| 0.0041 | 9.0 | 3564 | 0.2789 | 0.7451 / 0.6552 / 0.6972 | 0.9518 / 0.9629 / 0.9573 | 0.7209 / 0.9118 / 0.8052 | 0.9004 / 0.9078 / 0.9041 | 0.6341 / 0.6842 / 0.6582 | 0.8906 / 0.8424 / 0.8658 | 0.9402 / 0.9852 / 0.9622 | 0.7804 / 0.8541 / 0.8156 | 0.8972 | 0.9195 | 0.9082 | 0.9620 |
| 0.0041 | 10.0 | 3960 | 0.2865 | 0.7570 / 0.6983 / 0.7265 | 0.9559 / 0.9559 / 0.9559 | 0.7209 / 0.9118 / 0.8052 | 0.9004 / 0.9078 / 0.9041 | 0.6136 / 0.7105 / 0.6585 | 0.8860 / 0.8424 / 0.8636 | 0.9454 / 0.9852 / 0.9649 | 0.7897 / 0.8541 / 0.8206 | 0.8999 | 0.9207 | 0.9102 | 0.9616 |
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1