roberta-large-ner-ghtk-cs-add-2label-new-data-3090-14Sep-1
This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.2751
- Tk: precision 0.7903, recall 0.8448, F1 0.8167 (support: 116)
- A: precision 0.9432, recall 0.9629, F1 0.9529 (support: 431)
- Gày: precision 0.7143, recall 0.8824, F1 0.7895 (support: 34)
- Gày trừu tượng: precision 0.9112, recall 0.9037, F1 0.9074 (support: 488)
- Iờ: precision 0.6905, recall 0.7632, F1 0.7250 (support: 38)
- Ã đơn: precision 0.8600, recall 0.8473, F1 0.8536 (support: 203)
- Đt: precision 0.9509, recall 0.9932, F1 0.9716 (support: 878)
- Đt trừu tượng: precision 0.7852, recall 0.9099, F1 0.8429 (support: 233)
- Overall Precision: 0.9008
- Overall Recall: 0.9372
- Overall F1: 0.9186
- Overall Accuracy: 0.9637
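The overall figures are micro-averages over the per-entity counts. As a sanity check, they can be reconstructed from the per-entity precision, recall, and support figures (taken at full precision from the epoch-10 row of the training results table below); this is a small sketch, with the entity names kept exactly as reported:

```python
# Reconstruct the overall (micro-averaged) precision/recall/F1 from the
# per-entity precision, recall, and support ("number") values in this card.
per_entity = {
    "Tk":             (0.7903225806451613, 0.8448275862068966, 116),
    "A":              (0.9431818181818182, 0.962877030162413,  431),
    "Gày":            (0.7142857142857143, 0.8823529411764706, 34),
    "Gày trừu tượng": (0.9111570247933884, 0.9036885245901639, 488),
    "Iờ":             (0.6904761904761905, 0.7631578947368421, 38),
    "Ã đơn":          (0.86,               0.8472906403940886, 203),
    "Đt":             (0.95092693565976,   0.9931662870159453, 878),
    "Đt trừu tượng":  (0.7851851851851852, 0.9098712446351931, 233),
}

true_positives = predicted_total = support_total = 0.0
for precision, recall, support in per_entity.values():
    tp = recall * support              # spans of this class that were found
    true_positives += tp
    predicted_total += tp / precision  # all spans predicted as this class
    support_total += support

overall_precision = true_positives / predicted_total
overall_recall = true_positives / support_total
overall_f1 = 2 * overall_precision * overall_recall / (overall_precision + overall_recall)

print(round(overall_precision, 4), round(overall_recall, 4), round(overall_f1, 4))
# → 0.9008 0.9372 0.9186 (matching the overall numbers above)
```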
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
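With a linear scheduler, the learning rate decays from 2.5e-05 to zero over training. A minimal sketch of that schedule, assuming no warmup (none is listed above) and 3960 total optimizer steps (396 steps per epoch × 10 epochs, per the results table below):

```python
BASE_LR = 2.5e-5
TOTAL_STEPS = 3960  # 396 optimizer steps per epoch x 10 epochs

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS, warmup_steps=0):
    """Linear warmup then linear decay to zero, mirroring the behavior of
    transformers' linear schedule (here with warmup_steps=0)."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0), linear_lr(1980), linear_lr(3960))
# → 2.5e-05 1.25e-05 0.0
```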
Training results
| Training Loss | Epoch | Step | Validation Loss | Tk | A | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 396 | 0.1786 | {'precision': 0.7410714285714286, 'recall': 0.7155172413793104, 'f1': 0.7280701754385966, 'number': 116} | {'precision': 0.9142857142857143, 'recall': 0.9651972157772621, 'f1': 0.9390519187358917, 'number': 431} | {'precision': 0.7045454545454546, 'recall': 0.9117647058823529, 'f1': 0.794871794871795, 'number': 34} | {'precision': 0.8877755511022044, 'recall': 0.9077868852459017, 'f1': 0.8976697061803446, 'number': 488} | {'precision': 0.5384615384615384, 'recall': 0.9210526315789473, 'f1': 0.6796116504854369, 'number': 38} | {'precision': 0.8492462311557789, 'recall': 0.8325123152709359, 'f1': 0.8407960199004975, 'number': 203} | {'precision': 0.9314775160599572, 'recall': 0.9908883826879271, 'f1': 0.9602649006622518, 'number': 878} | {'precision': 0.7013422818791947, 'recall': 0.8969957081545065, 'f1': 0.7871939736346517, 'number': 233} | 0.8657 | 0.9318 | 0.8976 | 0.9559 |
| 0.0857 | 2.0 | 792 | 0.1725 | {'precision': 0.7122302158273381, 'recall': 0.853448275862069, 'f1': 0.776470588235294, 'number': 116} | {'precision': 0.9173913043478261, 'recall': 0.9791183294663574, 'f1': 0.9472502805836139, 'number': 431} | {'precision': 0.6078431372549019, 'recall': 0.9117647058823529, 'f1': 0.7294117647058823, 'number': 34} | {'precision': 0.923728813559322, 'recall': 0.8934426229508197, 'f1': 0.9083333333333333, 'number': 488} | {'precision': 0.5892857142857143, 'recall': 0.868421052631579, 'f1': 0.7021276595744681, 'number': 38} | {'precision': 0.7154150197628458, 'recall': 0.8916256157635468, 'f1': 0.793859649122807, 'number': 203} | {'precision': 0.9510337323177367, 'recall': 0.9954441913439636, 'f1': 0.9727323316638843, 'number': 878} | {'precision': 0.78515625, 'recall': 0.8626609442060086, 'f1': 0.8220858895705522, 'number': 233} | 0.8738 | 0.9405 | 0.9059 | 0.9541 |
| 0.0425 | 3.0 | 1188 | 0.1866 | {'precision': 0.8241758241758241, 'recall': 0.646551724137931, 'f1': 0.7246376811594202, 'number': 116} | {'precision': 0.9375, 'recall': 0.974477958236659, 'f1': 0.9556313993174061, 'number': 431} | {'precision': 0.7272727272727273, 'recall': 0.9411764705882353, 'f1': 0.8205128205128205, 'number': 34} | {'precision': 0.8866799204771372, 'recall': 0.9139344262295082, 'f1': 0.900100908173562, 'number': 488} | {'precision': 0.5757575757575758, 'recall': 0.5, 'f1': 0.5352112676056339, 'number': 38} | {'precision': 0.8756476683937824, 'recall': 0.8325123152709359, 'f1': 0.8535353535353536, 'number': 203} | {'precision': 0.8954918032786885, 'recall': 0.9954441913439636, 'f1': 0.9428263214670981, 'number': 878} | {'precision': 0.7838827838827839, 'recall': 0.9184549356223176, 'f1': 0.8458498023715415, 'number': 233} | 0.8782 | 0.9290 | 0.9029 | 0.9614 |
| 0.0323 | 4.0 | 1584 | 0.2327 | {'precision': 0.8, 'recall': 0.6551724137931034, 'f1': 0.7203791469194314, 'number': 116} | {'precision': 0.970873786407767, 'recall': 0.9280742459396751, 'f1': 0.9489916963226571, 'number': 431} | {'precision': 0.673469387755102, 'recall': 0.9705882352941176, 'f1': 0.7951807228915663, 'number': 34} | {'precision': 0.9192546583850931, 'recall': 0.9098360655737705, 'f1': 0.9145211122554068, 'number': 488} | {'precision': 0.65, 'recall': 0.6842105263157895, 'f1': 0.6666666666666667, 'number': 38} | {'precision': 0.726530612244898, 'recall': 0.8768472906403941, 'f1': 0.7946428571428572, 'number': 203} | {'precision': 0.9368879216539717, 'recall': 0.9806378132118451, 'f1': 0.9582637729549248, 'number': 878} | {'precision': 0.8447488584474886, 'recall': 0.7939914163090128, 'f1': 0.8185840707964601, 'number': 233} | 0.8948 | 0.9100 | 0.9023 | 0.9532 |
| 0.0323 | 5.0 | 1980 | 0.1994 | {'precision': 0.8380952380952381, 'recall': 0.7586206896551724, 'f1': 0.7963800904977376, 'number': 116} | {'precision': 0.9576470588235294, 'recall': 0.9443155452436195, 'f1': 0.9509345794392524, 'number': 431} | {'precision': 0.7380952380952381, 'recall': 0.9117647058823529, 'f1': 0.8157894736842106, 'number': 34} | {'precision': 0.900990099009901, 'recall': 0.9323770491803278, 'f1': 0.9164149043303121, 'number': 488} | {'precision': 0.6296296296296297, 'recall': 0.8947368421052632, 'f1': 0.7391304347826088, 'number': 38} | {'precision': 0.8882978723404256, 'recall': 0.8226600985221675, 'f1': 0.8542199488491049, 'number': 203} | {'precision': 0.958980044345898, 'recall': 0.9851936218678815, 'f1': 0.9719101123595506, 'number': 878} | {'precision': 0.7581227436823105, 'recall': 0.9012875536480687, 'f1': 0.8235294117647058, 'number': 233} | 0.9035 | 0.9323 | 0.9177 | 0.9645 |
| 0.0233 | 6.0 | 2376 | 0.2238 | {'precision': 0.776, 'recall': 0.8362068965517241, 'f1': 0.8049792531120333, 'number': 116} | {'precision': 0.9345372460496614, 'recall': 0.9605568445475638, 'f1': 0.9473684210526316, 'number': 431} | {'precision': 0.6976744186046512, 'recall': 0.8823529411764706, 'f1': 0.7792207792207793, 'number': 34} | {'precision': 0.9005964214711729, 'recall': 0.9282786885245902, 'f1': 0.9142280524722503, 'number': 488} | {'precision': 0.6222222222222222, 'recall': 0.7368421052631579, 'f1': 0.6746987951807228, 'number': 38} | {'precision': 0.8802083333333334, 'recall': 0.8325123152709359, 'f1': 0.8556962025316456, 'number': 203} | {'precision': 0.9665178571428571, 'recall': 0.9863325740318907, 'f1': 0.9763246899661782, 'number': 878} | {'precision': 0.8388429752066116, 'recall': 0.871244635193133, 'f1': 0.854736842105263, 'number': 233} | 0.9080 | 0.9335 | 0.9206 | 0.9654 |
| 0.0132 | 7.0 | 2772 | 0.2547 | {'precision': 0.8349514563106796, 'recall': 0.7413793103448276, 'f1': 0.7853881278538813, 'number': 116} | {'precision': 0.9569377990430622, 'recall': 0.9280742459396751, 'f1': 0.9422850412249707, 'number': 431} | {'precision': 0.7209302325581395, 'recall': 0.9117647058823529, 'f1': 0.8051948051948051, 'number': 34} | {'precision': 0.9216494845360824, 'recall': 0.9159836065573771, 'f1': 0.9188078108941418, 'number': 488} | {'precision': 0.7222222222222222, 'recall': 0.6842105263157895, 'f1': 0.7027027027027027, 'number': 38} | {'precision': 0.8802083333333334, 'recall': 0.8325123152709359, 'f1': 0.8556962025316456, 'number': 203} | {'precision': 0.9506037321624589, 'recall': 0.9863325740318907, 'f1': 0.9681386249301285, 'number': 878} | {'precision': 0.7509433962264151, 'recall': 0.8540772532188842, 'f1': 0.7991967871485944, 'number': 233} | 0.9066 | 0.9186 | 0.9126 | 0.9623 |
| 0.0078 | 8.0 | 3168 | 0.2535 | {'precision': 0.7903225806451613, 'recall': 0.8448275862068966, 'f1': 0.8166666666666667, 'number': 116} | {'precision': 0.9387755102040817, 'recall': 0.9605568445475638, 'f1': 0.9495412844036697, 'number': 431} | {'precision': 0.7045454545454546, 'recall': 0.9117647058823529, 'f1': 0.794871794871795, 'number': 34} | {'precision': 0.9195876288659793, 'recall': 0.9139344262295082, 'f1': 0.9167523124357656, 'number': 488} | {'precision': 0.6818181818181818, 'recall': 0.7894736842105263, 'f1': 0.7317073170731707, 'number': 38} | {'precision': 0.87, 'recall': 0.8571428571428571, 'f1': 0.8635235732009925, 'number': 203} | {'precision': 0.9570957095709571, 'recall': 0.9908883826879271, 'f1': 0.9736989367655289, 'number': 878} | {'precision': 0.803030303030303, 'recall': 0.9098712446351931, 'f1': 0.8531187122736418, 'number': 233} | 0.9060 | 0.9397 | 0.9225 | 0.9646 |
| 0.0038 | 9.0 | 3564 | 0.2686 | {'precision': 0.765625, 'recall': 0.8448275862068966, 'f1': 0.8032786885245901, 'number': 116} | {'precision': 0.9494252873563218, 'recall': 0.9582366589327146, 'f1': 0.9538106235565819, 'number': 431} | {'precision': 0.6818181818181818, 'recall': 0.8823529411764706, 'f1': 0.7692307692307693, 'number': 34} | {'precision': 0.8997995991983968, 'recall': 0.9200819672131147, 'f1': 0.9098277608915907, 'number': 488} | {'precision': 0.6744186046511628, 'recall': 0.7631578947368421, 'f1': 0.7160493827160495, 'number': 38} | {'precision': 0.8826530612244898, 'recall': 0.8522167487684729, 'f1': 0.8671679197994987, 'number': 203} | {'precision': 0.9510337323177367, 'recall': 0.9954441913439636, 'f1': 0.9727323316638843, 'number': 878} | {'precision': 0.7709090909090909, 'recall': 0.9098712446351931, 'f1': 0.8346456692913385, 'number': 233} | 0.8972 | 0.9409 | 0.9185 | 0.9642 |
| 0.0038 | 10.0 | 3960 | 0.2751 | {'precision': 0.7903225806451613, 'recall': 0.8448275862068966, 'f1': 0.8166666666666667, 'number': 116} | {'precision': 0.9431818181818182, 'recall': 0.962877030162413, 'f1': 0.9529276693455798, 'number': 431} | {'precision': 0.7142857142857143, 'recall': 0.8823529411764706, 'f1': 0.7894736842105262, 'number': 34} | {'precision': 0.9111570247933884, 'recall': 0.9036885245901639, 'f1': 0.9074074074074073, 'number': 488} | {'precision': 0.6904761904761905, 'recall': 0.7631578947368421, 'f1': 0.725, 'number': 38} | {'precision': 0.86, 'recall': 0.8472906403940886, 'f1': 0.8535980148883374, 'number': 203} | {'precision': 0.95092693565976, 'recall': 0.9931662870159453, 'f1': 0.9715877437325905, 'number': 878} | {'precision': 0.7851851851851852, 'recall': 0.9098712446351931, 'f1': 0.8429423459244533, 'number': 233} | 0.9008 | 0.9372 | 0.9186 | 0.9637 |
Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
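To reproduce this environment, the versions above can be pinned at install time. This is a sketch; the CUDA 12.1 wheel index for PyTorch is an assumption based on the `+cu121` build tag:

```shell
# Install the CUDA 12.1 build of PyTorch matching "2.3.1+cu121"
pip install torch==2.3.1 --index-url https://download.pytorch.org/whl/cu121
# Pin the Hugging Face libraries to the versions listed above
pip install transformers==4.44.0 datasets==2.19.1 tokenizers==0.19.1
```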