roberta-large-ner-ghtk-cs-add-2label-33-new-data-3090-14Sep-1

This model was trained on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2991
  • Overall Precision: 0.8870
  • Overall Recall: 0.9207
  • Overall F1: 0.9035
  • Overall Accuracy: 0.9598

Per-label results:

| Label | Precision | Recall | F1 | Support |
|:--|--:|--:|--:|--:|
| Tk | 0.7283 | 0.5776 | 0.6442 | 116 |
| A | 0.9456 | 0.9675 | 0.9564 | 431 |
| Gày | 0.7442 | 0.9412 | 0.8312 | 34 |
| Gày trừu tượng | 0.9117 | 0.9098 | 0.9108 | 488 |
| Iờ | 0.6250 | 0.7895 | 0.6977 | 38 |
| Ã đơn | 0.8667 | 0.8325 | 0.8492 | 203 |
| Đt | 0.9288 | 0.9806 | 0.9540 | 878 |
| Đt trừu tượng | 0.7464 | 0.8970 | 0.8148 | 233 |
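The overall precision, recall, and F1 above are micro-averages of the per-label results. Since each label's support is an integer entity count, the true-positive and predicted-entity counts can be recovered from precision, recall, and support, and the overall numbers re-derived as a sanity check (overall accuracy is token-level and cannot be reconstructed this way):

```python
# Per-label (precision, recall, support) from the evaluation set above.
per_label = {
    "Tk": (0.7282608695652174, 0.5775862068965517, 116),
    "A": (0.9455782312925171, 0.9675174013921114, 431),
    "Gày": (0.7441860465116279, 0.9411764705882353, 34),
    "Gày trừu tượng": (0.9117043121149897, 0.9098360655737705, 488),
    "Iờ": (0.625, 0.7894736842105263, 38),
    "Ã đơn": (0.8666666666666667, 0.8325123152709359, 203),
    "Đt": (0.9288025889967637, 0.9806378132118451, 878),
    "Đt trừu tượng": (0.7464285714285714, 0.8969957081545065, 233),
}

tp = pred = support = 0
for precision, recall, n in per_label.values():
    label_tp = round(recall * n)         # true positives are integer counts
    tp += label_tp
    pred += round(label_tp / precision)  # predicted entities for this label
    support += n

micro_p = tp / pred
micro_r = tp / support
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
print(round(micro_p, 4), round(micro_r, 4), round(micro_f1, 4))
# → 0.887 0.9207 0.9035
```

This reproduces the reported Overall Precision 0.8870, Recall 0.9207, and F1 0.9035 (2229 true positives over 2513 predicted and 2421 gold entities).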

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2.5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10

Training results

Per-label cells show precision / recall / F1; supports are constant across epochs (Tk 116, A 431, Gày 34, Gày trừu tượng 488, Iờ 38, Ã đơn 203, Đt 878, Đt trừu tượng 233).

| Training Loss | Epoch | Step | Validation Loss | Tk | A | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|--:|--:|--:|--:|:--|:--|:--|:--|:--|:--|:--|:--|--:|--:|--:|--:|
| No log | 1.0 | 348 | 0.1690 | 0.8000 / 0.7586 / 0.7788 | 0.9523 / 0.9258 / 0.9388 | 0.7111 / 0.9412 / 0.8101 | 0.8947 / 0.9057 / 0.9002 | 0.5660 / 0.7895 / 0.6593 | 0.7850 / 0.8276 / 0.8058 | 0.9448 / 0.9943 / 0.9689 | 0.8272 / 0.8627 / 0.8445 | 0.8925 | 0.9223 | 0.9072 | 0.9559 |
| 0.0833 | 2.0 | 696 | 0.1926 | 0.6600 / 0.5690 / 0.6111 | 0.9299 / 0.9536 / 0.9416 | 0.6200 / 0.9118 / 0.7381 | 0.8772 / 0.9078 / 0.8922 | 0.5600 / 0.7368 / 0.6364 | 0.7851 / 0.8818 / 0.8306 | 0.9048 / 0.9852 / 0.9433 | 0.7769 / 0.8369 / 0.8058 | 0.8590 | 0.9162 | 0.8867 | 0.9538 |
| 0.0488 | 3.0 | 1044 | 0.1996 | 0.7308 / 0.6552 / 0.6909 | 0.9667 / 0.9420 / 0.9542 | 0.6739 / 0.9118 / 0.7750 | 0.8665 / 0.9180 / 0.8915 | 0.5614 / 0.8421 / 0.6737 | 0.8649 / 0.7882 / 0.8247 | 0.9301 / 0.9852 / 0.9569 | 0.8016 / 0.8670 / 0.8330 | 0.8841 | 0.9170 | 0.9002 | 0.9565 |
| 0.0488 | 4.0 | 1392 | 0.2225 | 0.7059 / 0.3103 / 0.4311 | 0.9217 / 0.9838 / 0.9517 | 0.6977 / 0.8824 / 0.7792 | 0.8998 / 0.9016 / 0.9007 | 0.8182 / 0.4737 / 0.6000 | 0.9133 / 0.7783 / 0.8404 | 0.8951 / 0.9818 / 0.9364 | 0.8750 / 0.8412 / 0.8578 | 0.8924 | 0.8938 | 0.8931 | 0.9595 |
| 0.0298 | 5.0 | 1740 | 0.2726 | 0.7619 / 0.4138 / 0.5363 | 0.9136 / 0.9814 / 0.9463 | 0.7111 / 0.9412 / 0.8101 | 0.8687 / 0.9221 / 0.8946 | 0.6296 / 0.8947 / 0.7391 | 0.8190 / 0.8473 / 0.8329 | 0.8857 / 0.9795 / 0.9302 | 0.7248 / 0.9270 / 0.8136 | 0.8524 | 0.9232 | 0.8864 | 0.9535 |
| 0.0179 | 6.0 | 2088 | 0.2373 | 0.7451 / 0.6552 / 0.6972 | 0.9451 / 0.9582 / 0.9516 | 0.7692 / 0.8824 / 0.8219 | 0.8801 / 0.9324 / 0.9055 | 0.5536 / 0.8158 / 0.6596 | 0.9294 / 0.7783 / 0.8472 | 0.9378 / 0.9784 / 0.9576 | 0.7500 / 0.8884 / 0.8134 | 0.8870 | 0.9207 | 0.9035 | 0.9596 |
| 0.0179 | 7.0 | 2436 | 0.2565 | 0.7529 / 0.5517 / 0.6368 | 0.9556 / 0.9490 / 0.9523 | 0.7381 / 0.9118 / 0.8158 | 0.9051 / 0.9180 / 0.9115 | 0.6000 / 0.7895 / 0.6818 | 0.8895 / 0.8325 / 0.8601 | 0.9349 / 0.9806 / 0.9572 | 0.7574 / 0.8841 / 0.8158 | 0.8933 | 0.9162 | 0.9046 | 0.9614 |
| 0.015 | 8.0 | 2784 | 0.2785 | 0.7403 / 0.4914 / 0.5907 | 0.9374 / 0.9722 / 0.9544 | 0.6809 / 0.9412 / 0.7901 | 0.8976 / 0.9160 / 0.9067 | 0.5833 / 0.7368 / 0.6512 | 0.8622 / 0.8325 / 0.8471 | 0.9082 / 0.9806 / 0.9430 | 0.7630 / 0.8841 / 0.8191 | 0.8767 | 0.9166 | 0.8962 | 0.9580 |
| 0.0054 | 9.0 | 3132 | 0.3039 | 0.7975 / 0.5431 / 0.6462 | 0.9518 / 0.9629 / 0.9573 | 0.7442 / 0.9412 / 0.8312 | 0.9134 / 0.9078 / 0.9106 | 0.6200 / 0.8158 / 0.7045 | 0.8643 / 0.8473 / 0.8557 | 0.9160 / 0.9806 / 0.9472 | 0.7857 / 0.8970 / 0.8377 | 0.8911 | 0.9195 | 0.9051 | 0.9597 |
| 0.0054 | 10.0 | 3480 | 0.2991 | 0.7283 / 0.5776 / 0.6442 | 0.9456 / 0.9675 / 0.9564 | 0.7442 / 0.9412 / 0.8312 | 0.9117 / 0.9098 / 0.9108 | 0.6250 / 0.7895 / 0.6977 | 0.8667 / 0.8325 / 0.8492 | 0.9288 / 0.9806 / 0.9540 | 0.7464 / 0.8970 / 0.8148 | 0.8870 | 0.9207 | 0.9035 | 0.9598 |
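Note that the final (epoch-10) checkpoint is not the strongest in the log: epoch 1 has both the lowest validation loss and the highest overall F1, while epoch 7 has the best overall accuracy. If re-training, `load_best_model_at_end` in transformers' `TrainingArguments` can retain the preferred checkpoint automatically; picking it from the logged metrics directly is a one-liner:

```python
# (epoch, validation_loss, overall_f1, overall_accuracy) from the table above.
log = [
    (1, 0.1690, 0.9072, 0.9559),
    (2, 0.1926, 0.8867, 0.9538),
    (3, 0.1996, 0.9002, 0.9565),
    (4, 0.2225, 0.8931, 0.9595),
    (5, 0.2726, 0.8864, 0.9535),
    (6, 0.2373, 0.9035, 0.9596),
    (7, 0.2565, 0.9046, 0.9614),
    (8, 0.2785, 0.8962, 0.9580),
    (9, 0.3039, 0.9051, 0.9597),
    (10, 0.2991, 0.9035, 0.9598),
]

best_by_loss = min(log, key=lambda r: r[1])[0]  # epoch with lowest val. loss
best_by_f1 = max(log, key=lambda r: r[2])[0]    # epoch with highest overall F1
best_by_acc = max(log, key=lambda r: r[3])[0]   # epoch with highest accuracy
print(best_by_loss, best_by_f1, best_by_acc)
# → 1 1 7
```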

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
Model size

  • 0.6B params (Safetensors, F32 tensors)