roberta-large-ner-ghtk-cs-add-4label-11-new-data-3090-24Sep-1

This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3066
  • Overall Precision: 0.8631
  • Overall Recall: 0.9141
  • Overall F1: 0.8879
  • Overall Accuracy: 0.9530

Per-label results:

| Label | Precision | Recall | F1 | Support |
|:--|--:|--:|--:|--:|
| Tk | 0.8857 | 0.5345 | 0.6667 | 116 |
| A | 0.9347 | 0.9629 | 0.9486 | 431 |
| Gày | 0.7442 | 0.9412 | 0.8312 | 34 |
| Gày trừu tượng | 0.9087 | 0.8975 | 0.9031 | 488 |
| Iền | 0.7059 | 0.9231 | 0.8000 | 39 |
| Iờ | 0.6818 | 0.7895 | 0.7317 | 38 |
| Ã đơn | 0.8636 | 0.8424 | 0.8529 | 203 |
| Đt | 0.8948 | 0.9886 | 0.9394 | 878 |
| Đt trừu tượng | 0.7090 | 0.9099 | 0.7970 | 233 |
| Ịa chỉ cụ thể | 0.4800 | 0.5581 | 0.5161 | 43 |
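The per-label scores above are entity-level precision/recall/F1 in the seqeval style. As an illustration only (this is not the evaluation code used for this model), a minimal pure-Python sketch of span-level scoring over BIO-tagged sequences:

```python
def bio_spans(tags):
    """Extract (label, start, end) entity spans from one BIO tag sequence."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        inside = tag.startswith("I-") and tag[2:] == label
        if not inside:
            if label is not None:
                spans.append((label, start, i))
            if tag.startswith(("B-", "I-")):
                start, label = i, tag[2:]
            else:
                start = label = None
    return spans

def prf(gold_seqs, pred_seqs):
    """Micro precision/recall/F1 over exact-match entity spans."""
    gold, pred = set(), set()
    for s, (g, p) in enumerate(zip(gold_seqs, pred_seqs)):
        gold |= {(s,) + span for span in bio_spans(g)}
        pred |= {(s,) + span for span in bio_spans(p)}
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

gold = [["B-PER", "I-PER", "O", "B-LOC"]]
pred = [["B-PER", "I-PER", "O", "O"]]
precision, recall, f1 = prf(gold, pred)  # 1.0, 0.5, 0.667 (the LOC span is missed)
```

Micro-averaging span counts over all labels in this way is what produces the Overall Precision/Recall/F1 figures.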

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2.5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
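With the linear scheduler, and assuming no warmup (none is listed above), the learning rate decays from 2.5e-05 to zero over the run's 3130 optimizer steps (313 steps per epoch × 10 epochs, matching the step counts in the results below). A sketch of that schedule:

```python
def linear_lr(step, base_lr=2.5e-05, total_steps=3130, warmup_steps=0):
    """Linear warmup to base_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the end of each epoch (313 steps per epoch)
schedule = [linear_lr(epoch * 313) for epoch in range(1, 11)]
```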

Training results

Each per-label cell shows precision / recall / F1; support is given in the header.

| Training Loss | Epoch | Step | Validation Loss | Tk (116) | A (431) | Gày (34) | Gày trừu tượng (488) | Iền (39) | Iờ (38) | Ã đơn (203) | Đt (878) | Đt trừu tượng (233) | Ịa chỉ cụ thể (43) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:--|--:|--:|--:|:--|:--|:--|:--|:--|:--|:--|:--|:--|:--|--:|--:|--:|--:|
| No log | 1.0 | 313 | 0.1929 | 0.7273 / 0.7586 / 0.7426 | 0.9565 / 0.9188 / 0.9373 | 0.6111 / 0.9706 / 0.7500 | 0.8986 / 0.8893 / 0.8939 | 0.6596 / 0.7949 / 0.7209 | 0.6818 / 0.7895 / 0.7317 | 0.7904 / 0.8916 / 0.8380 | 0.9656 / 0.9601 / 0.9629 | 0.8276 / 0.8240 / 0.8258 | 0.4098 / 0.5814 / 0.4808 | 0.8808 | 0.9001 | 0.8903 | 0.9521 |
| 0.1529 | 2.0 | 626 | 0.1819 | 0.8269 / 0.7414 / 0.7818 | 0.9401 / 0.9466 / 0.9434 | 0.7333 / 0.9706 / 0.8354 | 0.8591 / 0.9119 / 0.8847 | 0.7115 / 0.9487 / 0.8132 | 0.6140 / 0.9211 / 0.7368 | 0.8865 / 0.8079 / 0.8454 | 0.9470 / 0.9567 / 0.9518 | 0.6796 / 0.9013 / 0.7749 | 0.4737 / 0.6279 / 0.5400 | 0.8629 | 0.9129 | 0.8872 | 0.9516 |
| 0.1529 | 3.0 | 939 | 0.2060 | 0.8072 / 0.5776 / 0.6734 | 0.9172 / 0.9768 / 0.9461 | 0.7442 / 0.9412 / 0.8312 | 0.9025 / 0.8914 / 0.8969 | 0.7500 / 0.9231 / 0.8276 | 0.6364 / 0.9211 / 0.7527 | 0.8689 / 0.7833 / 0.8238 | 0.9016 / 0.9909 / 0.9441 | 0.7833 / 0.8069 / 0.7949 | 0.4419 / 0.4419 / 0.4419 | 0.8697 | 0.9037 | 0.8864 | 0.9527 |
| 0.0648 | 4.0 | 1252 | 0.2063 | 0.8261 / 0.6552 / 0.7308 | 0.9306 / 0.9652 / 0.9476 | 0.6591 / 0.8529 / 0.7436 | 0.8963 / 0.9037 / 0.9000 | 0.7708 / 0.9487 / 0.8506 | 0.6604 / 0.9211 / 0.7692 | 0.8895 / 0.7931 / 0.8385 | 0.9073 / 0.9920 / 0.9478 | 0.7010 / 0.9056 / 0.7903 | 0.4490 / 0.5116 / 0.4783 | 0.8620 | 0.9185 | 0.8894 | 0.9517 |
| 0.0447 | 5.0 | 1565 | 0.2427 | 0.8800 / 0.5690 / 0.6911 | 0.9136 / 0.9814 / 0.9463 | 0.7381 / 0.9118 / 0.8158 | 0.8882 / 0.8955 / 0.8918 | 0.7400 / 0.9487 / 0.8315 | 0.6667 / 0.9474 / 0.7826 | 0.8710 / 0.7980 / 0.8329 | 0.9119 / 0.9784 / 0.9440 | 0.7420 / 0.9013 / 0.8140 | 0.3788 / 0.5814 / 0.4587 | 0.8617 | 0.9133 | 0.8867 | 0.9497 |
| 0.0447 | 6.0 | 1878 | 0.2504 | 0.8415 / 0.5948 / 0.6970 | 0.9249 / 0.9722 / 0.9480 | 0.6809 / 0.9412 / 0.7901 | 0.9080 / 0.9098 / 0.9089 | 0.7660 / 0.9231 / 0.8372 | 0.6182 / 0.8947 / 0.7312 | 0.8173 / 0.8374 / 0.8273 | 0.9287 / 0.9795 / 0.9534 | 0.7243 / 0.8455 / 0.7802 | 0.4490 / 0.5116 / 0.4783 | 0.8687 | 0.9121 | 0.8899 | 0.9516 |
| 0.0236 | 7.0 | 2191 | 0.2783 | 0.8919 / 0.5690 / 0.6947 | 0.9339 / 0.9513 / 0.9425 | 0.6889 / 0.9118 / 0.7848 | 0.8836 / 0.9180 / 0.9005 | 0.6792 / 0.9231 / 0.7826 | 0.6346 / 0.8684 / 0.7333 | 0.8513 / 0.8177 / 0.8342 | 0.8913 / 0.9897 / 0.9379 | 0.7251 / 0.9056 / 0.8053 | 0.4630 / 0.5814 / 0.5155 | 0.8547 | 0.9169 | 0.8847 | 0.9500 |
| 0.0139 | 8.0 | 2504 | 0.2999 | 0.9091 / 0.4310 / 0.5848 | 0.9404 / 0.9513 / 0.9458 | 0.6739 / 0.9118 / 0.7750 | 0.9074 / 0.9037 / 0.9055 | 0.7143 / 0.8974 / 0.7955 | 0.6383 / 0.7895 / 0.7059 | 0.8804 / 0.7980 / 0.8372 | 0.9007 / 0.9715 / 0.9348 | 0.6865 / 0.8927 / 0.7761 | 0.4630 / 0.5814 / 0.5155 | 0.8611 | 0.8969 | 0.8787 | 0.9495 |
| 0.0139 | 9.0 | 2817 | 0.2964 | 0.9231 / 0.5172 / 0.6630 | 0.9326 / 0.9629 / 0.9475 | 0.7442 / 0.9412 / 0.8312 | 0.9033 / 0.8996 / 0.9014 | 0.7143 / 0.8974 / 0.7955 | 0.6667 / 0.7895 / 0.7229 | 0.8711 / 0.8325 / 0.8514 | 0.8964 / 0.9852 / 0.9387 | 0.7128 / 0.8841 / 0.7893 | 0.5227 / 0.5349 / 0.5287 | 0.8663 | 0.9085 | 0.8869 | 0.9528 |
| 0.0078 | 10.0 | 3130 | 0.3066 | 0.8857 / 0.5345 / 0.6667 | 0.9347 / 0.9629 / 0.9486 | 0.7442 / 0.9412 / 0.8312 | 0.9087 / 0.8975 / 0.9031 | 0.7059 / 0.9231 / 0.8000 | 0.6818 / 0.7895 / 0.7317 | 0.8636 / 0.8424 / 0.8529 | 0.8948 / 0.9886 / 0.9394 | 0.7090 / 0.9099 / 0.7970 | 0.4800 / 0.5581 / 0.5161 | 0.8631 | 0.9141 | 0.8879 | 0.9530 |
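Overall F1 peaks at epoch 1 (0.8903) while validation loss rises steadily afterwards, so keeping the best checkpoint rather than the final one may be worthwhile. A small sketch that picks the best epoch from the Overall F1 column above:

```python
# Overall F1 per epoch, copied from the training-results table above
overall_f1 = [0.8903, 0.8872, 0.8864, 0.8894, 0.8867,
              0.8899, 0.8847, 0.8787, 0.8869, 0.8879]

# Epoch (1-indexed) whose checkpoint has the highest overall F1
best_epoch = max(range(1, len(overall_f1) + 1),
                 key=lambda e: overall_f1[e - 1])
```

With the transformers Trainer, the equivalent is setting `load_best_model_at_end=True` together with `metric_for_best_model` in `TrainingArguments`.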

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
Model details

  • Model size: 0.6B params (Safetensors)
  • Tensor type: F32