Implementation3 / metrics_BERT_results.md

# Evaluation Results

## Test-1

```
              precision    recall  f1-score   support

           0       0.48      0.45      0.47       165
           1       0.08      0.10      0.09        58
           2       0.70      0.68      0.69       430

    accuracy                           0.57       653
   macro avg       0.42      0.41      0.42       653
weighted avg       0.59      0.57      0.58       653
```
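For reference, the `macro avg` and `weighted avg` rows can be recomputed from the per-class rows: the macro average is the unweighted mean across classes, while the weighted average weights each class by its support. A minimal sketch, using the Test-1 f1-scores and supports above (the variable names are illustrative, not from the original pipeline):

```python
# Per-class f1-scores and supports, copied from the Test-1 report above.
f1 = [0.47, 0.09, 0.69]
support = [165, 58, 430]

# Macro average: plain mean over classes, ignoring class sizes.
macro_f1 = sum(f1) / len(f1)

# Weighted average: each class's f1 weighted by its support.
weighted_f1 = sum(f * s for f, s in zip(f1, support)) / sum(support)

print(round(macro_f1, 2))     # 0.42
print(round(weighted_f1, 2))  # 0.58
```

Both values match the corresponding rows of the Test-1 report, and the same arithmetic applies to the Test-2 and Test-3 reports.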

## Test-2

```
              precision    recall  f1-score   support

           0       0.67      0.57      0.62       216
           1       0.84      0.50      0.63       431
           2       0.24      0.77      0.37        94

    accuracy                           0.56       741
   macro avg       0.58      0.61      0.54       741
weighted avg       0.72      0.56      0.59       741
```

## Test-3

```
              precision    recall  f1-score   support

           0       0.87      0.74      0.80       267
           1       0.81      0.93      0.87       263
           2       0.79      0.79      0.79       263

    accuracy                           0.82       793
   macro avg       0.82      0.82      0.82       793
weighted avg       0.82      0.82      0.82       793
```