
xlm-roberta-toxicity-classification-ptp-all-languages-7

This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

  • Loss: 0.4031
  • QWK (quadratic weighted kappa): 0.9304
  • MAE (mean absolute error): 0.1656
  • Accuracy: 0.8378
  • Arabic QWK: 0.9486
  • Arabic MAE: 0.128
  • Arabic Accuracy: 0.872
  • Chinese QWK: 0.9065
  • Chinese MAE: 0.224
  • Chinese Accuracy: 0.778
  • Czech QWK: 0.9538
  • Czech MAE: 0.106
  • Czech Accuracy: 0.896
  • Dutch QWK: 0.9489
  • Dutch MAE: 0.122
  • Dutch Accuracy: 0.88
  • English QWK: 0.9466
  • English MAE: 0.13
  • English Accuracy: 0.872
  • French QWK: 0.9502
  • French MAE: 0.118
  • French Accuracy: 0.884
  • German QWK: 0.8524
  • German MAE: 0.334
  • German Accuracy: 0.684
  • Hindi QWK: 0.9335
  • Hindi MAE: 0.156
  • Hindi Accuracy: 0.846
  • Indonesian QWK: 0.9476
  • Indonesian MAE: 0.13
  • Indonesian Accuracy: 0.87
  • Italian QWK: 0.9673
  • Italian MAE: 0.078
  • Italian Accuracy: 0.922
  • Japanese QWK: 0.9166
  • Japanese MAE: 0.202
  • Japanese Accuracy: 0.8
  • Korean QWK: 0.9447
  • Korean MAE: 0.134
  • Korean Accuracy: 0.866
  • Polish QWK: 0.9473
  • Polish MAE: 0.128
  • Polish Accuracy: 0.872
  • Portuguese QWK: 0.8567
  • Portuguese MAE: 0.324
  • Portuguese Accuracy: 0.696
  • Russian QWK: 0.8930
  • Russian MAE: 0.262
  • Russian Accuracy: 0.742
  • Spanish QWK: 0.9375
  • Spanish MAE: 0.152
  • Spanish Accuracy: 0.85
  • Swedish QWK: 0.9452
  • Swedish MAE: 0.13
  • Swedish Accuracy: 0.872
  • Sanity Check QWK: nan
  • Sanity Check MAE: 0.0
  • Sanity Check Accuracy: 1.0
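
The card ships no example code, so here is a minimal, hedged inference sketch. The repo id is a placeholder for wherever this checkpoint is hosted, and the integer toxicity scale is an assumption inferred from the ordinal metrics (QWK/MAE) reported above.

```python
# Minimal inference sketch (assumptions: placeholder repo id, integer toxicity levels).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "<org>/xlm-roberta-toxicity-classification-ptp-all-languages-7"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("An example comment to score.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

predicted_level = logits.argmax(dim=-1).item()  # index of the highest-scoring toxicity level
print(predicted_level)
```

If the repository is gated, `from_pretrained` may also need an access token (the `token` argument) once the access conditions have been accepted.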

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 1
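
For readers who want to reproduce this setup, the following is a minimal sketch of the corresponding transformers TrainingArguments. The output directory name is an assumption, and the model, dataset, and metric code are omitted because the card does not specify them.

```python
# A sketch of TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-toxicity-classification-ptp-all-languages-7",  # assumed name
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",        # AdamW; betas=(0.9, 0.999) and epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```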

Training results

| Training Loss | Epoch | Step | Arabic Accuracy | Arabic MAE | Arabic QWK | Chinese Accuracy | Chinese MAE | Chinese QWK | Czech Accuracy | Czech MAE | Czech QWK | Dutch Accuracy | Dutch MAE | Dutch QWK | English Accuracy | English MAE | English QWK | French Accuracy | French MAE | French QWK | German Accuracy | German MAE | German QWK | Hindi Accuracy | Hindi MAE | Hindi QWK | Indonesian Accuracy | Indonesian MAE | Indonesian QWK | Italian Accuracy | Italian MAE | Italian QWK | Japanese Accuracy | Japanese MAE | Japanese QWK | Korean Accuracy | Korean MAE | Korean QWK | Polish Accuracy | Polish MAE | Polish QWK | Portuguese Accuracy | Portuguese MAE | Portuguese QWK | Russian Accuracy | Russian MAE | Russian QWK | Spanish Accuracy | Spanish MAE | Spanish QWK | Swedish Accuracy | Swedish MAE | Swedish QWK | Accuracy | Validation Loss | MAE | QWK | Sanity Check Accuracy | Sanity Check MAE | Sanity Check QWK |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.6439 | 0.0268 | 500 | 0.68 | 0.332 | 0.8688 | 0.672 | 0.364 | 0.8316 | 0.756 | 0.266 | 0.8816 | 0.736 | 0.266 | 0.9029 | 0.758 | 0.246 | 0.8965 | 0.726 | 0.274 | 0.9033 | 0.572 | 0.464 | 0.7697 | 0.728 | 0.288 | 0.8731 | 0.746 | 0.258 | 0.8983 | 0.742 | 0.268 | 0.8843 | 0.65 | 0.366 | 0.8254 | 0.742 | 0.266 | 0.8900 | 0.75 | 0.268 | 0.8854 | 0.59 | 0.46 | 0.7764 | 0.624 | 0.418 | 0.8133 | 0.714 | 0.31 | 0.8587 | 0.784 | 0.226 | 0.9006 | 0.7079 | 0.6977 | 0.3101 | 0.8660 | 0.9606 | 0.0394 | 0.0 |
| 0.7334 | 0.0536 | 1000 | 0.758 | 0.244 | 0.9048 | 0.706 | 0.302 | 0.8598 | 0.792 | 0.212 | 0.9045 | 0.774 | 0.228 | 0.9160 | 0.802 | 0.202 | 0.9196 | 0.782 | 0.222 | 0.9155 | 0.622 | 0.42 | 0.8002 | 0.764 | 0.246 | 0.8908 | 0.77 | 0.23 | 0.9114 | 0.808 | 0.196 | 0.9168 | 0.704 | 0.308 | 0.8621 | 0.776 | 0.226 | 0.9068 | 0.786 | 0.224 | 0.9053 | 0.636 | 0.406 | 0.7976 | 0.658 | 0.374 | 0.8189 | 0.78 | 0.236 | 0.8936 | 0.804 | 0.2 | 0.9170 | 0.7506 | 0.5961 | 0.2609 | 0.8873 | 0.8976 | 0.1024 | 0.0 |
| 0.603 | 0.0804 | 1500 | 0.788 | 0.218 | 0.8993 | 0.7 | 0.31 | 0.8535 | 0.758 | 0.246 | 0.8777 | 0.81 | 0.194 | 0.9139 | 0.826 | 0.178 | 0.9218 | 0.822 | 0.178 | 0.9194 | 0.596 | 0.438 | 0.7888 | 0.766 | 0.24 | 0.8893 | 0.796 | 0.204 | 0.9140 | 0.814 | 0.188 | 0.9190 | 0.67 | 0.358 | 0.8036 | 0.754 | 0.248 | 0.8914 | 0.768 | 0.236 | 0.8918 | 0.562 | 0.484 | 0.7532 | 0.664 | 0.364 | 0.8140 | 0.748 | 0.264 | 0.8693 | 0.806 | 0.2 | 0.9105 | 0.7475 | 0.5850 | 0.2638 | 0.8752 | 0.9843 | 0.0157 | 0.0 |
| 0.5844 | 0.1072 | 2000 | 0.784 | 0.216 | 0.9134 | 0.692 | 0.314 | 0.8568 | 0.794 | 0.208 | 0.9004 | 0.82 | 0.186 | 0.9143 | 0.808 | 0.194 | 0.9198 | 0.844 | 0.156 | 0.9342 | 0.676 | 0.36 | 0.8291 | 0.794 | 0.214 | 0.8992 | 0.824 | 0.176 | 0.9263 | 0.864 | 0.138 | 0.9409 | 0.724 | 0.284 | 0.8736 | 0.838 | 0.164 | 0.9297 | 0.788 | 0.212 | 0.9082 | 0.66 | 0.36 | 0.8224 | 0.662 | 0.348 | 0.8476 | 0.758 | 0.246 | 0.8840 | 0.854 | 0.148 | 0.9372 | 0.7785 | 0.5209 | 0.2278 | 0.8988 | 0.9764 | 0.0236 | 0.0 |
| 0.6695 | 0.1340 | 2500 | 0.772 | 0.23 | 0.9084 | 0.704 | 0.306 | 0.8658 | 0.83 | 0.172 | 0.9257 | 0.824 | 0.178 | 0.9303 | 0.824 | 0.178 | 0.9271 | 0.834 | 0.166 | 0.9290 | 0.656 | 0.37 | 0.8270 | 0.772 | 0.234 | 0.8929 | 0.84 | 0.16 | 0.9350 | 0.872 | 0.13 | 0.9453 | 0.694 | 0.316 | 0.8661 | 0.83 | 0.172 | 0.9281 | 0.822 | 0.18 | 0.9260 | 0.682 | 0.334 | 0.8499 | 0.68 | 0.334 | 0.8444 | 0.796 | 0.21 | 0.9079 | 0.86 | 0.144 | 0.9381 | 0.7849 | 0.5210 | 0.2213 | 0.9051 | 0.9843 | 0.0157 | 0.0 |
| 0.7138 | 0.1609 | 3000 | 0.76 | 0.242 | 0.8975 | 0.69 | 0.318 | 0.8476 | 0.808 | 0.194 | 0.9140 | 0.812 | 0.19 | 0.9226 | 0.76 | 0.242 | 0.9012 | 0.834 | 0.166 | 0.9302 | 0.608 | 0.428 | 0.8081 | 0.768 | 0.238 | 0.8901 | 0.832 | 0.168 | 0.9283 | 0.834 | 0.166 | 0.9306 | 0.672 | 0.336 | 0.8425 | 0.788 | 0.216 | 0.9095 | 0.808 | 0.194 | 0.9174 | 0.564 | 0.494 | 0.7738 | 0.632 | 0.374 | 0.8189 | 0.738 | 0.268 | 0.8810 | 0.812 | 0.192 | 0.9158 | 0.7515 | 0.5684 | 0.2571 | 0.8861 | 0.9685 | 0.0394 | 0.0 |
| 0.5483 | 0.1877 | 3500 | 0.808 | 0.194 | 0.9207 | 0.736 | 0.266 | 0.8886 | 0.848 | 0.154 | 0.9339 | 0.838 | 0.168 | 0.9260 | 0.844 | 0.158 | 0.9351 | 0.854 | 0.148 | 0.9371 | 0.624 | 0.41 | 0.8220 | 0.796 | 0.208 | 0.9071 | 0.848 | 0.152 | 0.9382 | 0.896 | 0.106 | 0.9545 | 0.766 | 0.244 | 0.8885 | 0.816 | 0.186 | 0.9225 | 0.83 | 0.174 | 0.9268 | 0.668 | 0.358 | 0.8396 | 0.7 | 0.308 | 0.8649 | 0.806 | 0.198 | 0.9160 | 0.858 | 0.144 | 0.9397 | 0.7992 | 0.4820 | 0.2073 | 0.9109 | 1.0 | 0.0 | nan |
| 0.5865 | 0.2145 | 4000 | 0.804 | 0.196 | 0.9186 | 0.74 | 0.264 | 0.8873 | 0.842 | 0.16 | 0.9295 | 0.824 | 0.182 | 0.9231 | 0.806 | 0.194 | 0.9222 | 0.846 | 0.154 | 0.9354 | 0.654 | 0.372 | 0.8351 | 0.798 | 0.206 | 0.9071 | 0.824 | 0.178 | 0.9262 | 0.84 | 0.16 | 0.9336 | 0.74 | 0.268 | 0.8869 | 0.818 | 0.184 | 0.9223 | 0.832 | 0.17 | 0.9297 | 0.614 | 0.42 | 0.8134 | 0.674 | 0.332 | 0.8617 | 0.76 | 0.246 | 0.8996 | 0.86 | 0.142 | 0.9400 | 0.7838 | 0.5047 | 0.2223 | 0.9055 | 0.9764 | 0.0315 | 0.0 |
| 0.5605 | 0.2413 | 4500 | 0.8 | 0.2 | 0.9207 | 0.712 | 0.298 | 0.8765 | 0.816 | 0.186 | 0.9211 | 0.836 | 0.17 | 0.9295 | 0.806 | 0.194 | 0.9264 | 0.852 | 0.148 | 0.9377 | 0.664 | 0.36 | 0.8330 | 0.748 | 0.256 | 0.8967 | 0.84 | 0.164 | 0.9317 | 0.898 | 0.102 | 0.9580 | 0.67 | 0.342 | 0.8681 | 0.81 | 0.19 | 0.9243 | 0.812 | 0.19 | 0.9235 | 0.692 | 0.332 | 0.8490 | 0.672 | 0.342 | 0.8585 | 0.794 | 0.212 | 0.9127 | 0.84 | 0.164 | 0.9308 | 0.7834 | 0.5189 | 0.2231 | 0.9074 | 1.0 | 0.0 | nan |
| 0.4937 | 0.2681 | 5000 | 0.82 | 0.18 | 0.9302 | 0.766 | 0.238 | 0.8976 | 0.858 | 0.144 | 0.9369 | 0.808 | 0.198 | 0.9212 | 0.844 | 0.156 | 0.9389 | 0.85 | 0.15 | 0.9405 | 0.662 | 0.362 | 0.8278 | 0.824 | 0.178 | 0.9274 | 0.83 | 0.17 | 0.9327 | 0.872 | 0.128 | 0.9475 | 0.774 | 0.236 | 0.8992 | 0.824 | 0.176 | 0.9287 | 0.816 | 0.186 | 0.9253 | 0.668 | 0.36 | 0.8339 | 0.686 | 0.324 | 0.8640 | 0.798 | 0.208 | 0.9077 | 0.864 | 0.138 | 0.9423 | 0.8006 | 0.4932 | 0.2049 | 0.9139 | 0.9843 | 0.0157 | 0.0 |
| 0.5286 | 0.2949 | 5500 | 0.826 | 0.174 | 0.9251 | 0.704 | 0.3 | 0.8662 | 0.86 | 0.142 | 0.9374 | 0.848 | 0.152 | 0.9348 | 0.87 | 0.13 | 0.9449 | 0.852 | 0.148 | 0.9347 | 0.64 | 0.372 | 0.8239 | 0.812 | 0.19 | 0.9150 | 0.85 | 0.15 | 0.9379 | 0.896 | 0.104 | 0.9558 | 0.758 | 0.248 | 0.8894 | 0.842 | 0.158 | 0.9340 | 0.812 | 0.19 | 0.9204 | 0.718 | 0.304 | 0.8506 | 0.688 | 0.316 | 0.8579 | 0.81 | 0.196 | 0.9045 | 0.872 | 0.13 | 0.9444 | 0.8058 | 0.4726 | 0.1978 | 0.9128 | 0.9685 | 0.0315 | 0.0 |
| 0.4297 | 0.3217 | 6000 | 0.828 | 0.172 | 0.9321 | 0.728 | 0.278 | 0.8788 | 0.836 | 0.164 | 0.9302 | 0.846 | 0.16 | 0.9321 | 0.844 | 0.16 | 0.9298 | 0.86 | 0.142 | 0.9387 | 0.638 | 0.38 | 0.8372 | 0.814 | 0.188 | 0.9190 | 0.848 | 0.152 | 0.9383 | 0.878 | 0.122 | 0.9494 | 0.732 | 0.278 | 0.8760 | 0.8 | 0.2 | 0.9173 | 0.812 | 0.19 | 0.9222 | 0.636 | 0.388 | 0.8279 | 0.686 | 0.326 | 0.8561 | 0.78 | 0.228 | 0.8968 | 0.842 | 0.16 | 0.9327 | 0.7918 | 0.5110 | 0.2137 | 0.9085 | 1.0 | 0.0 | nan |
| 0.4847 | 0.3485 | 6500 | 0.824 | 0.176 | 0.9277 | 0.746 | 0.26 | 0.8891 | 0.854 | 0.148 | 0.9358 | 0.85 | 0.152 | 0.9358 | 0.854 | 0.146 | 0.9414 | 0.85 | 0.15 | 0.9389 | 0.688 | 0.332 | 0.8418 | 0.834 | 0.168 | 0.9275 | 0.84 | 0.16 | 0.9350 | 0.898 | 0.104 | 0.9550 | 0.772 | 0.236 | 0.8996 | 0.846 | 0.156 | 0.9345 | 0.842 | 0.16 | 0.9337 | 0.712 | 0.312 | 0.8490 | 0.702 | 0.304 | 0.8701 | 0.816 | 0.19 | 0.9171 | 0.866 | 0.136 | 0.9430 | 0.8142 | 0.4566 | 0.1907 | 0.9182 | 1.0 | 0.0 | nan |
| 0.4536 | 0.3753 | 7000 | 0.812 | 0.188 | 0.9268 | 0.758 | 0.244 | 0.8986 | 0.878 | 0.126 | 0.9450 | 0.822 | 0.182 | 0.9247 | 0.852 | 0.148 | 0.9412 | 0.87 | 0.13 | 0.9478 | 0.672 | 0.35 | 0.8434 | 0.804 | 0.198 | 0.9183 | 0.862 | 0.138 | 0.9447 | 0.904 | 0.096 | 0.9598 | 0.748 | 0.26 | 0.8869 | 0.84 | 0.16 | 0.9338 | 0.812 | 0.192 | 0.9217 | 0.668 | 0.356 | 0.8396 | 0.712 | 0.292 | 0.8794 | 0.798 | 0.208 | 0.9061 | 0.86 | 0.142 | 0.9407 | 0.8070 | 0.4731 | 0.1978 | 0.9169 | 0.9921 | 0.0079 | 0.0 |
| 0.454 | 0.4021 | 7500 | 0.808 | 0.192 | 0.9256 | 0.762 | 0.242 | 0.8962 | 0.87 | 0.132 | 0.9413 | 0.852 | 0.15 | 0.9372 | 0.826 | 0.174 | 0.9330 | 0.86 | 0.14 | 0.9435 | 0.66 | 0.362 | 0.8428 | 0.826 | 0.178 | 0.9219 | 0.84 | 0.162 | 0.9331 | 0.902 | 0.1 | 0.9564 | 0.772 | 0.23 | 0.8967 | 0.848 | 0.154 | 0.9349 | 0.85 | 0.152 | 0.9363 | 0.658 | 0.372 | 0.8377 | 0.694 | 0.314 | 0.8612 | 0.822 | 0.182 | 0.9240 | 0.85 | 0.154 | 0.9336 | 0.8086 | 0.4703 | 0.1966 | 0.9164 | 0.9921 | 0.0079 | 0.0 |
| 0.4405 | 0.4290 | 8000 | 0.838 | 0.162 | 0.9332 | 0.74 | 0.264 | 0.8848 | 0.864 | 0.138 | 0.9389 | 0.86 | 0.142 | 0.9404 | 0.862 | 0.14 | 0.9416 | 0.88 | 0.12 | 0.9483 | 0.642 | 0.38 | 0.8329 | 0.806 | 0.196 | 0.9123 | 0.868 | 0.132 | 0.9461 | 0.904 | 0.096 | 0.9590 | 0.776 | 0.23 | 0.8960 | 0.846 | 0.154 | 0.9353 | 0.854 | 0.146 | 0.9378 | 0.67 | 0.358 | 0.8404 | 0.722 | 0.284 | 0.8747 | 0.824 | 0.18 | 0.9184 | 0.866 | 0.136 | 0.9423 | 0.8158 | 0.4390 | 0.1888 | 0.9179 | 1.0 | 0.0 | nan |
| 0.5102 | 0.4558 | 8500 | 0.844 | 0.156 | 0.9365 | 0.748 | 0.256 | 0.8908 | 0.892 | 0.11 | 0.9513 | 0.87 | 0.13 | 0.9452 | 0.864 | 0.136 | 0.9449 | 0.872 | 0.13 | 0.9423 | 0.674 | 0.344 | 0.8435 | 0.816 | 0.186 | 0.9210 | 0.858 | 0.142 | 0.9414 | 0.888 | 0.112 | 0.9529 | 0.788 | 0.216 | 0.9051 | 0.844 | 0.156 | 0.9344 | 0.852 | 0.15 | 0.9375 | 0.672 | 0.35 | 0.8416 | 0.706 | 0.302 | 0.8704 | 0.85 | 0.152 | 0.9354 | 0.882 | 0.12 | 0.9486 | 0.8215 | 0.4266 | 0.1825 | 0.9217 | 1.0 | 0.0 | nan |
| 0.4028 | 0.4826 | 9000 | 0.848 | 0.152 | 0.9377 | 0.778 | 0.224 | 0.9030 | 0.882 | 0.12 | 0.9467 | 0.872 | 0.13 | 0.9434 | 0.86 | 0.14 | 0.9435 | 0.884 | 0.116 | 0.9504 | 0.666 | 0.35 | 0.8430 | 0.836 | 0.166 | 0.9258 | 0.868 | 0.134 | 0.9438 | 0.894 | 0.108 | 0.9527 | 0.766 | 0.238 | 0.8959 | 0.856 | 0.144 | 0.9397 | 0.856 | 0.144 | 0.9407 | 0.634 | 0.4 | 0.8245 | 0.702 | 0.302 | 0.8660 | 0.838 | 0.168 | 0.9235 | 0.87 | 0.134 | 0.9413 | 0.8209 | 0.4372 | 0.1837 | 0.9201 | 1.0 | 0.0 | nan |
| 0.4185 | 0.5094 | 9500 | 0.838 | 0.162 | 0.9332 | 0.778 | 0.224 | 0.9030 | 0.892 | 0.11 | 0.9522 | 0.874 | 0.13 | 0.9428 | 0.848 | 0.154 | 0.9349 | 0.88 | 0.12 | 0.9503 | 0.644 | 0.382 | 0.8325 | 0.84 | 0.162 | 0.9300 | 0.862 | 0.138 | 0.9427 | 0.892 | 0.108 | 0.9549 | 0.786 | 0.216 | 0.9058 | 0.854 | 0.146 | 0.9397 | 0.824 | 0.178 | 0.9256 | 0.66 | 0.37 | 0.8325 | 0.71 | 0.296 | 0.8761 | 0.832 | 0.172 | 0.9267 | 0.882 | 0.118 | 0.9509 | 0.8200 | 0.4321 | 0.1848 | 0.9206 | 0.9921 | 0.0079 | 0.0 |
| 0.5041 | 0.5362 | 10000 | 0.842 | 0.158 | 0.9321 | 0.758 | 0.244 | 0.8954 | 0.87 | 0.134 | 0.9409 | 0.868 | 0.134 | 0.9434 | 0.866 | 0.134 | 0.9460 | 0.882 | 0.12 | 0.9456 | 0.666 | 0.36 | 0.8413 | 0.838 | 0.164 | 0.9275 | 0.86 | 0.14 | 0.9403 | 0.896 | 0.106 | 0.9537 | 0.798 | 0.206 | 0.9063 | 0.844 | 0.156 | 0.9345 | 0.846 | 0.154 | 0.9366 | 0.7 | 0.318 | 0.8535 | 0.716 | 0.29 | 0.8743 | 0.84 | 0.164 | 0.9259 | 0.876 | 0.126 | 0.9462 | 0.8242 | 0.4300 | 0.1801 | 0.9215 | 1.0 | 0.0 | nan |
| 0.4305 | 0.5630 | 10500 | 0.848 | 0.152 | 0.9386 | 0.778 | 0.224 | 0.9047 | 0.898 | 0.104 | 0.9542 | 0.864 | 0.14 | 0.9411 | 0.866 | 0.136 | 0.9444 | 0.894 | 0.106 | 0.9556 | 0.668 | 0.354 | 0.8446 | 0.842 | 0.16 | 0.9321 | 0.864 | 0.136 | 0.9446 | 0.908 | 0.092 | 0.9615 | 0.794 | 0.212 | 0.9094 | 0.854 | 0.146 | 0.9396 | 0.842 | 0.158 | 0.9363 | 0.688 | 0.328 | 0.8584 | 0.714 | 0.29 | 0.8810 | 0.846 | 0.158 | 0.9324 | 0.87 | 0.13 | 0.9466 | 0.8283 | 0.4182 | 0.1754 | 0.9262 | 1.0 | 0.0 | nan |
| 0.3719 | 0.5898 | 11000 | 0.862 | 0.138 | 0.9446 | 0.756 | 0.248 | 0.8940 | 0.87 | 0.134 | 0.9412 | 0.874 | 0.128 | 0.9463 | 0.85 | 0.152 | 0.9394 | 0.88 | 0.12 | 0.9510 | 0.672 | 0.35 | 0.8453 | 0.844 | 0.158 | 0.9334 | 0.858 | 0.142 | 0.9430 | 0.902 | 0.098 | 0.9592 | 0.798 | 0.206 | 0.9122 | 0.86 | 0.14 | 0.9421 | 0.844 | 0.156 | 0.9373 | 0.688 | 0.332 | 0.8538 | 0.712 | 0.294 | 0.8766 | 0.848 | 0.154 | 0.9367 | 0.868 | 0.132 | 0.9459 | 0.8253 | 0.4201 | 0.1786 | 0.9251 | 1.0 | 0.0 | nan |
| 0.4213 | 0.6166 | 11500 | 0.844 | 0.156 | 0.9395 | 0.766 | 0.236 | 0.9006 | 0.892 | 0.112 | 0.9499 | 0.86 | 0.144 | 0.9406 | 0.86 | 0.14 | 0.9450 | 0.85 | 0.15 | 0.9408 | 0.67 | 0.35 | 0.8466 | 0.848 | 0.154 | 0.9347 | 0.864 | 0.136 | 0.9452 | 0.906 | 0.094 | 0.9608 | 0.742 | 0.264 | 0.8926 | 0.856 | 0.144 | 0.9407 | 0.832 | 0.168 | 0.9322 | 0.692 | 0.324 | 0.8588 | 0.724 | 0.282 | 0.8848 | 0.832 | 0.172 | 0.9290 | 0.868 | 0.134 | 0.9439 | 0.8207 | 0.4344 | 0.1831 | 0.9241 | 1.0 | 0.0 | nan |
| 0.5293 | 0.6434 | 12000 | 0.856 | 0.144 | 0.9430 | 0.768 | 0.234 | 0.8988 | 0.896 | 0.106 | 0.9535 | 0.854 | 0.148 | 0.9395 | 0.844 | 0.158 | 0.9366 | 0.85 | 0.15 | 0.9402 | 0.686 | 0.33 | 0.8494 | 0.832 | 0.17 | 0.9270 | 0.85 | 0.15 | 0.9398 | 0.906 | 0.094 | 0.9606 | 0.78 | 0.222 | 0.9053 | 0.86 | 0.14 | 0.9422 | 0.85 | 0.15 | 0.9377 | 0.688 | 0.334 | 0.8481 | 0.718 | 0.286 | 0.8809 | 0.834 | 0.168 | 0.9309 | 0.868 | 0.132 | 0.9459 | 0.8226 | 0.4226 | 0.1806 | 0.9242 | 1.0 | 0.0 | nan |
| 0.3045 | 0.6702 | 12500 | 0.868 | 0.132 | 0.9466 | 0.762 | 0.24 | 0.8981 | 0.884 | 0.118 | 0.9492 | 0.878 | 0.124 | 0.9482 | 0.866 | 0.136 | 0.9443 | 0.888 | 0.112 | 0.9533 | 0.69 | 0.328 | 0.8513 | 0.854 | 0.148 | 0.9372 | 0.864 | 0.136 | 0.9440 | 0.916 | 0.084 | 0.9649 | 0.772 | 0.232 | 0.9028 | 0.866 | 0.134 | 0.9448 | 0.852 | 0.148 | 0.9396 | 0.712 | 0.304 | 0.8615 | 0.732 | 0.274 | 0.8845 | 0.852 | 0.15 | 0.9380 | 0.868 | 0.134 | 0.9433 | 0.8333 | 0.4033 | 0.1700 | 0.9282 | 1.0 | 0.0 | nan |
| 0.3547 | 0.6971 | 13000 | 0.86 | 0.14 | 0.9445 | 0.778 | 0.224 | 0.9044 | 0.886 | 0.116 | 0.9499 | 0.876 | 0.126 | 0.9476 | 0.862 | 0.14 | 0.9432 | 0.872 | 0.13 | 0.9456 | 0.68 | 0.338 | 0.8525 | 0.846 | 0.156 | 0.9325 | 0.864 | 0.136 | 0.9450 | 0.912 | 0.088 | 0.9632 | 0.794 | 0.21 | 0.9110 | 0.862 | 0.138 | 0.9436 | 0.854 | 0.146 | 0.9398 | 0.668 | 0.354 | 0.8482 | 0.734 | 0.27 | 0.8863 | 0.852 | 0.15 | 0.9381 | 0.876 | 0.126 | 0.9469 | 0.8305 | 0.4236 | 0.1732 | 0.9273 | 1.0 | 0.0 | nan |
| 0.4719 | 0.7239 | 13500 | 0.856 | 0.144 | 0.9430 | 0.762 | 0.24 | 0.8988 | 0.884 | 0.12 | 0.9468 | 0.872 | 0.13 | 0.9462 | 0.844 | 0.158 | 0.9363 | 0.878 | 0.122 | 0.9501 | 0.676 | 0.342 | 0.8480 | 0.842 | 0.16 | 0.9309 | 0.858 | 0.142 | 0.9429 | 0.9 | 0.1 | 0.9582 | 0.776 | 0.23 | 0.9034 | 0.864 | 0.136 | 0.9440 | 0.84 | 0.16 | 0.9355 | 0.694 | 0.326 | 0.8566 | 0.724 | 0.28 | 0.8860 | 0.846 | 0.156 | 0.9360 | 0.866 | 0.136 | 0.9430 | 0.8251 | 0.4282 | 0.1786 | 0.9253 | 1.0 | 0.0 | nan |
| 0.3304 | 0.7507 | 14000 | 0.874 | 0.126 | 0.9500 | 0.78 | 0.222 | 0.9062 | 0.908 | 0.094 | 0.9585 | 0.868 | 0.134 | 0.9448 | 0.854 | 0.146 | 0.9430 | 0.892 | 0.11 | 0.9530 | 0.692 | 0.324 | 0.8582 | 0.856 | 0.146 | 0.9371 | 0.874 | 0.126 | 0.9493 | 0.914 | 0.086 | 0.9642 | 0.776 | 0.226 | 0.9053 | 0.854 | 0.146 | 0.9404 | 0.852 | 0.148 | 0.9390 | 0.682 | 0.338 | 0.8531 | 0.718 | 0.288 | 0.8796 | 0.834 | 0.168 | 0.9307 | 0.88 | 0.122 | 0.9482 | 0.8324 | 0.4153 | 0.1710 | 0.9283 | 1.0 | 0.0 | nan |
| 0.4641 | 0.7775 | 14500 | 0.874 | 0.126 | 0.9490 | 0.788 | 0.214 | 0.9092 | 0.904 | 0.098 | 0.9569 | 0.876 | 0.126 | 0.9472 | 0.878 | 0.122 | 0.9513 | 0.898 | 0.104 | 0.9554 | 0.694 | 0.322 | 0.8567 | 0.85 | 0.152 | 0.9346 | 0.872 | 0.128 | 0.9481 | 0.918 | 0.082 | 0.9655 | 0.782 | 0.22 | 0.9090 | 0.866 | 0.134 | 0.9448 | 0.856 | 0.144 | 0.9409 | 0.708 | 0.31 | 0.8619 | 0.73 | 0.274 | 0.8878 | 0.848 | 0.154 | 0.9369 | 0.88 | 0.122 | 0.9482 | 0.8390 | 0.4056 | 0.1641 | 0.9309 | 1.0 | 0.0 | nan |
| 0.367 | 0.8043 | 15000 | 0.87 | 0.13 | 0.9473 | 0.778 | 0.224 | 0.9056 | 0.89 | 0.112 | 0.9514 | 0.874 | 0.128 | 0.9464 | 0.87 | 0.13 | 0.9484 | 0.898 | 0.104 | 0.9555 | 0.692 | 0.324 | 0.8567 | 0.844 | 0.158 | 0.9333 | 0.864 | 0.136 | 0.9450 | 0.918 | 0.082 | 0.9658 | 0.792 | 0.212 | 0.9121 | 0.866 | 0.134 | 0.9448 | 0.848 | 0.152 | 0.9385 | 0.7 | 0.32 | 0.8586 | 0.726 | 0.28 | 0.8834 | 0.848 | 0.154 | 0.9368 | 0.872 | 0.13 | 0.9450 | 0.8348 | 0.4090 | 0.1687 | 0.9292 | 1.0 | 0.0 | nan |
| 0.4378 | 0.8311 | 15500 | 0.874 | 0.126 | 0.9489 | 0.784 | 0.218 | 0.9086 | 0.896 | 0.106 | 0.9537 | 0.884 | 0.118 | 0.9503 | 0.878 | 0.124 | 0.9490 | 0.9 | 0.102 | 0.9561 | 0.69 | 0.326 | 0.8541 | 0.854 | 0.148 | 0.9368 | 0.87 | 0.13 | 0.9474 | 0.914 | 0.086 | 0.9640 | 0.798 | 0.204 | 0.9155 | 0.868 | 0.132 | 0.9458 | 0.852 | 0.148 | 0.9398 | 0.682 | 0.336 | 0.8558 | 0.736 | 0.268 | 0.8895 | 0.852 | 0.15 | 0.9384 | 0.88 | 0.122 | 0.9482 | 0.8384 | 0.4057 | 0.1648 | 0.9308 | 1.0 | 0.0 | nan |
| 0.4411 | 0.8579 | 16000 | 0.87 | 0.13 | 0.9479 | 0.778 | 0.224 | 0.9065 | 0.896 | 0.104 | 0.9562 | 0.882 | 0.12 | 0.9495 | 0.87 | 0.132 | 0.9462 | 0.894 | 0.108 | 0.9536 | 0.686 | 0.334 | 0.8512 | 0.854 | 0.148 | 0.9370 | 0.87 | 0.13 | 0.9477 | 0.922 | 0.078 | 0.9674 | 0.8 | 0.202 | 0.9166 | 0.862 | 0.138 | 0.9431 | 0.862 | 0.138 | 0.9437 | 0.692 | 0.326 | 0.8593 | 0.73 | 0.276 | 0.8863 | 0.85 | 0.152 | 0.9377 | 0.878 | 0.126 | 0.9452 | 0.8375 | 0.4046 | 0.1661 | 0.9302 | 1.0 | 0.0 | nan |
| 0.3288 | 0.8847 | 16500 | 0.876 | 0.124 | 0.9499 | 0.776 | 0.226 | 0.9055 | 0.906 | 0.094 | 0.9600 | 0.884 | 0.118 | 0.9502 | 0.876 | 0.126 | 0.9479 | 0.896 | 0.106 | 0.9544 | 0.684 | 0.334 | 0.8514 | 0.856 | 0.146 | 0.9371 | 0.868 | 0.132 | 0.9466 | 0.924 | 0.076 | 0.9682 | 0.812 | 0.19 | 0.9203 | 0.864 | 0.136 | 0.9439 | 0.866 | 0.134 | 0.9449 | 0.692 | 0.326 | 0.8572 | 0.74 | 0.266 | 0.8886 | 0.852 | 0.15 | 0.9380 | 0.88 | 0.122 | 0.9481 | 0.8407 | 0.4008 | 0.1626 | 0.9313 | 1.0 | 0.0 | nan |
| 0.3776 | 0.9115 | 17000 | 0.866 | 0.134 | 0.9463 | 0.778 | 0.224 | 0.9068 | 0.896 | 0.106 | 0.9538 | 0.882 | 0.12 | 0.9496 | 0.868 | 0.134 | 0.9452 | 0.888 | 0.114 | 0.9518 | 0.68 | 0.338 | 0.8512 | 0.85 | 0.152 | 0.9353 | 0.868 | 0.132 | 0.9468 | 0.918 | 0.082 | 0.9658 | 0.796 | 0.206 | 0.9151 | 0.87 | 0.13 | 0.9467 | 0.86 | 0.14 | 0.9427 | 0.69 | 0.33 | 0.8564 | 0.744 | 0.26 | 0.8927 | 0.85 | 0.152 | 0.9374 | 0.88 | 0.122 | 0.9484 | 0.8368 | 0.4075 | 0.1667 | 0.9301 | 1.0 | 0.0 | nan |
| 0.2721 | 0.9383 | 17500 | 0.874 | 0.126 | 0.9491 | 0.774 | 0.228 | 0.9044 | 0.902 | 0.098 | 0.9585 | 0.88 | 0.122 | 0.9485 | 0.874 | 0.128 | 0.9473 | 0.894 | 0.108 | 0.9537 | 0.684 | 0.334 | 0.8528 | 0.85 | 0.152 | 0.9347 | 0.868 | 0.132 | 0.9468 | 0.92 | 0.08 | 0.9665 | 0.798 | 0.204 | 0.9155 | 0.868 | 0.132 | 0.9457 | 0.856 | 0.144 | 0.9408 | 0.684 | 0.334 | 0.8552 | 0.74 | 0.264 | 0.8919 | 0.848 | 0.154 | 0.9366 | 0.882 | 0.12 | 0.9491 | 0.8375 | 0.4012 | 0.1658 | 0.9303 | 1.0 | 0.0 | nan |
| 0.4061 | 0.9651 | 18000 | 0.87 | 0.13 | 0.9475 | 0.776 | 0.226 | 0.9051 | 0.898 | 0.104 | 0.9543 | 0.884 | 0.118 | 0.9503 | 0.872 | 0.13 | 0.9464 | 0.894 | 0.108 | 0.9539 | 0.692 | 0.326 | 0.8560 | 0.848 | 0.154 | 0.9338 | 0.866 | 0.134 | 0.9459 | 0.92 | 0.08 | 0.9665 | 0.806 | 0.196 | 0.9185 | 0.862 | 0.138 | 0.9430 | 0.87 | 0.13 | 0.9464 | 0.688 | 0.328 | 0.8587 | 0.74 | 0.264 | 0.8918 | 0.846 | 0.156 | 0.9360 | 0.882 | 0.12 | 0.9490 | 0.8385 | 0.3980 | 0.1647 | 0.9307 | 1.0 | 0.0 | nan |
| 0.3829 | 0.9920 | 18500 | 0.872 | 0.128 | 0.9486 | 0.778 | 0.224 | 0.9065 | 0.896 | 0.106 | 0.9538 | 0.88 | 0.122 | 0.9489 | 0.872 | 0.13 | 0.9466 | 0.884 | 0.118 | 0.9502 | 0.684 | 0.334 | 0.8524 | 0.846 | 0.156 | 0.9335 | 0.87 | 0.13 | 0.9476 | 0.922 | 0.078 | 0.9673 | 0.8 | 0.202 | 0.9166 | 0.866 | 0.134 | 0.9447 | 0.872 | 0.128 | 0.9473 | 0.696 | 0.324 | 0.8567 | 0.742 | 0.262 | 0.8930 | 0.85 | 0.152 | 0.9375 | 0.872 | 0.13 | 0.9452 | 0.8378 | 0.4031 | 0.1656 | 0.9304 | 1.0 | 0.0 | nan |
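
The evaluation code is not included in the card. As a point of reference, the QWK, MAE, and accuracy columns above could be computed from integer toxicity labels with scikit-learn along these lines; the label values below are hypothetical.

```python
# Sketch of the QWK / MAE / accuracy metrics, assuming integer toxicity levels.
from sklearn.metrics import accuracy_score, cohen_kappa_score, mean_absolute_error

y_true = [0, 1, 2, 2, 3]  # hypothetical gold toxicity levels
y_pred = [0, 1, 2, 3, 3]  # hypothetical model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mae = mean_absolute_error(y_true, y_pred)
acc = accuracy_score(y_true, y_pred)
print(f"QWK={qwk:.4f} MAE={mae:.4f} Accuracy={acc:.4f}")
```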

Framework versions

  • Transformers 4.50.0
  • PyTorch 2.3.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.21.1
Model size

  • 0.6B parameters (F32, safetensors)