# chinese-roberta-wwm-ext-large-lora-crf-ner
This model is a fine-tuned version of [hfl/chinese-roberta-wwm-ext-large](https://huggingface.co/hfl/chinese-roberta-wwm-ext-large) on an unspecified dataset. It achieves the following results on the evaluation set (a loading sketch follows this list):
- Loss: 0.7867
- Precision: 0.6482
- Recall: 0.7372
- F1: 0.6898
- Accuracy: 0.9347
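
The card does not include a usage example. The sketch below shows one plausible way to load the model with `peft`; the adapter id and label count are assumptions, and the CRF decoding layer named in the model title is a custom head that is not reconstructed here.

```python
# A minimal loading sketch, assuming this repo ships LoRA adapter weights for
# hfl/chinese-roberta-wwm-ext-large. The adapter id and label count are
# assumptions; the CRF head from the model title is a custom component that
# this sketch does not reconstruct.
from transformers import AutoTokenizer, AutoModelForTokenClassification
from peft import PeftModel

base_id = "hfl/chinese-roberta-wwm-ext-large"
adapter_id = "chinese-roberta-wwm-ext-large-lora-crf-ner"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForTokenClassification.from_pretrained(base_id)  # num_labels unknown
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

inputs = tokenizer("北京是中国的首都。", return_tensors="pt")
logits = model(**inputs).logits  # per-token emission scores; CRF decoding not shown
```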
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them follows this list):
- learning_rate: 0.001
- train_batch_size: 28
- eval_batch_size: 56
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
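
For reference, a `TrainingArguments` sketch that mirrors the settings above; the dataset, the LoRA configuration, and the CRF head wiring are not documented in this card and are therefore omitted.

```python
# A sketch mirroring the listed hyperparameters. evaluation_strategy="epoch"
# is an assumption inferred from the per-epoch rows in the results table below.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="chinese-roberta-wwm-ext-large-lora-crf-ner",
    learning_rate=1e-3,
    per_device_train_batch_size=28,
    per_device_eval_batch_size=56,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default optimizer.
)
```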
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.7602 | 1.0 | 72 | 0.3759 | 0.4190 | 0.5808 | 0.4868 | 0.9133 |
| 0.3266 | 2.0 | 144 | 0.3221 | 0.5110 | 0.6772 | 0.5825 | 0.9262 |
| 0.263 | 3.0 | 216 | 0.3061 | 0.5373 | 0.6823 | 0.6012 | 0.9308 |
| 0.2355 | 4.0 | 288 | 0.3144 | 0.5385 | 0.6908 | 0.6052 | 0.9277 |
| 0.2042 | 5.0 | 360 | 0.3146 | 0.5690 | 0.7007 | 0.6280 | 0.9320 |
| 0.1856 | 6.0 | 432 | 0.3162 | 0.5676 | 0.6843 | 0.6205 | 0.9300 |
| 0.1644 | 7.0 | 504 | 0.3303 | 0.5810 | 0.7208 | 0.6434 | 0.9336 |
| 0.1536 | 8.0 | 576 | 0.3301 | 0.5851 | 0.7069 | 0.6403 | 0.9337 |
| 0.135 | 9.0 | 648 | 0.3565 | 0.6023 | 0.7072 | 0.6505 | 0.9335 |
| 0.1195 | 10.0 | 720 | 0.3676 | 0.5960 | 0.7276 | 0.6553 | 0.9333 |
| 0.1122 | 11.0 | 792 | 0.3723 | 0.5914 | 0.7256 | 0.6517 | 0.9320 |
| 0.0991 | 12.0 | 864 | 0.3771 | 0.6068 | 0.7115 | 0.6550 | 0.9351 |
| 0.0876 | 13.0 | 936 | 0.3982 | 0.6044 | 0.7132 | 0.6543 | 0.9327 |
| 0.0838 | 14.0 | 1008 | 0.4116 | 0.6081 | 0.7236 | 0.6608 | 0.9345 |
| 0.0786 | 15.0 | 1080 | 0.4065 | 0.6173 | 0.7268 | 0.6676 | 0.9344 |
| 0.0712 | 16.0 | 1152 | 0.4272 | 0.5976 | 0.7155 | 0.6512 | 0.9315 |
| 0.0725 | 17.0 | 1224 | 0.4340 | 0.5970 | 0.7324 | 0.6578 | 0.9308 |
| 0.0695 | 18.0 | 1296 | 0.4482 | 0.6177 | 0.7226 | 0.6660 | 0.9328 |
| 0.0639 | 19.0 | 1368 | 0.4574 | 0.6104 | 0.7251 | 0.6628 | 0.9310 |
| 0.0605 | 20.0 | 1440 | 0.4680 | 0.6105 | 0.7329 | 0.6661 | 0.9309 |
| 0.0556 | 21.0 | 1512 | 0.4534 | 0.6195 | 0.7316 | 0.6709 | 0.9347 |
| 0.049 | 22.0 | 1584 | 0.4726 | 0.6120 | 0.7195 | 0.6614 | 0.9320 |
| 0.0456 | 23.0 | 1656 | 0.4810 | 0.6283 | 0.7281 | 0.6745 | 0.9340 |
| 0.0407 | 24.0 | 1728 | 0.5079 | 0.6373 | 0.7258 | 0.6787 | 0.9332 |
| 0.045 | 25.0 | 1800 | 0.5099 | 0.6133 | 0.7278 | 0.6657 | 0.9322 |
| 0.0376 | 26.0 | 1872 | 0.5292 | 0.6173 | 0.7319 | 0.6697 | 0.9326 |
| 0.0375 | 27.0 | 1944 | 0.5393 | 0.6171 | 0.7248 | 0.6667 | 0.9324 |
| 0.0352 | 28.0 | 2016 | 0.5292 | 0.6091 | 0.7258 | 0.6624 | 0.9322 |
| 0.0339 | 29.0 | 2088 | 0.5431 | 0.6148 | 0.7135 | 0.6605 | 0.9320 |
| 0.0318 | 30.0 | 2160 | 0.5411 | 0.6273 | 0.7213 | 0.6710 | 0.9343 |
| 0.0298 | 31.0 | 2232 | 0.5580 | 0.6227 | 0.7372 | 0.6751 | 0.9316 |
| 0.0301 | 32.0 | 2304 | 0.5587 | 0.6248 | 0.7223 | 0.6700 | 0.9324 |
| 0.0293 | 33.0 | 2376 | 0.5660 | 0.6192 | 0.7213 | 0.6664 | 0.9323 |
| 0.0267 | 34.0 | 2448 | 0.5827 | 0.6202 | 0.7306 | 0.6709 | 0.9318 |
| 0.025 | 35.0 | 2520 | 0.5887 | 0.6241 | 0.7299 | 0.6729 | 0.9323 |
| 0.0239 | 36.0 | 2592 | 0.5861 | 0.6262 | 0.7301 | 0.6742 | 0.9316 |
| 0.0227 | 37.0 | 2664 | 0.6004 | 0.6341 | 0.7341 | 0.6804 | 0.9331 |
| 0.0212 | 38.0 | 2736 | 0.6207 | 0.6353 | 0.7251 | 0.6772 | 0.9331 |
| 0.0198 | 39.0 | 2808 | 0.6226 | 0.6374 | 0.7283 | 0.6798 | 0.9329 |
| 0.0224 | 40.0 | 2880 | 0.6197 | 0.6391 | 0.7299 | 0.6815 | 0.9329 |
| 0.0196 | 41.0 | 2952 | 0.6215 | 0.6438 | 0.7314 | 0.6848 | 0.9334 |
| 0.0221 | 42.0 | 3024 | 0.5998 | 0.6366 | 0.7223 | 0.6767 | 0.9332 |
| 0.0205 | 43.0 | 3096 | 0.6069 | 0.6300 | 0.7203 | 0.6721 | 0.9332 |
| 0.017 | 44.0 | 3168 | 0.6304 | 0.6399 | 0.7261 | 0.6803 | 0.9342 |
| 0.0171 | 45.0 | 3240 | 0.6519 | 0.6370 | 0.7258 | 0.6785 | 0.9327 |
| 0.0167 | 46.0 | 3312 | 0.6418 | 0.6298 | 0.7301 | 0.6762 | 0.9339 |
| 0.0175 | 47.0 | 3384 | 0.6495 | 0.6377 | 0.7304 | 0.6809 | 0.9326 |
| 0.0171 | 48.0 | 3456 | 0.6433 | 0.6399 | 0.7351 | 0.6842 | 0.9342 |
| 0.0146 | 49.0 | 3528 | 0.6498 | 0.6454 | 0.7223 | 0.6817 | 0.9340 |
| 0.0141 | 50.0 | 3600 | 0.6427 | 0.6421 | 0.7228 | 0.6801 | 0.9343 |
| 0.0131 | 51.0 | 3672 | 0.6530 | 0.6308 | 0.7346 | 0.6788 | 0.9327 |
| 0.0136 | 52.0 | 3744 | 0.6545 | 0.6251 | 0.7190 | 0.6688 | 0.9315 |
| 0.0134 | 53.0 | 3816 | 0.6686 | 0.6334 | 0.7273 | 0.6771 | 0.9324 |
| 0.0118 | 54.0 | 3888 | 0.6773 | 0.6353 | 0.7331 | 0.6807 | 0.9336 |
| 0.0108 | 55.0 | 3960 | 0.6751 | 0.6453 | 0.7329 | 0.6863 | 0.9334 |
| 0.0119 | 56.0 | 4032 | 0.6844 | 0.6416 | 0.7296 | 0.6828 | 0.9340 |
| 0.0109 | 57.0 | 4104 | 0.6733 | 0.6351 | 0.7301 | 0.6793 | 0.9341 |
| 0.0102 | 58.0 | 4176 | 0.6876 | 0.6445 | 0.7394 | 0.6887 | 0.9344 |
| 0.0115 | 59.0 | 4248 | 0.6928 | 0.6303 | 0.7321 | 0.6774 | 0.9320 |
| 0.0109 | 60.0 | 4320 | 0.6987 | 0.6300 | 0.7246 | 0.6740 | 0.9332 |
| 0.0099 | 61.0 | 4392 | 0.6952 | 0.6402 | 0.7346 | 0.6842 | 0.9342 |
| 0.0098 | 62.0 | 4464 | 0.7020 | 0.6462 | 0.7445 | 0.6919 | 0.9338 |
| 0.0091 | 63.0 | 4536 | 0.6969 | 0.6464 | 0.7369 | 0.6887 | 0.9342 |
| 0.0082 | 64.0 | 4608 | 0.7141 | 0.6537 | 0.7409 | 0.6946 | 0.9346 |
| 0.0082 | 65.0 | 4680 | 0.7011 | 0.6427 | 0.7306 | 0.6839 | 0.9333 |
| 0.0082 | 66.0 | 4752 | 0.7264 | 0.6494 | 0.7392 | 0.6914 | 0.9339 |
| 0.0075 | 67.0 | 4824 | 0.7010 | 0.6531 | 0.7334 | 0.6909 | 0.9345 |
| 0.0072 | 68.0 | 4896 | 0.7271 | 0.6401 | 0.7349 | 0.6842 | 0.9337 |
| 0.0075 | 69.0 | 4968 | 0.7262 | 0.6471 | 0.7414 | 0.6911 | 0.9336 |
| 0.0071 | 70.0 | 5040 | 0.7196 | 0.6474 | 0.7364 | 0.6890 | 0.9342 |
| 0.008 | 71.0 | 5112 | 0.7103 | 0.6446 | 0.7379 | 0.6881 | 0.9342 |
| 0.0066 | 72.0 | 5184 | 0.7365 | 0.6534 | 0.7417 | 0.6947 | 0.9349 |
| 0.0063 | 73.0 | 5256 | 0.7411 | 0.6444 | 0.7372 | 0.6876 | 0.9341 |
| 0.0064 | 74.0 | 5328 | 0.7270 | 0.6372 | 0.7394 | 0.6845 | 0.9339 |
| 0.0063 | 75.0 | 5400 | 0.7411 | 0.6458 | 0.7399 | 0.6897 | 0.9346 |
| 0.0055 | 76.0 | 5472 | 0.7303 | 0.6449 | 0.7384 | 0.6885 | 0.9344 |
| 0.0053 | 77.0 | 5544 | 0.7524 | 0.6471 | 0.7424 | 0.6915 | 0.9343 |
| 0.0055 | 78.0 | 5616 | 0.7514 | 0.6451 | 0.7397 | 0.6892 | 0.9346 |
| 0.0046 | 79.0 | 5688 | 0.7511 | 0.6504 | 0.7394 | 0.6920 | 0.9349 |
| 0.0046 | 80.0 | 5760 | 0.7644 | 0.6422 | 0.7432 | 0.6890 | 0.9342 |
| 0.0048 | 81.0 | 5832 | 0.7580 | 0.6486 | 0.7435 | 0.6928 | 0.9347 |
| 0.0051 | 82.0 | 5904 | 0.7442 | 0.6455 | 0.7359 | 0.6878 | 0.9344 |
| 0.0046 | 83.0 | 5976 | 0.7594 | 0.6382 | 0.7417 | 0.6861 | 0.9339 |
| 0.0045 | 84.0 | 6048 | 0.7577 | 0.6476 | 0.7389 | 0.6903 | 0.9347 |
| 0.0043 | 85.0 | 6120 | 0.7583 | 0.6515 | 0.7440 | 0.6946 | 0.9350 |
| 0.0041 | 86.0 | 6192 | 0.7596 | 0.6536 | 0.7382 | 0.6933 | 0.9351 |
| 0.0034 | 87.0 | 6264 | 0.7676 | 0.6555 | 0.7412 | 0.6957 | 0.9347 |
| 0.0039 | 88.0 | 6336 | 0.7645 | 0.6520 | 0.7442 | 0.6950 | 0.9352 |
| 0.0044 | 89.0 | 6408 | 0.7652 | 0.6516 | 0.7392 | 0.6926 | 0.9348 |
| 0.0042 | 90.0 | 6480 | 0.7667 | 0.6474 | 0.7379 | 0.6897 | 0.9347 |
| 0.003 | 91.0 | 6552 | 0.7715 | 0.6458 | 0.7387 | 0.6891 | 0.9352 |
| 0.0038 | 92.0 | 6624 | 0.7796 | 0.6462 | 0.7356 | 0.6880 | 0.9351 |
| 0.003 | 93.0 | 6696 | 0.7807 | 0.6546 | 0.7387 | 0.6941 | 0.9350 |
| 0.0028 | 94.0 | 6768 | 0.7829 | 0.6503 | 0.7364 | 0.6907 | 0.9349 |
| 0.0032 | 95.0 | 6840 | 0.7838 | 0.6482 | 0.7412 | 0.6916 | 0.9349 |
| 0.0029 | 96.0 | 6912 | 0.7865 | 0.6468 | 0.7409 | 0.6907 | 0.9349 |
| 0.003 | 97.0 | 6984 | 0.7867 | 0.6470 | 0.7402 | 0.6905 | 0.9350 |
| 0.0028 | 98.0 | 7056 | 0.7878 | 0.6465 | 0.7382 | 0.6893 | 0.9348 |
| 0.003 | 99.0 | 7128 | 0.7874 | 0.6487 | 0.7379 | 0.6905 | 0.9347 |
| 0.0028 | 100.0 | 7200 | 0.7867 | 0.6482 | 0.7372 | 0.6898 | 0.9347 |
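
The Precision, Recall, and F1 columns are presumably entity-level scores and Accuracy a token-level score, as `seqeval` produces for typical `Trainer`-generated NER cards; that attribution is an assumption. A minimal example of how such numbers are computed:

```python
# A minimal sketch of entity-level metric computation with seqeval (assumed;
# the card does not state which evaluation code produced the table above).
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Hypothetical BIO-tagged references and predictions for two sentences.
y_true = [["B-PER", "I-PER", "O"], ["B-ORG", "O", "O"]]
y_pred = [["B-PER", "I-PER", "O"], ["B-LOC", "O", "O"]]

print(precision_score(y_true, y_pred))  # entity-level precision
print(recall_score(y_true, y_pred))     # entity-level recall
print(f1_score(y_true, y_pred))         # entity-level F1
print(accuracy_score(y_true, y_pred))   # token-level accuracy
```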
### Framework versions
- Transformers 4.27.3
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.2