---
license: apache-2.0
base_model: Professor/Plant_Classification_model_vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: Plant_Classification_model_vit-base-patch16-224-in21k
    results: []
---

# Plant_Classification_model_vit-base-patch16-224-in21k

This model is a fine-tuned version of Professor/Plant_Classification_model_vit-base-patch16-224-in21k on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.5046
- Accuracy: 0.7388

## Model description

More information needed

## Intended uses & limitations

More information needed
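
The card documents no usage; as an illustration only, here is a minimal single-image inference sketch, assuming the checkpoint is published on the Hugging Face Hub under the id given in the card metadata. The helper name `classify_plant` is hypothetical, and imports are deferred inside the function so the sketch can be defined without `transformers`, `torch`, or `Pillow` installed.

```python
def classify_plant(image_path,
                   model_id="Professor/Plant_Classification_model_vit-base-patch16-224-in21k"):
    """Return the top predicted label for one image file.

    The model_id default is taken from this card's metadata; whether the
    checkpoint is actually downloadable under that id is an assumption.
    """
    # Deferred imports: the heavy dependencies are only needed at call time.
    from transformers import AutoImageProcessor, AutoModelForImageClassification
    from PIL import Image
    import torch

    processor = AutoImageProcessor.from_pretrained(model_id)
    model = AutoModelForImageClassification.from_pretrained(model_id)

    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return model.config.id2label[logits.argmax(-1).item()]
```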

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 200
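
The scheduler settings describe a linear warmup followed by linear decay. A pure-Python sketch of the per-step learning rate, assuming the schedule matches `transformers`' `get_linear_schedule_with_warmup` and the 65 optimizer steps per epoch visible in the results table (200 × 65 = 13,000 total steps):

```python
def lr_at_step(step, base_lr=1e-3, warmup_steps=50, total_steps=13_000):
    """Linear warmup from 0 to base_lr over warmup_steps,
    then linear decay back to 0 at total_steps.

    base_lr, warmup_steps, and num_epochs come from this card;
    total_steps = 200 epochs * 65 steps/epoch is inferred from
    the step column of the results table.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(lr_at_step(25))      # mid-warmup: 0.0005
print(lr_at_step(50))      # peak learning rate: 0.001
print(lr_at_step(13_000))  # end of training: 0.0
```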

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| No log        | 1.0   | 65    | 1.7899          | 0.6140   |
| No log        | 2.0   | 130   | 1.6077          | 0.6667   |
| No log        | 3.0   | 195   | 1.6408          | 0.6511   |
| No log        | 4.0   | 260   | 1.6168          | 0.6569   |
| No log        | 5.0   | 325   | 1.5779          | 0.6784   |
| No log        | 6.0   | 390   | 1.6319          | 0.6686   |
| No log        | 7.0   | 455   | 1.5388          | 0.6940   |
| 1.4165        | 8.0   | 520   | 1.7726          | 0.6121   |
| 1.4165        | 9.0   | 585   | 1.6132          | 0.6764   |
| 1.4165        | 10.0  | 650   | 1.6917          | 0.6550   |
| 1.4165        | 11.0  | 715   | 1.6451          | 0.6647   |
| 1.4165        | 12.0  | 780   | 1.6586          | 0.6706   |
| 1.4165        | 13.0  | 845   | 1.5735          | 0.6998   |
| 1.4165        | 14.0  | 910   | 1.6301          | 0.6667   |
| 1.4165        | 15.0  | 975   | 1.5863          | 0.7018   |
| 0.9754        | 16.0  | 1040  | 1.6891          | 0.6608   |
| 0.9754        | 17.0  | 1105  | 1.6232          | 0.6823   |
| 0.9754        | 18.0  | 1170  | 1.6181          | 0.6940   |
| 0.9754        | 19.0  | 1235  | 1.6600          | 0.6803   |
| 0.9754        | 20.0  | 1300  | 1.6032          | 0.6823   |
| 0.9754        | 21.0  | 1365  | 1.5761          | 0.6901   |
| 0.9754        | 22.0  | 1430  | 1.5992          | 0.6998   |
| 0.9754        | 23.0  | 1495  | 1.7566          | 0.6335   |
| 0.8739        | 24.0  | 1560  | 1.6459          | 0.6706   |
| 0.8739        | 25.0  | 1625  | 1.7036          | 0.6530   |
| 0.8739        | 26.0  | 1690  | 1.6099          | 0.6998   |
| 0.8739        | 27.0  | 1755  | 1.5543          | 0.7154   |
| 0.8739        | 28.0  | 1820  | 1.6657          | 0.6686   |
| 0.8739        | 29.0  | 1885  | 1.6664          | 0.6764   |
| 0.8739        | 30.0  | 1950  | 1.7176          | 0.6589   |
| 0.8336        | 31.0  | 2015  | 1.7594          | 0.6374   |
| 0.8336        | 32.0  | 2080  | 1.7652          | 0.6394   |
| 0.8336        | 33.0  | 2145  | 1.6813          | 0.6706   |
| 0.8336        | 34.0  | 2210  | 1.6347          | 0.6823   |
| 0.8336        | 35.0  | 2275  | 1.6370          | 0.6959   |
| 0.8336        | 36.0  | 2340  | 1.5888          | 0.7096   |
| 0.8336        | 37.0  | 2405  | 1.6533          | 0.6901   |
| 0.8336        | 38.0  | 2470  | 1.7444          | 0.6550   |
| 0.8164        | 39.0  | 2535  | 1.6537          | 0.6784   |
| 0.8164        | 40.0  | 2600  | 1.6352          | 0.6940   |
| 0.8164        | 41.0  | 2665  | 1.6136          | 0.7037   |
| 0.8164        | 42.0  | 2730  | 1.6196          | 0.6979   |
| 0.8164        | 43.0  | 2795  | 1.6115          | 0.6901   |
| 0.8164        | 44.0  | 2860  | 1.6563          | 0.6842   |
| 0.8164        | 45.0  | 2925  | 1.6126          | 0.7018   |
| 0.8164        | 46.0  | 2990  | 1.6695          | 0.6745   |
| 0.8011        | 47.0  | 3055  | 1.6163          | 0.6881   |
| 0.8011        | 48.0  | 3120  | 1.6003          | 0.7076   |
| 0.8011        | 49.0  | 3185  | 1.6851          | 0.6647   |
| 0.8011        | 50.0  | 3250  | 1.6148          | 0.7018   |
| 0.8011        | 51.0  | 3315  | 1.6130          | 0.6920   |
| 0.8011        | 52.0  | 3380  | 1.6680          | 0.6784   |
| 0.8011        | 53.0  | 3445  | 1.6761          | 0.6823   |
| 0.8089        | 54.0  | 3510  | 1.7472          | 0.6413   |
| 0.8089        | 55.0  | 3575  | 1.7329          | 0.6491   |
| 0.8089        | 56.0  | 3640  | 1.7198          | 0.6608   |
| 0.8089        | 57.0  | 3705  | 1.6746          | 0.6842   |
| 0.8089        | 58.0  | 3770  | 1.5865          | 0.6979   |
| 0.8089        | 59.0  | 3835  | 1.5857          | 0.7096   |
| 0.8089        | 60.0  | 3900  | 1.6424          | 0.6940   |
| 0.8089        | 61.0  | 3965  | 1.6277          | 0.6959   |
| 0.793         | 62.0  | 4030  | 1.6282          | 0.6920   |
| 0.793         | 63.0  | 4095  | 1.6081          | 0.7037   |
| 0.793         | 64.0  | 4160  | 1.5735          | 0.7173   |
| 0.793         | 65.0  | 4225  | 1.5931          | 0.7115   |
| 0.793         | 66.0  | 4290  | 1.5654          | 0.7115   |
| 0.793         | 67.0  | 4355  | 1.6955          | 0.6764   |
| 0.793         | 68.0  | 4420  | 1.6564          | 0.6862   |
| 0.793         | 69.0  | 4485  | 1.6102          | 0.7018   |
| 0.7796        | 70.0  | 4550  | 1.6658          | 0.6784   |
| 0.7796        | 71.0  | 4615  | 1.6572          | 0.6745   |
| 0.7796        | 72.0  | 4680  | 1.5700          | 0.7115   |
| 0.7796        | 73.0  | 4745  | 1.6323          | 0.6823   |
| 0.7796        | 74.0  | 4810  | 1.6547          | 0.6803   |
| 0.7796        | 75.0  | 4875  | 1.5867          | 0.7096   |
| 0.7796        | 76.0  | 4940  | 1.6341          | 0.6940   |
| 0.7821        | 77.0  | 5005  | 1.5601          | 0.7212   |
| 0.7821        | 78.0  | 5070  | 1.5888          | 0.7154   |
| 0.7821        | 79.0  | 5135  | 1.6283          | 0.6998   |
| 0.7821        | 80.0  | 5200  | 1.5526          | 0.7173   |
| 0.7821        | 81.0  | 5265  | 1.6215          | 0.6940   |
| 0.7821        | 82.0  | 5330  | 1.6607          | 0.6862   |
| 0.7821        | 83.0  | 5395  | 1.6475          | 0.6823   |
| 0.7821        | 84.0  | 5460  | 1.6054          | 0.7096   |
| 0.7749        | 85.0  | 5525  | 1.6370          | 0.6959   |
| 0.7749        | 86.0  | 5590  | 1.6143          | 0.6998   |
| 0.7749        | 87.0  | 5655  | 1.6204          | 0.6920   |
| 0.7749        | 88.0  | 5720  | 1.6927          | 0.6803   |
| 0.7749        | 89.0  | 5785  | 1.6856          | 0.6784   |
| 0.7749        | 90.0  | 5850  | 1.6473          | 0.6959   |
| 0.7749        | 91.0  | 5915  | 1.6356          | 0.6940   |
| 0.7749        | 92.0  | 5980  | 1.6449          | 0.6881   |
| 0.7741        | 93.0  | 6045  | 1.6522          | 0.6940   |
| 0.7741        | 94.0  | 6110  | 1.7063          | 0.6686   |
| 0.7741        | 95.0  | 6175  | 1.7064          | 0.6803   |
| 0.7741        | 96.0  | 6240  | 1.6437          | 0.6998   |
| 0.7741        | 97.0  | 6305  | 1.6400          | 0.6881   |
| 0.7741        | 98.0  | 6370  | 1.6907          | 0.6706   |
| 0.7741        | 99.0  | 6435  | 1.6791          | 0.6803   |
| 0.7721        | 100.0 | 6500  | 1.6895          | 0.6803   |
| 0.7721        | 101.0 | 6565  | 1.6132          | 0.6998   |
| 0.7721        | 102.0 | 6630  | 1.6058          | 0.7037   |
| 0.7721        | 103.0 | 6695  | 1.6045          | 0.7154   |
| 0.7721        | 104.0 | 6760  | 1.5828          | 0.7173   |
| 0.7721        | 105.0 | 6825  | 1.6337          | 0.6920   |
| 0.7721        | 106.0 | 6890  | 1.6262          | 0.6979   |
| 0.7721        | 107.0 | 6955  | 1.6166          | 0.6979   |
| 0.7576        | 108.0 | 7020  | 1.5869          | 0.7154   |
| 0.7576        | 109.0 | 7085  | 1.6349          | 0.6959   |
| 0.7576        | 110.0 | 7150  | 1.5977          | 0.6998   |
| 0.7576        | 111.0 | 7215  | 1.6054          | 0.7018   |
| 0.7576        | 112.0 | 7280  | 1.6029          | 0.6979   |
| 0.7576        | 113.0 | 7345  | 1.6131          | 0.6959   |
| 0.7576        | 114.0 | 7410  | 1.6488          | 0.6803   |
| 0.7576        | 115.0 | 7475  | 1.6223          | 0.6979   |
| 0.7568        | 116.0 | 7540  | 1.6581          | 0.6745   |
| 0.7568        | 117.0 | 7605  | 1.5714          | 0.7115   |
| 0.7568        | 118.0 | 7670  | 1.5629          | 0.7193   |
| 0.7568        | 119.0 | 7735  | 1.5841          | 0.7018   |
| 0.7568        | 120.0 | 7800  | 1.6263          | 0.6881   |
| 0.7568        | 121.0 | 7865  | 1.5982          | 0.7018   |
| 0.7568        | 122.0 | 7930  | 1.6473          | 0.6881   |
| 0.7568        | 123.0 | 7995  | 1.6026          | 0.7018   |
| 0.7531        | 124.0 | 8060  | 1.5897          | 0.7057   |
| 0.7531        | 125.0 | 8125  | 1.6494          | 0.6784   |
| 0.7531        | 126.0 | 8190  | 1.5711          | 0.7173   |
| 0.7531        | 127.0 | 8255  | 1.5992          | 0.7076   |
| 0.7531        | 128.0 | 8320  | 1.5605          | 0.7154   |
| 0.7531        | 129.0 | 8385  | 1.5683          | 0.7115   |
| 0.7531        | 130.0 | 8450  | 1.6044          | 0.7096   |
| 0.7492        | 131.0 | 8515  | 1.5654          | 0.7232   |
| 0.7492        | 132.0 | 8580  | 1.6027          | 0.7057   |
| 0.7492        | 133.0 | 8645  | 1.6116          | 0.6998   |
| 0.7492        | 134.0 | 8710  | 1.6030          | 0.7057   |
| 0.7492        | 135.0 | 8775  | 1.6039          | 0.7018   |
| 0.7492        | 136.0 | 8840  | 1.5695          | 0.7154   |
| 0.7492        | 137.0 | 8905  | 1.6011          | 0.7018   |
| 0.7492        | 138.0 | 8970  | 1.6054          | 0.6959   |
| 0.7457        | 139.0 | 9035  | 1.6654          | 0.6842   |
| 0.7457        | 140.0 | 9100  | 1.6112          | 0.7076   |
| 0.7457        | 141.0 | 9165  | 1.5575          | 0.7251   |
| 0.7457        | 142.0 | 9230  | 1.5775          | 0.7173   |
| 0.7457        | 143.0 | 9295  | 1.5442          | 0.7310   |
| 0.7457        | 144.0 | 9360  | 1.5465          | 0.7232   |
| 0.7457        | 145.0 | 9425  | 1.5561          | 0.7251   |
| 0.7457        | 146.0 | 9490  | 1.5677          | 0.7193   |
| 0.7453        | 147.0 | 9555  | 1.5790          | 0.7135   |
| 0.7453        | 148.0 | 9620  | 1.5981          | 0.7057   |
| 0.7453        | 149.0 | 9685  | 1.5510          | 0.7232   |
| 0.7453        | 150.0 | 9750  | 1.5833          | 0.7154   |
| 0.7453        | 151.0 | 9815  | 1.5798          | 0.7212   |
| 0.7453        | 152.0 | 9880  | 1.5525          | 0.7232   |
| 0.7453        | 153.0 | 9945  | 1.5433          | 0.7212   |
| 0.7416        | 154.0 | 10010 | 1.5549          | 0.7232   |
| 0.7416        | 155.0 | 10075 | 1.5626          | 0.7173   |
| 0.7416        | 156.0 | 10140 | 1.5673          | 0.7212   |
| 0.7416        | 157.0 | 10205 | 1.5482          | 0.7290   |
| 0.7416        | 158.0 | 10270 | 1.5333          | 0.7290   |
| 0.7416        | 159.0 | 10335 | 1.5338          | 0.7271   |
| 0.7416        | 160.0 | 10400 | 1.5480          | 0.7193   |
| 0.7416        | 161.0 | 10465 | 1.5431          | 0.7271   |
| 0.7414        | 162.0 | 10530 | 1.5469          | 0.7251   |
| 0.7414        | 163.0 | 10595 | 1.5504          | 0.7232   |
| 0.7414        | 164.0 | 10660 | 1.5850          | 0.7173   |
| 0.7414        | 165.0 | 10725 | 1.5352          | 0.7251   |
| 0.7414        | 166.0 | 10790 | 1.5480          | 0.7290   |
| 0.7414        | 167.0 | 10855 | 1.5736          | 0.7173   |
| 0.7414        | 168.0 | 10920 | 1.5507          | 0.7251   |
| 0.7414        | 169.0 | 10985 | 1.5347          | 0.7251   |
| 0.7411        | 170.0 | 11050 | 1.5355          | 0.7349   |
| 0.7411        | 171.0 | 11115 | 1.5273          | 0.7388   |
| 0.7411        | 172.0 | 11180 | 1.5286          | 0.7349   |
| 0.7411        | 173.0 | 11245 | 1.5664          | 0.7212   |
| 0.7411        | 174.0 | 11310 | 1.5340          | 0.7310   |
| 0.7411        | 175.0 | 11375 | 1.5311          | 0.7388   |
| 0.7411        | 176.0 | 11440 | 1.5405          | 0.7329   |
| 0.7395        | 177.0 | 11505 | 1.5261          | 0.7388   |
| 0.7395        | 178.0 | 11570 | 1.5289          | 0.7388   |
| 0.7395        | 179.0 | 11635 | 1.5303          | 0.7368   |
| 0.7395        | 180.0 | 11700 | 1.5323          | 0.7349   |
| 0.7395        | 181.0 | 11765 | 1.5374          | 0.7310   |
| 0.7395        | 182.0 | 11830 | 1.5357          | 0.7349   |
| 0.7395        | 183.0 | 11895 | 1.5442          | 0.7310   |
| 0.7395        | 184.0 | 11960 | 1.5453          | 0.7310   |
| 0.7396        | 185.0 | 12025 | 1.5130          | 0.7388   |
| 0.7396        | 186.0 | 12090 | 1.5179          | 0.7407   |
| 0.7396        | 187.0 | 12155 | 1.5100          | 0.7427   |
| 0.7396        | 188.0 | 12220 | 1.5046          | 0.7388   |
| 0.7396        | 189.0 | 12285 | 1.5058          | 0.7407   |
| 0.7396        | 190.0 | 12350 | 1.5078          | 0.7368   |
| 0.7396        | 191.0 | 12415 | 1.5088          | 0.7427   |
| 0.7396        | 192.0 | 12480 | 1.5079          | 0.7407   |
| 0.7375        | 193.0 | 12545 | 1.5072          | 0.7446   |
| 0.7375        | 194.0 | 12610 | 1.5051          | 0.7446   |
| 0.7375        | 195.0 | 12675 | 1.5094          | 0.7388   |
| 0.7375        | 196.0 | 12740 | 1.5074          | 0.7427   |
| 0.7375        | 197.0 | 12805 | 1.5067          | 0.7427   |
| 0.7375        | 198.0 | 12870 | 1.5097          | 0.7388   |
| 0.7375        | 199.0 | 12935 | 1.5095          | 0.7388   |
| 0.7373        | 200.0 | 13000 | 1.5102          | 0.7388   |
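
Validation accuracy peaks at 0.7446 (epochs 193 and 194) rather than at the final epoch, and the headline result (loss 1.5046, accuracy 0.7388) matches the epoch-188 row, which suggests the lowest-loss checkpoint is the one that was kept. A small sketch of checkpoint selection over a few rows copied verbatim from the table:

```python
# (epoch, validation_loss, accuracy) rows copied from the results table above
rows = [
    (185, 1.5130, 0.7388),
    (187, 1.5100, 0.7427),
    (188, 1.5046, 0.7388),
    (193, 1.5072, 0.7446),
    (194, 1.5051, 0.7446),
    (200, 1.5102, 0.7388),
]

# max/min keep the earliest row on ties
best_by_accuracy = max(rows, key=lambda r: r[2])
best_by_loss = min(rows, key=lambda r: r[1])

print(best_by_accuracy)  # (193, 1.5072, 0.7446)
print(best_by_loss)      # (188, 1.5046, 0.7388) — matches the reported eval result
```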

### Framework versions

- Transformers 4.35.2
- PyTorch 2.0.0
- Datasets 2.15.0
- Tokenizers 0.15.0
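
To reproduce this environment, the versions above can be pinned in a `requirements.txt` (a sketch; `torch` is the PyPI package name for PyTorch, and exact wheel availability depends on platform and Python version):

```
transformers==4.35.2
torch==2.0.0
datasets==2.15.0
tokenizers==0.15.0
```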