# yolo_finetuned_fruits

This model is a fine-tuned version of hustvl/yolos-tiny on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.8611
- Map: 0.5862
- Map 50: 0.8559
- Map 75: 0.6628
- Map Small: -1.0
- Map Medium: 0.4866
- Map Large: 0.6073
- Mar 1: 0.4641
- Mar 10: 0.7152
- Mar 100: 0.7611
- Mar Small: -1.0
- Mar Medium: 0.6
- Mar Large: 0.788
- Map Banana: 0.4777
- Mar 100 Banana: 0.7243
- Map Orange: 0.5692
- Mar 100 Orange: 0.748
- Map Apple: 0.7117
- Mar 100 Apple: 0.8111
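As a sanity check, the overall Map above is the unweighted mean of the three per-class values; a minimal sketch, with the per-class APs copied from the list above:

```python
# Per-class AP values from the evaluation results above.
per_class_ap = {"banana": 0.4777, "orange": 0.5692, "apple": 0.7117}

# Overall mAP is the unweighted mean over classes.
overall_map = sum(per_class_ap.values()) / len(per_class_ap)
print(round(overall_map, 4))  # → 0.5862, matching the reported Map
```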
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
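The cosine scheduler listed above decays the learning rate from its initial value toward zero over training. A minimal sketch of standard cosine decay, assuming no warmup (the step counts come from the results table below, where each epoch is 51 steps):

```python
import math

def cosine_lr(step, total_steps, lr0=5e-05):
    """Standard cosine decay from lr0 toward 0 (no warmup assumed)."""
    return 0.5 * lr0 * (1.0 + math.cos(math.pi * step / total_steps))

total_steps = 1530  # 30 epochs x 51 steps/epoch
print(cosine_lr(0, total_steps))            # starts at 5e-05
print(cosine_lr(total_steps, total_steps))  # decays to ~0
```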
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 51 | 2.3094 | 0.0032 | 0.0108 | 0.0012 | -1.0 | 0.0006 | 0.0049 | 0.0514 | 0.121 | 0.2462 | -1.0 | 0.0167 | 0.2808 | 0.0054 | 0.3351 | 0.0011 | 0.048 | 0.0031 | 0.3556 |
| No log | 2.0 | 102 | 1.8096 | 0.0269 | 0.1103 | 0.0103 | -1.0 | 0.0042 | 0.0305 | 0.1002 | 0.195 | 0.4115 | -1.0 | 0.075 | 0.4555 | 0.0301 | 0.5243 | 0.0345 | 0.188 | 0.016 | 0.5222 |
| No log | 3.0 | 153 | 1.8119 | 0.03 | 0.099 | 0.0095 | -1.0 | 0.005 | 0.0323 | 0.0649 | 0.2336 | 0.3939 | -1.0 | 0.1 | 0.4319 | 0.0489 | 0.4865 | 0.0316 | 0.184 | 0.0096 | 0.5111 |
| No log | 4.0 | 204 | 1.7983 | 0.0501 | 0.1704 | 0.0171 | -1.0 | 0.0828 | 0.0512 | 0.1062 | 0.3072 | 0.4305 | -1.0 | 0.1 | 0.471 | 0.0511 | 0.4541 | 0.0492 | 0.404 | 0.05 | 0.4333 |
| No log | 5.0 | 255 | 1.2361 | 0.1463 | 0.2325 | 0.1539 | -1.0 | 0.2326 | 0.1641 | 0.3315 | 0.5618 | 0.6525 | -1.0 | 0.5 | 0.676 | 0.097 | 0.6486 | 0.1378 | 0.62 | 0.204 | 0.6889 |
| No log | 6.0 | 306 | 1.0533 | 0.1713 | 0.2628 | 0.1951 | -1.0 | 0.2405 | 0.1796 | 0.3315 | 0.6258 | 0.7348 | -1.0 | 0.6 | 0.754 | 0.1014 | 0.7054 | 0.1565 | 0.688 | 0.256 | 0.8111 |
| No log | 7.0 | 357 | 1.0764 | 0.2575 | 0.4103 | 0.2844 | -1.0 | 0.2974 | 0.2843 | 0.3723 | 0.611 | 0.7068 | -1.0 | 0.475 | 0.7347 | 0.1864 | 0.673 | 0.2672 | 0.692 | 0.3189 | 0.7556 |
| No log | 8.0 | 408 | 1.0170 | 0.3077 | 0.4714 | 0.348 | -1.0 | 0.3119 | 0.3328 | 0.3789 | 0.5997 | 0.6755 | -1.0 | 0.4667 | 0.6959 | 0.2122 | 0.6811 | 0.2794 | 0.612 | 0.4316 | 0.7333 |
| No log | 9.0 | 459 | 1.0347 | 0.3821 | 0.5846 | 0.4213 | -1.0 | 0.3855 | 0.418 | 0.4039 | 0.6315 | 0.6732 | -1.0 | 0.425 | 0.7022 | 0.2739 | 0.6351 | 0.3215 | 0.64 | 0.551 | 0.7444 |
| 1.2730 | 10.0 | 510 | 1.1369 | 0.3592 | 0.5588 | 0.3891 | -1.0 | 0.2356 | 0.3974 | 0.415 | 0.6019 | 0.6432 | -1.0 | 0.3 | 0.6907 | 0.2487 | 0.6135 | 0.3243 | 0.616 | 0.5046 | 0.7 |
| 1.2730 | 11.0 | 561 | 1.1140 | 0.3659 | 0.5944 | 0.4177 | -1.0 | 0.3266 | 0.4087 | 0.3804 | 0.6027 | 0.6548 | -1.0 | 0.3667 | 0.6923 | 0.2352 | 0.6243 | 0.3796 | 0.64 | 0.4831 | 0.7 |
| 1.2730 | 12.0 | 612 | 1.0854 | 0.3937 | 0.616 | 0.4196 | -1.0 | 0.2572 | 0.4359 | 0.3976 | 0.6143 | 0.6596 | -1.0 | 0.3667 | 0.7034 | 0.2747 | 0.6108 | 0.4439 | 0.668 | 0.4625 | 0.7 |
| 1.2730 | 13.0 | 663 | 0.9304 | 0.4262 | 0.6495 | 0.4878 | -1.0 | 0.3474 | 0.4593 | 0.4394 | 0.675 | 0.7342 | -1.0 | 0.525 | 0.7656 | 0.3153 | 0.7 | 0.4455 | 0.736 | 0.5179 | 0.7667 |
| 1.2730 | 14.0 | 714 | 1.0531 | 0.4829 | 0.7622 | 0.5441 | -1.0 | 0.4128 | 0.5146 | 0.4108 | 0.6526 | 0.6979 | -1.0 | 0.55 | 0.7225 | 0.3256 | 0.6351 | 0.5017 | 0.692 | 0.6213 | 0.7667 |
| 1.2730 | 15.0 | 765 | 1.0054 | 0.5187 | 0.7912 | 0.6041 | -1.0 | 0.4332 | 0.5588 | 0.4571 | 0.6767 | 0.7041 | -1.0 | 0.4833 | 0.7394 | 0.4085 | 0.6486 | 0.5486 | 0.708 | 0.599 | 0.7556 |
| 1.2730 | 16.0 | 816 | 0.9023 | 0.5074 | 0.7819 | 0.5852 | -1.0 | 0.4293 | 0.5384 | 0.4318 | 0.6843 | 0.7135 | -1.0 | 0.4917 | 0.7458 | 0.4281 | 0.673 | 0.5234 | 0.712 | 0.5708 | 0.7556 |
| 1.2730 | 17.0 | 867 | 0.9242 | 0.5574 | 0.8299 | 0.647 | -1.0 | 0.4564 | 0.5883 | 0.4601 | 0.7117 | 0.7478 | -1.0 | 0.5083 | 0.7855 | 0.4472 | 0.6973 | 0.5308 | 0.724 | 0.6941 | 0.8222 |
| 1.2730 | 18.0 | 918 | 0.8697 | 0.5834 | 0.8647 | 0.667 | -1.0 | 0.529 | 0.6092 | 0.4684 | 0.7229 | 0.7624 | -1.0 | 0.6167 | 0.789 | 0.473 | 0.7 | 0.5921 | 0.776 | 0.685 | 0.8111 |
| 1.2730 | 19.0 | 969 | 0.8877 | 0.5763 | 0.8644 | 0.6543 | -1.0 | 0.4926 | 0.6069 | 0.4809 | 0.7078 | 0.7406 | -1.0 | 0.5417 | 0.7715 | 0.4796 | 0.7081 | 0.5583 | 0.736 | 0.6912 | 0.7778 |
| 0.6854 | 20.0 | 1020 | 0.8976 | 0.5687 | 0.8245 | 0.6662 | -1.0 | 0.4684 | 0.5931 | 0.4709 | 0.7159 | 0.7415 | -1.0 | 0.6167 | 0.7635 | 0.487 | 0.7027 | 0.567 | 0.744 | 0.6523 | 0.7778 |
| 0.6854 | 21.0 | 1071 | 0.8716 | 0.5861 | 0.8264 | 0.6596 | -1.0 | 0.4524 | 0.6182 | 0.4812 | 0.7256 | 0.7691 | -1.0 | 0.625 | 0.7923 | 0.4731 | 0.7243 | 0.5766 | 0.772 | 0.7086 | 0.8111 |
| 0.6854 | 22.0 | 1122 | 0.8568 | 0.5734 | 0.8242 | 0.6524 | -1.0 | 0.4153 | 0.6095 | 0.4776 | 0.725 | 0.7574 | -1.0 | 0.575 | 0.7886 | 0.4585 | 0.7081 | 0.5549 | 0.764 | 0.7067 | 0.8 |
| 0.6854 | 23.0 | 1173 | 0.8485 | 0.5918 | 0.8382 | 0.6733 | -1.0 | 0.4939 | 0.6183 | 0.485 | 0.7348 | 0.775 | -1.0 | 0.65 | 0.7954 | 0.4658 | 0.7324 | 0.5559 | 0.748 | 0.7538 | 0.8444 |
| 0.6854 | 24.0 | 1224 | 0.8602 | 0.5811 | 0.8391 | 0.6776 | -1.0 | 0.4766 | 0.6058 | 0.4845 | 0.727 | 0.7647 | -1.0 | 0.6167 | 0.7904 | 0.4797 | 0.727 | 0.5576 | 0.756 | 0.706 | 0.8111 |
| 0.6854 | 25.0 | 1275 | 0.8444 | 0.5971 | 0.8582 | 0.6964 | -1.0 | 0.4917 | 0.6249 | 0.4619 | 0.7292 | 0.7696 | -1.0 | 0.6167 | 0.7955 | 0.4686 | 0.7378 | 0.5725 | 0.76 | 0.7502 | 0.8111 |
| 0.6854 | 26.0 | 1326 | 0.8446 | 0.5935 | 0.8578 | 0.6823 | -1.0 | 0.5159 | 0.6238 | 0.4776 | 0.7283 | 0.7707 | -1.0 | 0.6 | 0.7991 | 0.4625 | 0.7297 | 0.5817 | 0.76 | 0.7362 | 0.8222 |
| 0.6854 | 27.0 | 1377 | 0.8695 | 0.5911 | 0.8536 | 0.66 | -1.0 | 0.4866 | 0.6163 | 0.4652 | 0.7185 | 0.7617 | -1.0 | 0.6 | 0.7894 | 0.4641 | 0.7189 | 0.5699 | 0.744 | 0.7394 | 0.8222 |
| 0.6854 | 28.0 | 1428 | 0.8621 | 0.5928 | 0.8536 | 0.6787 | -1.0 | 0.4866 | 0.6159 | 0.4652 | 0.7207 | 0.7657 | -1.0 | 0.6 | 0.7936 | 0.4751 | 0.727 | 0.5742 | 0.748 | 0.7289 | 0.8222 |
| 0.6854 | 29.0 | 1479 | 0.8613 | 0.5917 | 0.856 | 0.6635 | -1.0 | 0.4866 | 0.6146 | 0.4678 | 0.7189 | 0.7648 | -1.0 | 0.6 | 0.7927 | 0.4769 | 0.7243 | 0.5705 | 0.748 | 0.7276 | 0.8222 |
| 0.5120 | 30.0 | 1530 | 0.8611 | 0.5862 | 0.8559 | 0.6628 | -1.0 | 0.4866 | 0.6073 | 0.4641 | 0.7152 | 0.7611 | -1.0 | 0.6 | 0.788 | 0.4777 | 0.7243 | 0.5692 | 0.748 | 0.7117 | 0.8111 |
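The "Map 50" and "Map 75" columns above are mean average precision computed at IoU thresholds of 0.50 and 0.75 respectively, which is why Map 50 is consistently the higher of the two. A minimal IoU sketch for axis-aligned `(x1, y1, x2, y2)` boxes (the box coordinates are illustrative, not taken from the dataset):

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# A loosely overlapping prediction: IoU = 1/7, so it would count as a
# match for neither the 0.50 nor the 0.75 threshold.
print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # → 0.142857...
```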
### Framework versions
- Transformers 5.0.0
- Pytorch 2.10.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2