# yolo_finetuned_raccoons

This model is a fine-tuned version of hustvl/yolos-tiny on an unspecified raccoon-detection dataset. It achieves the following results on the evaluation set:
- Loss: 0.7046
- Map: 0.658
- Map 50: 0.952
- Map 75: 0.7705
- Map Small: -1.0 (COCO evaluation reports -1.0 when the evaluation set contains no objects in that size range)
- Map Medium: 0.3924
- Map Large: 0.6802
- Mar 1: 0.6619
- Mar 10: 0.8119
- Mar 100: 0.8643
- Mar Small: -1.0
- Mar Medium: 0.7667
- Mar Large: 0.8718
- Map Raccoon: 0.658
- Mar 100 Raccoon: 0.8643
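These metrics follow COCO object-detection conventions: "Map 50", for example, is average precision with a prediction counted as correct when its intersection-over-union (IoU) with a ground-truth box is at least 0.5. A minimal pure-Python sketch of that IoU test (the `(x_min, y_min, x_max, y_max)` box format and the `box_iou` helper are illustrative assumptions, not part of this model's code):

```python
def box_iou(a, b):
    """IoU of two boxes given as (x_min, y_min, x_max, y_max) tuples."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Overlap 50, union 150: IoU = 1/3, so this pair would NOT count
# as a match at the 0.5 threshold used by "Map 50".
print(box_iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 50 / 150 ≈ 0.333
```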
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- num_epochs: 30
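The `cosine` scheduler decays the learning rate from its initial value toward zero over the full run; with 40 optimizer steps per epoch and 30 epochs (1200 steps, matching the step column in the results table below), the decay can be sketched as follows. This is a hand-rolled approximation without warmup; `cosine_lr` is a hypothetical helper, not a transformers API:

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-05):
    """Cosine decay from base_lr at step 0 down to 0 at total_steps."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total_steps = 30 * 40  # 30 epochs x 40 optimizer steps per epoch = 1200
print(cosine_lr(0, total_steps))    # base rate 5e-05 at the start
print(cosine_lr(600, total_steps))  # half the base rate at the midpoint
```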
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Raccoon | Mar 100 Raccoon |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 40 | 1.4924 | 0.1005 | 0.2393 | 0.0536 | -1.0 | 0.0337 | 0.108 | 0.2643 | 0.4619 | 0.6214 | -1.0 | 0.0667 | 0.6641 | 0.1005 | 0.6214 |
| No log | 2.0 | 80 | 1.2578 | 0.1838 | 0.3781 | 0.1844 | -1.0 | 0.0455 | 0.1962 | 0.3429 | 0.5548 | 0.6857 | -1.0 | 0.1333 | 0.7282 | 0.1838 | 0.6857 |
| No log | 3.0 | 120 | 1.1647 | 0.2632 | 0.4958 | 0.2776 | -1.0 | 0.059 | 0.2815 | 0.3857 | 0.5452 | 0.719 | -1.0 | 0.3 | 0.7513 | 0.2632 | 0.719 |
| No log | 4.0 | 160 | 1.1535 | 0.2156 | 0.4201 | 0.2113 | -1.0 | 0.1655 | 0.2295 | 0.3357 | 0.6024 | 0.7238 | -1.0 | 0.3667 | 0.7513 | 0.2156 | 0.7238 |
| No log | 5.0 | 200 | 1.1569 | 0.2712 | 0.5589 | 0.2293 | -1.0 | 0.2032 | 0.2886 | 0.3976 | 0.6 | 0.7095 | -1.0 | 0.2333 | 0.7462 | 0.2712 | 0.7095 |
| No log | 6.0 | 240 | 1.3297 | 0.3085 | 0.6026 | 0.2708 | -1.0 | 0.1344 | 0.324 | 0.4 | 0.581 | 0.669 | -1.0 | 0.2333 | 0.7026 | 0.3085 | 0.669 |
| No log | 7.0 | 280 | 1.0214 | 0.431 | 0.7435 | 0.483 | -1.0 | 0.2347 | 0.4499 | 0.4929 | 0.7 | 0.7929 | -1.0 | 0.7667 | 0.7949 | 0.431 | 0.7929 |
| No log | 8.0 | 320 | 0.8291 | 0.5124 | 0.7755 | 0.6232 | -1.0 | 0.4072 | 0.5266 | 0.5429 | 0.7905 | 0.8405 | -1.0 | 0.8 | 0.8436 | 0.5124 | 0.8405 |
| No log | 9.0 | 360 | 0.9139 | 0.5458 | 0.858 | 0.5725 | -1.0 | 0.1965 | 0.5704 | 0.5119 | 0.7476 | 0.8143 | -1.0 | 0.7667 | 0.8179 | 0.5458 | 0.8143 |
| No log | 10.0 | 400 | 0.8602 | 0.6176 | 0.8772 | 0.7345 | -1.0 | 0.4388 | 0.6333 | 0.5786 | 0.7786 | 0.8333 | -1.0 | 0.6667 | 0.8462 | 0.6176 | 0.8333 |
| No log | 11.0 | 440 | 0.8314 | 0.6233 | 0.8991 | 0.7409 | -1.0 | 0.4084 | 0.6434 | 0.5976 | 0.769 | 0.8286 | -1.0 | 0.7 | 0.8385 | 0.6233 | 0.8286 |
| No log | 12.0 | 480 | 0.7951 | 0.6299 | 0.907 | 0.7521 | -1.0 | 0.4821 | 0.6457 | 0.6381 | 0.7976 | 0.8429 | -1.0 | 0.7667 | 0.8487 | 0.6299 | 0.8429 |
| 1.0466 | 13.0 | 520 | 0.7849 | 0.6541 | 0.9256 | 0.8048 | -1.0 | 0.459 | 0.6738 | 0.6048 | 0.7929 | 0.8595 | -1.0 | 0.8667 | 0.859 | 0.6541 | 0.8595 |
| 1.0466 | 14.0 | 560 | 0.7600 | 0.6694 | 0.9497 | 0.7896 | -1.0 | 0.4651 | 0.688 | 0.669 | 0.7952 | 0.8524 | -1.0 | 0.8667 | 0.8513 | 0.6694 | 0.8524 |
| 1.0466 | 15.0 | 600 | 0.7345 | 0.6434 | 0.9296 | 0.7621 | -1.0 | 0.492 | 0.6578 | 0.6286 | 0.7738 | 0.8381 | -1.0 | 0.7333 | 0.8462 | 0.6434 | 0.8381 |
| 1.0466 | 16.0 | 640 | 0.7626 | 0.6485 | 0.9363 | 0.7964 | -1.0 | 0.4148 | 0.6752 | 0.6357 | 0.7929 | 0.8405 | -1.0 | 0.7 | 0.8513 | 0.6485 | 0.8405 |
| 1.0466 | 17.0 | 680 | 0.7340 | 0.6734 | 0.9444 | 0.7902 | -1.0 | 0.5253 | 0.691 | 0.6548 | 0.8024 | 0.8524 | -1.0 | 0.7667 | 0.859 | 0.6734 | 0.8524 |
| 1.0466 | 18.0 | 720 | 0.6968 | 0.6594 | 0.9536 | 0.7763 | -1.0 | 0.4151 | 0.6817 | 0.6429 | 0.8024 | 0.869 | -1.0 | 0.8 | 0.8744 | 0.6594 | 0.869 |
| 1.0466 | 19.0 | 760 | 0.7251 | 0.6514 | 0.9568 | 0.7993 | -1.0 | 0.3946 | 0.6734 | 0.65 | 0.8024 | 0.8595 | -1.0 | 0.7 | 0.8718 | 0.6514 | 0.8595 |
| 1.0466 | 20.0 | 800 | 0.6883 | 0.6755 | 0.9532 | 0.7893 | -1.0 | 0.4113 | 0.6979 | 0.6571 | 0.8143 | 0.8857 | -1.0 | 0.8 | 0.8923 | 0.6755 | 0.8857 |
| 1.0466 | 21.0 | 840 | 0.7113 | 0.6548 | 0.9522 | 0.7777 | -1.0 | 0.3957 | 0.6765 | 0.65 | 0.8024 | 0.869 | -1.0 | 0.7667 | 0.8769 | 0.6548 | 0.869 |
| 1.0466 | 22.0 | 880 | 0.7312 | 0.6527 | 0.9488 | 0.7759 | -1.0 | 0.3907 | 0.6758 | 0.6524 | 0.8024 | 0.8595 | -1.0 | 0.7667 | 0.8667 | 0.6527 | 0.8595 |
| 1.0466 | 23.0 | 920 | 0.6949 | 0.6559 | 0.9482 | 0.7656 | -1.0 | 0.4267 | 0.6767 | 0.6595 | 0.8119 | 0.881 | -1.0 | 0.8333 | 0.8846 | 0.6559 | 0.881 |
| 1.0466 | 24.0 | 960 | 0.7081 | 0.6479 | 0.9505 | 0.7725 | -1.0 | 0.3961 | 0.6695 | 0.6548 | 0.8048 | 0.869 | -1.0 | 0.8333 | 0.8718 | 0.6479 | 0.869 |
| 0.5924 | 25.0 | 1000 | 0.7108 | 0.6531 | 0.9518 | 0.77 | -1.0 | 0.4288 | 0.672 | 0.6571 | 0.8143 | 0.8738 | -1.0 | 0.8333 | 0.8769 | 0.6531 | 0.8738 |
| 0.5924 | 26.0 | 1040 | 0.7020 | 0.6576 | 0.9505 | 0.7702 | -1.0 | 0.4284 | 0.6776 | 0.6619 | 0.8119 | 0.869 | -1.0 | 0.8 | 0.8744 | 0.6576 | 0.869 |
| 0.5924 | 27.0 | 1080 | 0.7090 | 0.6596 | 0.9518 | 0.7689 | -1.0 | 0.3925 | 0.6816 | 0.6619 | 0.8119 | 0.8619 | -1.0 | 0.7667 | 0.8692 | 0.6596 | 0.8619 |
| 0.5924 | 28.0 | 1120 | 0.7043 | 0.6589 | 0.952 | 0.7702 | -1.0 | 0.3927 | 0.6807 | 0.6595 | 0.8167 | 0.8643 | -1.0 | 0.7667 | 0.8718 | 0.6589 | 0.8643 |
| 0.5924 | 29.0 | 1160 | 0.7049 | 0.6558 | 0.952 | 0.7704 | -1.0 | 0.3925 | 0.6776 | 0.6595 | 0.8143 | 0.8643 | -1.0 | 0.7667 | 0.8718 | 0.6558 | 0.8643 |
| 0.5924 | 30.0 | 1200 | 0.7046 | 0.658 | 0.952 | 0.7705 | -1.0 | 0.3924 | 0.6802 | 0.6619 | 0.8119 | 0.8643 | -1.0 | 0.7667 | 0.8718 | 0.658 | 0.8643 |
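Scanning the table, epoch 20 gives both the lowest validation loss (0.6883) and the highest mAP (0.6755), slightly ahead of the final epoch, so an earlier checkpoint may be preferable if one was saved. A small sketch of that selection over a few rows excerpted from the table (the `history` list is just transcribed data, not a training artifact):

```python
# (epoch, validation_loss, map) triples excerpted from the table above.
history = [
    (18, 0.6968, 0.6594),
    (20, 0.6883, 0.6755),
    (28, 0.7043, 0.6589),
    (30, 0.7046, 0.6580),
]

best_by_map = max(history, key=lambda row: row[2])
best_by_loss = min(history, key=lambda row: row[1])
print(best_by_map[0], best_by_loss[0])  # epoch 20 wins on both criteria
```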
### Framework versions
- Transformers 4.57.6
- Pytorch 2.9.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2
Model tree for magomerob/yolo_finetuned_raccoons:
- Base model: hustvl/yolos-tiny