rtdetr-tray-cart

This model is a fine-tuned version of PekingU/rtdetr_r101vd_coco_o365 on an unknown dataset, trained to detect two object classes: tray and cart. It achieves the following results on the evaluation set:

  • Loss: 8.0013
  • mAP: 0.3254
  • mAP@50: 0.5528
  • mAP@75: 0.3324
  • mAP (small): 0.6086
  • mAP (medium): 0.3016
  • mAP (large): 0.5975
  • mAR@1: 0.0588
  • mAR@10: 0.3009
  • mAR@100: 0.5992
  • mAR (small): 0.675
  • mAR (medium): 0.5598
  • mAR (large): 0.8468
  • mAP (tray): 0.2404
  • mAR@100 (tray): 0.505
  • mAP (cart): 0.4104
  • mAR@100 (cart): 0.6935
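
The card does not include a usage snippet, so here is a minimal inference sketch using the standard Transformers object-detection API. The repository id `nielsr/rtdetr-tray-cart` is taken from the card itself; the input image path and the 0.5 score threshold are assumptions.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "nielsr/rtdetr-tray-cart"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw model outputs to (score, label, xyxy box) triples,
# rescaled to the original image size; 0.5 threshold is an assumption.
results = processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    name = model.config.id2label[label.item()]  # expected: "tray" or "cart"
    print(f"{name}: {score.item():.2f} at {[round(c, 1) for c in box.tolist()]}")
```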

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • num_epochs: 50
  • mixed_precision_training: Native AMP
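
As a rough sketch, these settings map directly onto Transformers `TrainingArguments` fields. This is an assumed reconstruction from the list above, not the exact training script; `output_dir` is hypothetical.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr-tray-cart",   # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",       # fused AdamW; betas=(0.9, 0.999), eps=1e-08 are the defaults
    lr_scheduler_type="cosine",
    num_train_epochs=50,
    fp16=True,                       # Native AMP mixed precision
)
```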

Training results

| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP (tray) | mAR@100 (tray) | mAP (cart) | mAR@100 (cart) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 13 | 19.9337 | 0.002 | 0.0076 | 0.0004 | 0.0 | 0.0027 | 0.0154 | 0.0 | 0.0057 | 0.0623 | 0.0 | 0.0603 | 0.0895 | 0.0034 | 0.066 | 0.0006 | 0.0587 |
| No log | 2.0 | 26 | 22.1722 | 0.0027 | 0.0084 | 0.0012 | 0.0001 | 0.0052 | 0.0111 | 0.0 | 0.0057 | 0.0701 | 0.0139 | 0.0649 | 0.1323 | 0.0049 | 0.0685 | 0.0004 | 0.0717 |
| No log | 3.0 | 39 | 17.1149 | 0.0113 | 0.0298 | 0.0068 | 0.0591 | 0.0127 | 0.042 | 0.0008 | 0.0178 | 0.105 | 0.3056 | 0.0713 | 0.2169 | 0.0201 | 0.1273 | 0.0026 | 0.0826 |
| No log | 4.0 | 52 | 10.9760 | 0.1099 | 0.1888 | 0.1182 | 0.2553 | 0.1004 | 0.2333 | 0.0137 | 0.0908 | 0.2935 | 0.2722 | 0.2621 | 0.5581 | 0.2089 | 0.4088 | 0.0108 | 0.1783 |
| No log | 5.0 | 65 | 10.3979 | 0.1323 | 0.2611 | 0.1276 | 0.5042 | 0.1206 | 0.2902 | 0.0227 | 0.1687 | 0.376 | 0.5306 | 0.3309 | 0.6456 | 0.2317 | 0.4215 | 0.0328 | 0.3304 |
| No log | 6.0 | 78 | 10.0574 | 0.183 | 0.3502 | 0.1716 | 0.5487 | 0.1395 | 0.4027 | 0.0274 | 0.1741 | 0.4055 | 0.6528 | 0.3589 | 0.6032 | 0.2429 | 0.4392 | 0.123 | 0.3717 |
| No log | 7.0 | 91 | 9.9984 | 0.2236 | 0.4243 | 0.2019 | 0.5683 | 0.1733 | 0.4474 | 0.0493 | 0.2187 | 0.419 | 0.6167 | 0.3656 | 0.7052 | 0.2473 | 0.4336 | 0.2 | 0.4043 |
| No log | 8.0 | 104 | 9.6693 | 0.2453 | 0.4586 | 0.2238 | 0.6032 | 0.268 | 0.2727 | 0.0481 | 0.2637 | 0.4929 | 0.6444 | 0.4468 | 0.7573 | 0.1751 | 0.4401 | 0.3156 | 0.5457 |
| No log | 9.0 | 117 | 8.7372 | 0.3333 | 0.5783 | 0.3575 | 0.6196 | 0.3265 | 0.3584 | 0.0581 | 0.2821 | 0.5441 | 0.6639 | 0.5068 | 0.7573 | 0.2562 | 0.4731 | 0.4103 | 0.6152 |
| No log | 10.0 | 130 | 8.8186 | 0.2482 | 0.4278 | 0.2468 | 0.5817 | 0.2468 | 0.3644 | 0.0468 | 0.2827 | 0.558 | 0.6028 | 0.5278 | 0.7593 | 0.2212 | 0.4639 | 0.2753 | 0.6522 |
| No log | 11.0 | 143 | 8.5718 | 0.2821 | 0.501 | 0.2732 | 0.6277 | 0.2546 | 0.4392 | 0.057 | 0.2703 | 0.5479 | 0.6278 | 0.5104 | 0.7843 | 0.2289 | 0.4762 | 0.3353 | 0.6196 |
| No log | 12.0 | 156 | 8.7045 | 0.3346 | 0.5686 | 0.3482 | 0.6173 | 0.3073 | 0.394 | 0.0506 | 0.2959 | 0.5566 | 0.6472 | 0.5207 | 0.7589 | 0.2881 | 0.4806 | 0.3811 | 0.6326 |
| No log | 13.0 | 169 | 8.3602 | 0.3594 | 0.592 | 0.3995 | 0.6538 | 0.3268 | 0.4867 | 0.0641 | 0.2966 | 0.5952 | 0.675 | 0.5618 | 0.7895 | 0.3103 | 0.5013 | 0.4086 | 0.6891 |
| No log | 14.0 | 182 | 8.4438 | 0.3271 | 0.5673 | 0.3338 | 0.5406 | 0.3084 | 0.4213 | 0.0661 | 0.28 | 0.5717 | 0.6528 | 0.535 | 0.7927 | 0.2647 | 0.4848 | 0.3895 | 0.6587 |
| No log | 15.0 | 195 | 8.2520 | 0.335 | 0.5597 | 0.3426 | 0.6311 | 0.3032 | 0.5211 | 0.0562 | 0.3073 | 0.58 | 0.65 | 0.5397 | 0.8435 | 0.2978 | 0.4948 | 0.3722 | 0.6652 |
| No log | 16.0 | 208 | 8.1700 | 0.3065 | 0.5151 | 0.308 | 0.6639 | 0.2903 | 0.4197 | 0.0568 | 0.3025 | 0.5802 | 0.7167 | 0.5423 | 0.7815 | 0.2632 | 0.4973 | 0.3498 | 0.663 |
| No log | 17.0 | 221 | 8.0011 | 0.3557 | 0.5819 | 0.3839 | 0.6318 | 0.3133 | 0.5577 | 0.061 | 0.3042 | 0.5911 | 0.6333 | 0.5585 | 0.796 | 0.2978 | 0.5084 | 0.4136 | 0.6739 |
| No log | 18.0 | 234 | 7.9459 | 0.3566 | 0.5752 | 0.3869 | 0.6684 | 0.3313 | 0.5498 | 0.0578 | 0.3123 | 0.614 | 0.7139 | 0.5762 | 0.8351 | 0.2848 | 0.5171 | 0.4284 | 0.7109 |
| No log | 19.0 | 247 | 7.9807 | 0.3556 | 0.5988 | 0.3768 | 0.6516 | 0.323 | 0.5591 | 0.0581 | 0.3112 | 0.5848 | 0.6694 | 0.5468 | 0.8177 | 0.2875 | 0.5088 | 0.4237 | 0.6609 |
| No log | 20.0 | 260 | 8.2510 | 0.3793 | 0.6172 | 0.3966 | 0.648 | 0.3517 | 0.527 | 0.0651 | 0.3108 | 0.5999 | 0.7139 | 0.5596 | 0.8294 | 0.2839 | 0.4846 | 0.4746 | 0.7152 |
| No log | 21.0 | 273 | 8.0173 | 0.3769 | 0.6308 | 0.3901 | 0.6581 | 0.348 | 0.5505 | 0.0537 | 0.3094 | 0.5865 | 0.7028 | 0.5526 | 0.7629 | 0.2872 | 0.4992 | 0.4666 | 0.6739 |
| No log | 22.0 | 286 | 8.1768 | 0.3486 | 0.5852 | 0.3254 | 0.6257 | 0.3176 | 0.5186 | 0.0645 | 0.3031 | 0.589 | 0.6611 | 0.5523 | 0.8109 | 0.2765 | 0.491 | 0.4207 | 0.687 |
| No log | 23.0 | 299 | 8.1836 | 0.3417 | 0.5674 | 0.3576 | 0.6202 | 0.3257 | 0.5427 | 0.0599 | 0.3157 | 0.5813 | 0.6639 | 0.5458 | 0.7895 | 0.2469 | 0.4931 | 0.4365 | 0.6696 |
| No log | 24.0 | 312 | 8.1504 | 0.3545 | 0.5978 | 0.3692 | 0.6435 | 0.3284 | 0.5889 | 0.058 | 0.3149 | 0.5825 | 0.6639 | 0.5447 | 0.8093 | 0.2753 | 0.491 | 0.4338 | 0.6739 |
| No log | 25.0 | 325 | 7.8980 | 0.3769 | 0.6108 | 0.4214 | 0.6247 | 0.3486 | 0.5719 | 0.0639 | 0.3131 | 0.5976 | 0.6583 | 0.5631 | 0.8085 | 0.2873 | 0.5017 | 0.4666 | 0.6935 |
| No log | 26.0 | 338 | 7.9985 | 0.3495 | 0.5762 | 0.3213 | 0.6342 | 0.3189 | 0.5759 | 0.0515 | 0.3032 | 0.6023 | 0.6583 | 0.5678 | 0.8294 | 0.2508 | 0.5046 | 0.4483 | 0.7 |
| No log | 27.0 | 351 | 7.8948 | 0.3738 | 0.6191 | 0.4456 | 0.6332 | 0.3502 | 0.5408 | 0.0643 | 0.3083 | 0.5954 | 0.6389 | 0.5632 | 0.8 | 0.2775 | 0.5104 | 0.4701 | 0.6804 |
| No log | 28.0 | 364 | 7.9396 | 0.3658 | 0.6121 | 0.3443 | 0.5331 | 0.3397 | 0.5885 | 0.0645 | 0.3105 | 0.5972 | 0.6444 | 0.5625 | 0.821 | 0.2744 | 0.5052 | 0.4571 | 0.6891 |
| No log | 29.0 | 377 | 7.9732 | 0.3522 | 0.5863 | 0.3728 | 0.5942 | 0.3264 | 0.5782 | 0.0627 | 0.3028 | 0.5956 | 0.65 | 0.5609 | 0.8177 | 0.2709 | 0.5086 | 0.4335 | 0.6826 |
| No log | 30.0 | 390 | 8.1223 | 0.3404 | 0.5936 | 0.3081 | 0.6316 | 0.3191 | 0.5283 | 0.05 | 0.2941 | 0.5913 | 0.6861 | 0.5534 | 0.8153 | 0.2474 | 0.4935 | 0.4334 | 0.6891 |
| No log | 31.0 | 403 | 7.9385 | 0.3279 | 0.5698 | 0.2987 | 0.5828 | 0.3084 | 0.5489 | 0.0579 | 0.2968 | 0.5883 | 0.65 | 0.5511 | 0.8294 | 0.2384 | 0.4962 | 0.4174 | 0.6804 |
| No log | 32.0 | 416 | 7.9805 | 0.3328 | 0.5745 | 0.3031 | 0.5977 | 0.3111 | 0.5616 | 0.0594 | 0.2946 | 0.5881 | 0.6667 | 0.5528 | 0.796 | 0.24 | 0.5023 | 0.4257 | 0.6739 |
| No log | 33.0 | 429 | 7.9656 | 0.3372 | 0.5745 | 0.3294 | 0.5334 | 0.3199 | 0.5599 | 0.0503 | 0.3032 | 0.5959 | 0.6639 | 0.5629 | 0.7911 | 0.2371 | 0.5092 | 0.4374 | 0.6826 |
| No log | 34.0 | 442 | 7.7492 | 0.3418 | 0.5842 | 0.3344 | 0.6367 | 0.3147 | 0.606 | 0.059 | 0.3082 | 0.5934 | 0.6667 | 0.5576 | 0.8101 | 0.2523 | 0.5106 | 0.4313 | 0.6761 |
| No log | 35.0 | 455 | 7.9115 | 0.3336 | 0.5782 | 0.3039 | 0.6315 | 0.3075 | 0.6255 | 0.0594 | 0.3012 | 0.5928 | 0.6639 | 0.5537 | 0.8435 | 0.2453 | 0.5073 | 0.4219 | 0.6783 |
| No log | 36.0 | 468 | 7.8646 | 0.3414 | 0.5791 | 0.3556 | 0.6197 | 0.3159 | 0.6261 | 0.0547 | 0.301 | 0.5994 | 0.6639 | 0.5627 | 0.8327 | 0.2447 | 0.5119 | 0.4381 | 0.687 |
| No log | 37.0 | 481 | 7.9918 | 0.3504 | 0.5905 | 0.3812 | 0.6226 | 0.3246 | 0.6256 | 0.0507 | 0.2982 | 0.5922 | 0.65 | 0.5576 | 0.8093 | 0.2403 | 0.5104 | 0.4604 | 0.6739 |
| No log | 38.0 | 494 | 7.9649 | 0.3396 | 0.5715 | 0.3579 | 0.615 | 0.317 | 0.584 | 0.0589 | 0.3046 | 0.6018 | 0.6611 | 0.5628 | 0.8585 | 0.2443 | 0.5079 | 0.4348 | 0.6957 |
| 13.3406 | 39.0 | 507 | 7.9482 | 0.328 | 0.5557 | 0.3271 | 0.5964 | 0.3056 | 0.5675 | 0.059 | 0.2983 | 0.5949 | 0.6444 | 0.5575 | 0.8444 | 0.2501 | 0.5115 | 0.4058 | 0.6783 |
| 13.3406 | 40.0 | 520 | 7.9461 | 0.3255 | 0.5618 | 0.3063 | 0.5913 | 0.305 | 0.5877 | 0.058 | 0.299 | 0.5888 | 0.6583 | 0.5516 | 0.8202 | 0.2427 | 0.5038 | 0.4084 | 0.6739 |
| 13.3406 | 41.0 | 533 | 7.9860 | 0.3074 | 0.5349 | 0.2944 | 0.6136 | 0.2866 | 0.5951 | 0.0507 | 0.2954 | 0.5946 | 0.6611 | 0.5582 | 0.821 | 0.2397 | 0.5067 | 0.3751 | 0.6826 |
| 13.3406 | 42.0 | 546 | 7.9835 | 0.3154 | 0.5395 | 0.3094 | 0.6094 | 0.2927 | 0.6031 | 0.0584 | 0.2971 | 0.5951 | 0.6583 | 0.559 | 0.8202 | 0.2425 | 0.5075 | 0.3883 | 0.6826 |
| 13.3406 | 43.0 | 559 | 8.0506 | 0.3225 | 0.5535 | 0.3255 | 0.5259 | 0.3027 | 0.6104 | 0.0584 | 0.296 | 0.5891 | 0.6444 | 0.5528 | 0.8218 | 0.2359 | 0.5021 | 0.409 | 0.6761 |
| 13.3406 | 44.0 | 572 | 8.0112 | 0.3321 | 0.5729 | 0.3363 | 0.6113 | 0.3089 | 0.5915 | 0.0516 | 0.2966 | 0.5892 | 0.6611 | 0.552 | 0.8194 | 0.2409 | 0.5023 | 0.4233 | 0.6761 |
| 13.3406 | 45.0 | 585 | 7.9716 | 0.3259 | 0.5551 | 0.3343 | 0.6304 | 0.3031 | 0.6173 | 0.0592 | 0.2969 | 0.5952 | 0.6583 | 0.559 | 0.821 | 0.2432 | 0.5056 | 0.4086 | 0.6848 |
| 13.3406 | 46.0 | 598 | 7.9923 | 0.3291 | 0.5642 | 0.3335 | 0.6106 | 0.3043 | 0.6146 | 0.0589 | 0.2943 | 0.5954 | 0.6806 | 0.5563 | 0.8343 | 0.2424 | 0.5061 | 0.4158 | 0.6848 |
| 13.3406 | 47.0 | 611 | 8.0093 | 0.3266 | 0.5565 | 0.3375 | 0.6017 | 0.3035 | 0.6104 | 0.0512 | 0.2999 | 0.5948 | 0.6583 | 0.5561 | 0.8452 | 0.2408 | 0.5048 | 0.4124 | 0.6848 |
| 13.3406 | 48.0 | 624 | 7.9778 | 0.3309 | 0.5626 | 0.3342 | 0.603 | 0.3062 | 0.6507 | 0.0588 | 0.2981 | 0.5993 | 0.675 | 0.5602 | 0.8444 | 0.2396 | 0.5029 | 0.4222 | 0.6957 |
| 13.3406 | 49.0 | 637 | 7.9963 | 0.3285 | 0.5612 | 0.3368 | 0.6042 | 0.3051 | 0.6123 | 0.0589 | 0.3028 | 0.5981 | 0.6583 | 0.5598 | 0.8468 | 0.2432 | 0.5071 | 0.4137 | 0.6891 |
| 13.3406 | 50.0 | 650 | 8.0013 | 0.3254 | 0.5528 | 0.3324 | 0.6086 | 0.3016 | 0.5975 | 0.0588 | 0.3009 | 0.5992 | 0.675 | 0.5598 | 0.8468 | 0.2404 | 0.505 | 0.4104 | 0.6935 |
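
The card does not state how these metrics were computed, but the names match the COCO-style mAP/mAR values produced by torchmetrics' MeanAveragePrecision with per-class metrics enabled. A minimal sketch, assuming torchmetrics (with pycocotools) is available; all boxes and labels below are made up for illustration:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True yields the per-class values reported above
# (e.g. mAP (tray), mAR@100 (cart)); labels 0/1 stand in for tray/cart here.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 120.0, 200.0]]),  # toy prediction
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 118.0, 205.0]]),   # toy ground truth
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["mar_100"])
```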

Framework versions

  • Transformers 5.2.0
  • PyTorch 2.10.0+cu128
  • Datasets 4.6.1
  • Tokenizers 0.22.2