segformer-b5-finetuned-apple-dms-run3

This model is a fine-tuned version of nvidia/segformer-b5-finetuned-ade-640-640 on the AllanK24/apple-dms-materials dataset. It achieves the following results on the evaluation set:

  • Loss: 11.7839
  • Mean Iou: 0.4203
  • Mean Accuracy: 0.4907
  • Overall Accuracy: 0.7320
  • Iou No Label: 0.0
  • Accuracy Animal Skin: 0.5754
  • Iou Animal Skin: 0.3556
  • Accuracy Bone Teeth Horn: 0.0028
  • Iou Bone Teeth Horn: 0.0028
  • Accuracy Brickwork: 0.6283
  • Iou Brickwork: 0.5529
  • Accuracy Cardboard: 0.5400
  • Iou Cardboard: 0.4636
  • Accuracy Carpet Rug: 0.7694
  • Iou Carpet Rug: 0.7043
  • Accuracy Ceiling Tile: 0.7873
  • Iou Ceiling Tile: 0.7258
  • Accuracy Ceramic: 0.7227
  • Iou Ceramic: 0.6279
  • Accuracy Chalkboard Blackboard: 0.7058
  • Iou Chalkboard Blackboard: 0.6366
  • Accuracy Clutter: 0.0031
  • Iou Clutter: 0.0028
  • Accuracy Concrete: 0.4385
  • Iou Concrete: 0.3509
  • Accuracy Cork Corkboard: 0.0203
  • Iou Cork Corkboard: 0.0191
  • Accuracy Engineered Stone: 0.0238
  • Iou Engineered Stone: 0.0213
  • Accuracy Fabric Cloth: 0.8462
  • Iou Fabric Cloth: 0.7791
  • Accuracy Fiberglass Wool: 0.0
  • Iou Fiberglass Wool: 0.0
  • Accuracy Fire: 0.3918
  • Iou Fire: 0.3545
  • Accuracy Foliage: 0.9080
  • Iou Foliage: 0.8384
  • Accuracy Food: 0.8727
  • Iou Food: 0.7758
  • Accuracy Fur: 0.8913
  • Iou Fur: 0.8047
  • Accuracy Gemstone Quartz: 0.2157
  • Iou Gemstone Quartz: 0.1761
  • Accuracy Glass: 0.6599
  • Iou Glass: 0.5828
  • Accuracy Hair: 0.8424
  • Iou Hair: 0.7460
  • Accuracy Ice: 0.3504
  • Iou Ice: 0.2609
  • Accuracy Leather: 0.3762
  • Iou Leather: 0.3338
  • Accuracy Liquid Non-water: 0.2289
  • Iou Liquid Non-water: 0.2157
  • Accuracy Metal: 0.3665
  • Iou Metal: 0.3088
  • Accuracy Mirror: 0.4631
  • Iou Mirror: 0.4021
  • Accuracy Paint Plaster Enamel: 0.7616
  • Iou Paint Plaster Enamel: 0.6999
  • Accuracy Paper: 0.6700
  • Iou Paper: 0.5762
  • Accuracy Pearl: 0.0
  • Iou Pearl: 0.0
  • Accuracy Photograph Painting: 0.3697
  • Iou Photograph Painting: 0.3016
  • Accuracy Plastic Clear: 0.3195
  • Iou Plastic Clear: 0.2717
  • Accuracy Plastic Non-clear: 0.3865
  • Iou Plastic Non-clear: 0.3220
  • Accuracy Rubber Latex: 0.3122
  • Iou Rubber Latex: 0.2716
  • Accuracy Sand: 0.5707
  • Iou Sand: 0.4479
  • Accuracy Skin Lips: 0.8519
  • Iou Skin Lips: 0.7560
  • Accuracy Sky: 0.9708
  • Iou Sky: 0.9418
  • Accuracy Snow: 0.7146
  • Iou Snow: 0.6191
  • Accuracy Soap: 0.0
  • Iou Soap: 0.0
  • Accuracy Soil Mud: 0.5586
  • Iou Soil Mud: 0.4173
  • Accuracy Sponge: 0.0
  • Iou Sponge: 0.0
  • Accuracy Stone Natural: 0.6606
  • Iou Stone Natural: 0.5339
  • Accuracy Stone Polished: 0.2223
  • Iou Stone Polished: 0.1965
  • Accuracy Styrofoam: 0.0
  • Iou Styrofoam: 0.0
  • Accuracy Tile: 0.6934
  • Iou Tile: 0.6312
  • Accuracy Wallpaper: 0.5224
  • Iou Wallpaper: 0.4148
  • Accuracy Water: 0.9011
  • Iou Water: 0.8268
  • Accuracy Wax: 0.3265
  • Iou Wax: 0.3251
  • Accuracy Whiteboard: 0.7606
  • Iou Whiteboard: 0.6864
  • Accuracy Wicker: 0.4968
  • Iou Wicker: 0.4467
  • Accuracy Wood: 0.7456
  • Iou Wood: 0.6849
  • Accuracy Wood Tree: 0.4807
  • Iou Wood Tree: 0.3870
  • Accuracy Asphalt: 0.5909
  • Iou Asphalt: 0.4742
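The per-class figures above are the standard semantic-segmentation metrics: per-class accuracy is the recall of each class, and IoU is intersection-over-union against the ground-truth masks. A minimal NumPy sketch of how such numbers are typically computed from a confusion matrix (illustrative only, not the card's actual evaluation code):

```python
import numpy as np

def per_class_metrics(pred, gt, num_classes):
    """Per-class accuracy (recall) and IoU from integer label maps.

    pred, gt: arrays of the same shape with values in [0, num_classes).
    """
    # Confusion matrix: rows = ground truth, columns = prediction.
    cm = np.bincount(
        gt.ravel() * num_classes + pred.ravel(),
        minlength=num_classes * num_classes,
    ).reshape(num_classes, num_classes)
    tp = np.diag(cm).astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        acc = tp / cm.sum(axis=1)                          # "Accuracy <class>"
        iou = tp / (cm.sum(axis=1) + cm.sum(axis=0) - tp)  # "Iou <class>"
    return acc, iou

# Tiny worked example (2 classes, 2x2 masks).
pred = np.array([[0, 1], [1, 1]])
gt = np.array([[0, 1], [0, 1]])
acc, iou = per_class_metrics(pred, gt, num_classes=2)
mean_iou = np.nanmean(iou)          # "Mean Iou": average over present classes
overall_acc = (pred == gt).mean()   # "Overall Accuracy": pixel accuracy
```

Classes absent from both prediction and ground truth come out as NaN and are skipped by `np.nanmean`, which is why rare classes such as Soap or Sponge can sit at exactly 0.0 while still dragging down the mean.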

Model description

More information needed

Intended uses & limitations

More information needed
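Until this section is filled in, a minimal inference sketch may help. It assumes the checkpoint follows the standard transformers semantic-segmentation API (this is an illustration, not the authors' own code); the input filename is hypothetical:

```python
import numpy as np

def logits_to_mask(logits, size):
    """Argmax over the class axis, then nearest-neighbor resize to `size`.

    logits: float array of shape (num_labels, h, w); size: (H, W).
    SegFormer emits logits at 1/4 of the input resolution, so the label
    map usually needs upsampling back to the original image size.
    """
    mask = logits.argmax(axis=0)       # (h, w) label map
    h, w = mask.shape
    H, W = size
    rows = np.arange(H) * h // H       # nearest-neighbor row indices
    cols = np.arange(W) * w // W
    return mask[np.ix_(rows, cols)]    # (H, W)

def run_example():
    # Requires torch, transformers, Pillow, and network access.
    from PIL import Image
    import torch
    from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

    repo = "AllanK24/segformer-b5-finetuned-apple-dms-run3"
    processor = AutoImageProcessor.from_pretrained(repo)
    model = SegformerForSemanticSegmentation.from_pretrained(repo)

    image = Image.open("example.jpg")  # hypothetical input image
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0]  # (num_labels, h/4, w/4)
    return logits_to_mask(logits.numpy(), size=image.size[::-1])
```

For production use, `torch.nn.functional.interpolate` on the logits before the argmax is the more common upsampling route; the pure-NumPy helper above just keeps the sketch dependency-light.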

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 8
  • total_train_batch_size: 256
  • total_eval_batch_size: 128
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 0.2
  • num_epochs: 50
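The hyperparameters above can be approximated as a transformers `TrainingArguments` config. This is a sketch, not the authors' training script; in particular, the listed `lr_scheduler_warmup_steps: 0.2` is a fractional value, which reads like a warmup ratio, so it is mapped to `warmup_ratio` here as an assumption:

```python
from transformers import TrainingArguments

# Rough reconstruction of the card's listed hyperparameters (sketch only).
args = TrainingArguments(
    output_dir="segformer-b5-finetuned-apple-dms-run3",
    learning_rate=1e-4,
    per_device_train_batch_size=32,  # x 8 GPUs -> total train batch size 256
    per_device_eval_batch_size=16,   # x 8 GPUs -> total eval batch size 128
    seed=42,
    optim="adamw_torch_fused",       # AdamW, betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.2,                # assumption: the card's 0.2 is a ratio
    num_train_epochs=50,
)
```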

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Iou No Label | Accuracy Animal Skin | Iou Animal Skin | Accuracy Bone Teeth Horn | Iou Bone Teeth Horn | Accuracy Brickwork | Iou Brickwork | Accuracy Cardboard | Iou Cardboard | Accuracy Carpet Rug | Iou Carpet Rug | Accuracy Ceiling Tile | Iou Ceiling Tile | Accuracy Ceramic | Iou Ceramic | Accuracy Chalkboard Blackboard | Iou Chalkboard Blackboard | Accuracy Clutter | Iou Clutter | Accuracy Concrete | Iou Concrete | Accuracy Cork Corkboard | Iou Cork Corkboard | Accuracy Engineered Stone | Iou Engineered Stone | Accuracy Fabric Cloth | Iou Fabric Cloth | Accuracy Fiberglass Wool | Iou Fiberglass Wool | Accuracy Fire | Iou Fire | Accuracy Foliage | Iou Foliage | Accuracy Food | Iou Food | Accuracy Fur | Iou Fur | Accuracy Gemstone Quartz | Iou Gemstone Quartz | Accuracy Glass | Iou Glass | Accuracy Hair | Iou Hair | Accuracy Ice | Iou Ice | Accuracy Leather | Iou Leather | Accuracy Liquid Non-water | Iou Liquid Non-water | Accuracy Metal | Iou Metal | Accuracy Mirror | Iou Mirror | Accuracy Paint Plaster Enamel | Iou Paint Plaster Enamel | Accuracy Paper | Iou Paper | Accuracy Pearl | Iou Pearl | Accuracy Photograph Painting | Iou Photograph Painting | Accuracy Plastic Clear | Iou Plastic Clear | Accuracy Plastic Non-clear | Iou Plastic Non-clear | Accuracy Rubber Latex | Iou Rubber Latex | Accuracy Sand | Iou Sand | Accuracy Skin Lips | Iou Skin Lips | Accuracy Sky | Iou Sky | Accuracy Snow | Iou Snow | Accuracy Soap | Iou Soap | Accuracy Soil Mud | Iou Soil Mud | Accuracy Sponge | Iou Sponge | Accuracy Stone Natural | Iou Stone Natural | Accuracy Stone Polished | Iou Stone Polished | Accuracy Styrofoam | Iou Styrofoam | Accuracy Tile | Iou Tile | Accuracy Wallpaper | Iou Wallpaper | Accuracy Water | Iou Water | Accuracy Wax | Iou Wax | Accuracy Whiteboard | Iou Whiteboard | Accuracy Wicker | Iou Wicker | Accuracy Wood | Iou Wood | Accuracy Wood Tree | Iou Wood Tree | Accuracy Asphalt | Iou Asphalt |
| 23.2591 | 4.5455 | 400 | 13.8395 | 0.1470 | 0.1862 | 0.6536 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7884 | 0.6442 | 0.6791 | 0.6385 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8633 | 0.6991 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9553 | 0.5771 | 0.2769 | 0.2726 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5524 | 0.4723 | 0.1713 | 0.1690 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8369 | 0.6781 | 0.5544 | 0.4156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0134 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6679 | 0.4211 | 0.9623 | 0.9005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0208 | 0.0199 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6591 | 0.5641 | 0.0 | 0.0 | 0.8817 | 0.6566 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7994 | 0.6479 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.1725 | 9.0909 | 800 | 8.9969 | 0.2906 | 0.3404 | 0.7036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5746 | 0.4968 | 0.0877 | 0.0874 | 0.8142 | 0.7217 | 0.8229 | 0.7461 | 0.5568 | 0.5112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3627 | 0.3073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8609 | 0.7657 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9110 | 0.8009 | 0.8586 | 0.7565 | 0.8423 | 0.7348 | 0.0 | 0.0 | 0.6727 | 0.5680 | 0.7639 | 0.6660 | 0.0 | 0.0 | 0.4000 | 0.3719 | 0.0 | 0.0 | 0.2393 | 0.2182 | 0.4064 | 0.3614 | 0.7567 | 0.6891 | 0.5823 | 0.5151 | 0.0 | 0.0 | 0.1694 | 0.1603 | 0.0 | 0.0 | 0.3700 | 0.3025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7966 | 0.6699 | 0.9678 | 0.9209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5291 | 0.3117 | 0.0 | 0.0 | 0.6160 | 0.4794 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6652 | 0.6036 | 0.4365 | 0.3794 | 0.8898 | 0.7867 | 0.0 | 0.0 | 0.6625 | 0.4786 | 0.0 | 0.0 | 0.7186 | 0.6605 | 0.0 | 0.0 | 0.3648 | 0.3294 |
| 5.6198 | 13.6364 | 1200 | 9.1325 | 0.3624 | 0.4234 | 0.7157 | 0.0 | 0.4067 | 0.2758 | 0.0 | 0.0 | 0.6107 | 0.5291 | 0.4344 | 0.3906 | 0.6586 | 0.6181 | 0.8505 | 0.7557 | 0.6837 | 0.5871 | 0.7123 | 0.6449 | 0.0 | 0.0 | 0.4172 | 0.3205 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8222 | 0.7590 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9124 | 0.8205 | 0.8812 | 0.7713 | 0.8405 | 0.7616 | 0.0 | 0.0 | 0.6625 | 0.5726 | 0.8104 | 0.7068 | 0.0 | 0.0 | 0.2324 | 0.2197 | 0.0 | 0.0 | 0.3667 | 0.3067 | 0.4322 | 0.3839 | 0.7636 | 0.6965 | 0.6618 | 0.5549 | 0.0 | 0.0 | 0.3177 | 0.2574 | 0.2445 | 0.2072 | 0.3408 | 0.2888 | 0.0128 | 0.0128 | 0.3770 | 0.3158 | 0.8249 | 0.7035 | 0.9648 | 0.9258 | 0.6217 | 0.5548 | 0.0 | 0.0 | 0.4869 | 0.3403 | 0.0 | 0.0 | 0.6749 | 0.5090 | 0.2337 | 0.2010 | 0.0 | 0.0 | 0.6966 | 0.6221 | 0.5075 | 0.4223 | 0.8979 | 0.8136 | 0.0 | 0.0 | 0.6738 | 0.6149 | 0.4526 | 0.4111 | 0.7180 | 0.6636 | 0.3129 | 0.2699 | 0.4980 | 0.3955 |
| 4.2141 | 18.1818 | 1600 | 9.6546 | 0.3807 | 0.4547 | 0.7338 | 0.0 | 0.5327 | 0.3111 | 0.0 | 0.0 | 0.6684 | 0.5487 | 0.5404 | 0.4648 | 0.7323 | 0.6697 | 0.7930 | 0.7259 | 0.6798 | 0.5899 | 0.7361 | 0.6591 | 0.0 | 0.0 | 0.4286 | 0.3427 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8493 | 0.7728 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9188 | 0.8270 | 0.8819 | 0.7639 | 0.8914 | 0.7938 | 0.0 | 0.0 | 0.6376 | 0.5625 | 0.8344 | 0.7277 | 0.0012 | 0.0012 | 0.3937 | 0.3468 | 0.0 | 0.0 | 0.3476 | 0.2984 | 0.5061 | 0.3964 | 0.7746 | 0.7034 | 0.6598 | 0.5531 | 0.0 | 0.0 | 0.3919 | 0.3055 | 0.2772 | 0.2286 | 0.3813 | 0.3160 | 0.2416 | 0.2118 | 0.4754 | 0.3893 | 0.8287 | 0.7256 | 0.9666 | 0.9260 | 0.7738 | 0.5683 | 0.0 | 0.0 | 0.5030 | 0.3712 | 0.0 | 0.0 | 0.6483 | 0.5063 | 0.2448 | 0.2039 | 0.0 | 0.0 | 0.7022 | 0.6262 | 0.5501 | 0.4202 | 0.8853 | 0.8122 | 0.0 | 0.0 | 0.7265 | 0.6311 | 0.4756 | 0.4223 | 0.7654 | 0.6859 | 0.4262 | 0.3134 | 0.5736 | 0.4572 |
| 3.5156 | 22.7273 | 2000 | 10.1557 | 0.3948 | 0.4683 | 0.7236 | 0.0 | 0.6345 | 0.3247 | 0.0 | 0.0 | 0.6522 | 0.5595 | 0.5211 | 0.4645 | 0.7087 | 0.6577 | 0.7127 | 0.6737 | 0.7282 | 0.6211 | 0.6796 | 0.6168 | 0.0 | 0.0 | 0.4782 | 0.3628 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8433 | 0.7735 | 0.0 | 0.0 | 0.3067 | 0.2605 | 0.9130 | 0.8356 | 0.8713 | 0.7633 | 0.8657 | 0.7822 | 0.0 | 0.0 | 0.6961 | 0.5917 | 0.8403 | 0.7350 | 0.3328 | 0.2865 | 0.5103 | 0.4189 | 0.0 | 0.0 | 0.3771 | 0.3129 | 0.4079 | 0.3679 | 0.7586 | 0.6939 | 0.6052 | 0.5408 | 0.0 | 0.0 | 0.3144 | 0.2552 | 0.2957 | 0.2502 | 0.3759 | 0.3118 | 0.2715 | 0.2310 | 0.6055 | 0.4525 | 0.8323 | 0.7353 | 0.9642 | 0.9363 | 0.7553 | 0.6282 | 0.0 | 0.0 | 0.5383 | 0.3959 | 0.0 | 0.0 | 0.6861 | 0.5398 | 0.2991 | 0.2084 | 0.0 | 0.0 | 0.6631 | 0.6003 | 0.5337 | 0.4122 | 0.8987 | 0.8216 | 0.0 | 0.0 | 0.6954 | 0.6182 | 0.4685 | 0.4142 | 0.6995 | 0.6522 | 0.4855 | 0.3605 | 0.5259 | 0.4586 |
| 3.0893 | 27.2727 | 2400 | 10.5690 | 0.3958 | 0.4659 | 0.7267 | 0.0 | 0.6126 | 0.3430 | 0.0 | 0.0 | 0.6235 | 0.5588 | 0.5181 | 0.4529 | 0.7599 | 0.6980 | 0.6838 | 0.6419 | 0.7167 | 0.6208 | 0.7022 | 0.6319 | 0.0 | 0.0 | 0.4359 | 0.3399 | 0.0 | 0.0 | 0.0045 | 0.0043 | 0.8301 | 0.7675 | 0.0 | 0.0 | 0.3497 | 0.2765 | 0.9015 | 0.8328 | 0.8690 | 0.7721 | 0.8676 | 0.7842 | 0.0168 | 0.0165 | 0.6729 | 0.5880 | 0.8457 | 0.7426 | 0.3620 | 0.2570 | 0.4232 | 0.3742 | 0.0 | 0.0 | 0.3515 | 0.2973 | 0.4294 | 0.3835 | 0.7767 | 0.7065 | 0.6667 | 0.5710 | 0.0 | 0.0 | 0.3710 | 0.2935 | 0.3306 | 0.2682 | 0.3258 | 0.2817 | 0.2801 | 0.2522 | 0.5744 | 0.4381 | 0.8406 | 0.7441 | 0.9755 | 0.9352 | 0.7163 | 0.5920 | 0.0 | 0.0 | 0.5348 | 0.4076 | 0.0 | 0.0 | 0.6487 | 0.5306 | 0.2212 | 0.1966 | 0.0 | 0.0 | 0.6759 | 0.6141 | 0.5072 | 0.4124 | 0.8816 | 0.8108 | 0.0 | 0.0 | 0.7437 | 0.6559 | 0.4440 | 0.4076 | 0.7220 | 0.6678 | 0.4594 | 0.3545 | 0.5561 | 0.4552 |
| 2.7957 | 31.8182 | 2800 | 11.0041 | 0.3994 | 0.4689 | 0.7228 | 0.0 | 0.6209 | 0.3772 | 0.0 | 0.0 | 0.6268 | 0.5629 | 0.4829 | 0.4400 | 0.7616 | 0.6984 | 0.8295 | 0.7525 | 0.7106 | 0.6199 | 0.6899 | 0.6288 | 0.0 | 0.0 | 0.4257 | 0.3365 | 0.0 | 0.0 | 0.0123 | 0.0110 | 0.8358 | 0.7708 | 0.0 | 0.0 | 0.3749 | 0.3116 | 0.9164 | 0.8389 | 0.8562 | 0.7649 | 0.8689 | 0.7808 | 0.0682 | 0.0512 | 0.6356 | 0.5654 | 0.8377 | 0.7455 | 0.3943 | 0.2643 | 0.3618 | 0.3256 | 0.0000 | 0.0000 | 0.3654 | 0.3078 | 0.4544 | 0.3935 | 0.7504 | 0.6909 | 0.6411 | 0.5556 | 0.0 | 0.0 | 0.3419 | 0.2757 | 0.3435 | 0.2769 | 0.3968 | 0.3235 | 0.3186 | 0.2756 | 0.5221 | 0.4145 | 0.8541 | 0.7497 | 0.9672 | 0.9390 | 0.7496 | 0.6174 | 0.0 | 0.0 | 0.5484 | 0.4113 | 0.0 | 0.0 | 0.6662 | 0.5300 | 0.2195 | 0.1908 | 0.0 | 0.0 | 0.6858 | 0.6244 | 0.4828 | 0.3995 | 0.8847 | 0.8189 | 0.0 | 0.0 | 0.7189 | 0.6406 | 0.4404 | 0.3973 | 0.7410 | 0.6780 | 0.4740 | 0.3658 | 0.5034 | 0.4458 |
| 2.5825 | 36.3636 | 3200 | 11.2353 | 0.4077 | 0.4785 | 0.7315 | 0.0 | 0.5945 | 0.3841 | 0.0 | 0.0 | 0.6424 | 0.5588 | 0.5305 | 0.4550 | 0.7683 | 0.6996 | 0.8011 | 0.7344 | 0.7088 | 0.6199 | 0.7103 | 0.6457 | 0.0 | 0.0 | 0.4373 | 0.3468 | 0.0 | 0.0 | 0.0434 | 0.0340 | 0.8433 | 0.7757 | 0.0 | 0.0 | 0.3865 | 0.3507 | 0.9046 | 0.8369 | 0.8771 | 0.7722 | 0.8795 | 0.7884 | 0.0503 | 0.0299 | 0.6320 | 0.5653 | 0.8392 | 0.7483 | 0.3621 | 0.2783 | 0.4448 | 0.3863 | 0.0970 | 0.0957 | 0.3473 | 0.2979 | 0.4391 | 0.3873 | 0.7739 | 0.7057 | 0.6514 | 0.5623 | 0.0 | 0.0 | 0.3540 | 0.2948 | 0.2827 | 0.2454 | 0.3545 | 0.3015 | 0.3201 | 0.2655 | 0.5504 | 0.4626 | 0.8503 | 0.7523 | 0.9710 | 0.9405 | 0.7998 | 0.6609 | 0.0 | 0.0 | 0.5126 | 0.3917 | 0.0 | 0.0 | 0.6944 | 0.5310 | 0.2704 | 0.2230 | 0.0 | 0.0 | 0.6933 | 0.6269 | 0.5628 | 0.4228 | 0.9002 | 0.8290 | 0.0 | 0.0 | 0.7569 | 0.6757 | 0.4504 | 0.4096 | 0.7410 | 0.6817 | 0.4752 | 0.3809 | 0.5766 | 0.4536 |
| 2.4282 | 40.9091 | 3600 | 11.5373 | 0.4132 | 0.4845 | 0.7272 | 0.0 | 0.5829 | 0.3698 | 0.0 | 0.0 | 0.6306 | 0.5552 | 0.5290 | 0.4568 | 0.7755 | 0.7076 | 0.7498 | 0.7086 | 0.7273 | 0.6309 | 0.7192 | 0.6447 | 0.0001 | 0.0001 | 0.4090 | 0.3263 | 0.0447 | 0.0423 | 0.0444 | 0.0377 | 0.8414 | 0.7785 | 0.0 | 0.0 | 0.4448 | 0.3992 | 0.9062 | 0.8385 | 0.8763 | 0.7731 | 0.8858 | 0.8058 | 0.1693 | 0.1139 | 0.6577 | 0.5815 | 0.8423 | 0.7411 | 0.3545 | 0.2498 | 0.4091 | 0.3627 | 0.2365 | 0.2140 | 0.4018 | 0.3260 | 0.4800 | 0.4102 | 0.7566 | 0.6961 | 0.6409 | 0.5622 | 0.0 | 0.0 | 0.3895 | 0.3156 | 0.3329 | 0.2735 | 0.3817 | 0.3187 | 0.3173 | 0.2736 | 0.5735 | 0.4458 | 0.8479 | 0.7531 | 0.9708 | 0.9382 | 0.7379 | 0.6305 | 0.0 | 0.0 | 0.5262 | 0.4022 | 0.0 | 0.0 | 0.6644 | 0.5274 | 0.2263 | 0.1998 | 0.0 | 0.0 | 0.6694 | 0.6145 | 0.4785 | 0.3923 | 0.9003 | 0.8277 | 0.0056 | 0.0056 | 0.7635 | 0.6862 | 0.4754 | 0.4292 | 0.7357 | 0.6771 | 0.4783 | 0.3823 | 0.6010 | 0.4719 |
| 2.3205 | 45.4545 | 4000 | 11.7263 | 0.4159 | 0.4866 | 0.7328 | 0.0 | 0.5663 | 0.3607 | 0.0018 | 0.0018 | 0.6217 | 0.5483 | 0.5611 | 0.4707 | 0.7732 | 0.7045 | 0.8032 | 0.7357 | 0.7234 | 0.6292 | 0.7014 | 0.6347 | 0.0039 | 0.0036 | 0.4299 | 0.3461 | 0.0157 | 0.0149 | 0.0177 | 0.0158 | 0.8511 | 0.7809 | 0.0 | 0.0 | 0.3897 | 0.3384 | 0.9069 | 0.8385 | 0.8682 | 0.7738 | 0.8900 | 0.8054 | 0.1158 | 0.0913 | 0.6518 | 0.5789 | 0.8454 | 0.7460 | 0.3505 | 0.2713 | 0.3802 | 0.3371 | 0.2234 | 0.2104 | 0.3517 | 0.3007 | 0.4591 | 0.3993 | 0.7630 | 0.7004 | 0.6742 | 0.5780 | 0.0 | 0.0 | 0.3805 | 0.3088 | 0.3165 | 0.2702 | 0.3783 | 0.3165 | 0.3215 | 0.2760 | 0.5748 | 0.4497 | 0.8513 | 0.7557 | 0.9698 | 0.9429 | 0.7618 | 0.6583 | 0.0 | 0.0 | 0.5656 | 0.4170 | 0.0 | 0.0 | 0.6628 | 0.5355 | 0.2233 | 0.1971 | 0.0 | 0.0 | 0.6894 | 0.6289 | 0.5044 | 0.4070 | 0.8995 | 0.8282 | 0.1616 | 0.1615 | 0.7561 | 0.6793 | 0.5033 | 0.4504 | 0.7543 | 0.6892 | 0.4970 | 0.3871 | 0.5889 | 0.4651 |
| 2.2572 | 50.0 | 4400 | 11.7839 | 0.4203 | 0.4907 | 0.7320 | 0.0 | 0.5754 | 0.3556 | 0.0028 | 0.0028 | 0.6283 | 0.5529 | 0.5400 | 0.4636 | 0.7694 | 0.7043 | 0.7873 | 0.7258 | 0.7227 | 0.6279 | 0.7058 | 0.6366 | 0.0031 | 0.0028 | 0.4385 | 0.3509 | 0.0203 | 0.0191 | 0.0238 | 0.0213 | 0.8462 | 0.7791 | 0.0 | 0.0 | 0.3918 | 0.3545 | 0.9080 | 0.8384 | 0.8727 | 0.7758 | 0.8913 | 0.8047 | 0.2157 | 0.1761 | 0.6599 | 0.5828 | 0.8424 | 0.7460 | 0.3504 | 0.2609 | 0.3762 | 0.3338 | 0.2289 | 0.2157 | 0.3665 | 0.3088 | 0.4631 | 0.4021 | 0.7616 | 0.6999 | 0.6700 | 0.5762 | 0.0 | 0.0 | 0.3697 | 0.3016 | 0.3195 | 0.2717 | 0.3865 | 0.3220 | 0.3122 | 0.2716 | 0.5707 | 0.4479 | 0.8519 | 0.7560 | 0.9708 | 0.9418 | 0.7146 | 0.6191 | 0.0 | 0.0 | 0.5586 | 0.4173 | 0.0 | 0.0 | 0.6606 | 0.5339 | 0.2223 | 0.1965 | 0.0 | 0.0 | 0.6934 | 0.6312 | 0.5224 | 0.4148 | 0.9011 | 0.8268 | 0.3265 | 0.3251 | 0.7606 | 0.6864 | 0.4968 | 0.4467 | 0.7456 | 0.6849 | 0.4807 | 0.3870 | 0.5909 | 0.4742 |

Framework versions

  • Transformers 5.0.0
  • Pytorch 2.9.1+cu128
  • Datasets 4.5.0
  • Tokenizers 0.22.2