mask2former-swin-base-apple-dms-run3

This model is a fine-tuned version of facebook/mask2former-swin-base-ade-semantic on the AllanK24/apple-dms-materials dataset. It achieves the following results on the evaluation set:

  • Mean Iou: 0.4055
  • Mean Accuracy: 0.4571
  • Overall Accuracy: 0.7261
  • Iou Animal Skin: 0.2844
  • Accuracy Animal Skin: 0.3396
  • Iou Bone Teeth Horn: 0.1287
  • Accuracy Bone Teeth Horn: 0.1353
  • Iou Brickwork: 0.5506
  • Accuracy Brickwork: 0.6007
  • Iou Cardboard: 0.4753
  • Accuracy Cardboard: 0.5163
  • Iou Carpet Rug: 0.7248
  • Accuracy Carpet Rug: 0.7993
  • Iou Ceiling Tile: 0.7318
  • Accuracy Ceiling Tile: 0.7822
  • Iou Ceramic: 0.6421
  • Accuracy Ceramic: 0.7324
  • Iou Chalkboard Blackboard: 0.6229
  • Accuracy Chalkboard Blackboard: 0.6613
  • Iou Clutter: 0.0
  • Accuracy Clutter: 0.0
  • Iou Concrete: 0.2897
  • Accuracy Concrete: 0.3533
  • Iou Cork Corkboard: 0.0
  • Accuracy Cork Corkboard: 0.0
  • Iou Engineered Stone: 0.0
  • Accuracy Engineered Stone: 0.0
  • Iou Fabric Cloth: 0.7774
  • Accuracy Fabric Cloth: 0.8431
  • Iou Fiberglass Wool: 0.0
  • Accuracy Fiberglass Wool: 0.0
  • Iou Fire: 0.0462
  • Accuracy Fire: 0.0485
  • Iou Foliage: 0.8463
  • Accuracy Foliage: 0.9090
  • Iou Food: 0.8038
  • Accuracy Food: 0.8915
  • Iou Fur: 0.8120
  • Accuracy Fur: 0.9058
  • Iou Gemstone Quartz: 0.0
  • Accuracy Gemstone Quartz: 0.0
  • Iou Glass: 0.5701
  • Accuracy Glass: 0.6385
  • Iou Hair: 0.7468
  • Accuracy Hair: 0.8342
  • Iou Ice: 0.0
  • Accuracy Ice: 0.0
  • Iou Leather: 0.4000
  • Accuracy Leather: 0.4492
  • Iou Liquid Non-water: 0.0044
  • Accuracy Liquid Non-water: 0.0045
  • Iou Metal: 0.3217
  • Accuracy Metal: 0.3957
  • Iou Mirror: 0.4412
  • Accuracy Mirror: 0.4830
  • Iou Paint Plaster Enamel: 0.6995
  • Accuracy Paint Plaster Enamel: 0.7585
  • Iou Paper: 0.5897
  • Accuracy Paper: 0.6750
  • Iou Pearl: 0.0
  • Accuracy Pearl: 0.0
  • Iou Photograph Painting: 0.2360
  • Accuracy Photograph Painting: 0.2656
  • Iou Plastic Clear: 0.2166
  • Accuracy Plastic Clear: 0.2648
  • Iou Plastic Non-clear: 0.3274
  • Accuracy Plastic Non-clear: 0.3924
  • Iou Rubber Latex: 0.2258
  • Accuracy Rubber Latex: 0.2529
  • Iou Sand: 0.4271
  • Accuracy Sand: 0.5089
  • Iou Skin Lips: 0.7549
  • Accuracy Skin Lips: 0.8461
  • Iou Sky: 0.9267
  • Accuracy Sky: 0.9488
  • Iou Snow: 0.5493
  • Accuracy Snow: 0.6568
  • Iou Soap: 0.0
  • Accuracy Soap: 0.0
  • Iou Soil Mud: 0.4185
  • Accuracy Soil Mud: 0.6073
  • Iou Sponge: 0.0
  • Accuracy Sponge: 0.0
  • Iou Stone Natural: 0.4712
  • Accuracy Stone Natural: 0.5925
  • Iou Stone Polished: 0.1409
  • Accuracy Stone Polished: 0.1469
  • Iou Styrofoam: 0.0
  • Accuracy Styrofoam: 0.0
  • Iou Tile: 0.6098
  • Accuracy Tile: 0.6803
  • Iou Wallpaper: 0.4314
  • Accuracy Wallpaper: 0.5003
  • Iou Water: 0.8358
  • Accuracy Water: 0.9263
  • Iou Wax: 0.5110
  • Accuracy Wax: 0.5657
  • Iou Whiteboard: 0.6395
  • Accuracy Whiteboard: 0.7026
  • Iou Wicker: 0.4063
  • Accuracy Wicker: 0.4495
  • Iou Wood: 0.6898
  • Accuracy Wood: 0.7487
  • Iou Wood Tree: 0.3589
  • Accuracy Wood Tree: 0.4629
  • Iou Asphalt: 0.4000
  • Accuracy Asphalt: 0.4936
  • Loss: 275.7852
  • Eval Runtime: 36.0974
  • Eval Samples Per Second: 32.8
  • Eval Steps Per Second: 8.2
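These per-class numbers follow the standard confusion-matrix definitions used by common segmentation evaluators: IoU = TP/(TP+FP+FN), per-class accuracy = TP/(TP+FN) (i.e. class recall), and overall accuracy = total correct pixels over all pixels; classes the model never predicts correctly (e.g. Clutter, Ice, Soap) therefore show 0.0. A minimal sketch of the computation (function name is illustrative):

```python
import numpy as np

def per_class_metrics(conf):
    """conf[i, j] = number of pixels whose true class is i and predicted class is j."""
    tp = np.diag(conf).astype(float)
    fp = conf.sum(axis=0) - tp              # predicted as the class, but wrong
    fn = conf.sum(axis=1) - tp              # belongs to the class, but missed
    iou = tp / np.maximum(tp + fp + fn, 1)  # per-class IoU (0 for never-hit classes)
    acc = tp / np.maximum(tp + fn, 1)       # per-class accuracy (recall)
    overall = tp.sum() / conf.sum()         # overall pixel accuracy
    return iou, acc, overall

# Toy 2-class example: 20 pixels total
conf = np.array([[8, 2],
                 [1, 9]])
iou, acc, overall = per_class_metrics(conf)
# Mean IoU is iou.mean(), mean accuracy is acc.mean()
```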

Model description

Mask2Former with a Swin-base backbone, initialized from facebook/mask2former-swin-base-ade-semantic and fine-tuned for semantic segmentation of material categories on the AllanK24/apple-dms-materials dataset.

Intended uses & limitations

The model is intended for semantic segmentation of material types in images. Note that several rare classes (Clutter, Cork Corkboard, Engineered Stone, Fiberglass Wool, Gemstone Quartz, Ice, Pearl, Soap, Sponge, Styrofoam) score 0.0 IoU on the evaluation set, so predictions for those materials should not be relied on.
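The fine-tuned checkpoint can be loaded with the standard Mask2Former semantic-segmentation API; a hedged sketch (the blank test image is a placeholder — use a real photo in practice):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

repo = "AllanK24/mask2former-swin-base-apple-dms-run3"
processor = AutoImageProcessor.from_pretrained(repo)
model = Mask2FormerForUniversalSegmentation.from_pretrained(repo)
model.eval()

image = Image.new("RGB", (512, 512))  # placeholder image; substitute a real one
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Combine mask and class logits into a per-pixel class map at the input resolution
seg_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]  # (height, width)
)[0]
print(seg_map.shape, sorted(seg_map.unique().tolist()))
```

Human-readable class names for the predicted ids are in `model.config.id2label`.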

Training and evaluation data

The model was fine-tuned and evaluated on the AllanK24/apple-dms-materials dataset, a material-segmentation dataset covering the 52 material classes reported above. The evaluation-set metrics at the top of this card come from the final logged checkpoint (epoch 27.27, step 1200).

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 8
  • total_train_batch_size: 512
  • total_eval_batch_size: 256
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 0.1 (a fractional value, likely a warmup ratio, i.e. 10% of training steps)
  • num_epochs: 30
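The list above maps roughly onto `transformers.TrainingArguments` as follows — a sketch, not the exact training script; `output_dir` is illustrative, and the logged `lr_scheduler_warmup_steps: 0.1` is assumed to be a warmup ratio:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="mask2former-swin-base-apple-dms-run3",  # illustrative
    learning_rate=1e-3,
    per_device_train_batch_size=64,   # x 8 GPUs -> total train batch size 512
    per_device_eval_batch_size=32,    # x 8 GPUs -> total eval batch size 256
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,                 # assumed: the log reports "warmup_steps: 0.1"
    num_train_epochs=30,
)
# Multi-GPU (num_devices: 8) comes from the launcher, e.g. torchrun --nproc_per_node=8
```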

Training results

Columns: Training Loss | Epoch | Step | Mean Iou | Mean Accuracy | Overall Accuracy | per-class Iou and Accuracy pairs (classes in the order listed above, Animal Skin through Asphalt) | Validation Loss | Eval Runtime | Eval Samples Per Second | Eval Steps Per Second
301.1912 3.4091 150 0.2898 0.3579 0.7339 0.0076 0.0076 0.0377 0.0393 0.3809 0.4826 0.3801 0.4998 0.6153 0.6646 0.7289 0.8201 0.6005 0.7090 0.1974 0.2470 0.0 0.0 0.0980 0.1084 0.0 0.0 0.0 0.0 0.7572 0.8487 0.0 0.0 0.0 0.0 0.8214 0.9371 0.6580 0.8915 0.6749 0.8054 0.0 0.0 0.5515 0.6922 0.6846 0.7658 0.0 0.0 0.1850 0.1962 0.0 0.0 0.1701 0.1861 0.4358 0.4965 0.7083 0.8100 0.5016 0.6030 0.0 0.0 0.2623 0.3442 0.0906 0.1053 0.3206 0.4415 0.1005 0.1111 0.0 0.0 0.6904 0.8394 0.9211 0.9453 0.0 0.0 0.0 0.0 0.1476 0.1718 0.0 0.0 0.3222 0.6059 0.0476 0.0493 0.0 0.0 0.5149 0.7829 0.0990 0.2126 0.6378 0.9376 0.0 0.0 0.3753 0.5290 0.3661 0.4363 0.6920 0.8453 0.2524 0.4036 0.0336 0.0369 275.1264 41.4146 28.589 7.147
260.4968 6.8182 300 0.3046 0.3354 0.6998 0.0306 0.0313 0.0838 0.0897 0.4298 0.4795 0.3178 0.3286 0.5790 0.6100 0.7563 0.8095 0.6313 0.6848 0.0988 0.0988 0.0 0.0 0.1094 0.1207 0.0 0.0 0.0 0.0 0.7643 0.8200 0.0 0.0 0.0 0.0 0.8398 0.9283 0.7867 0.8784 0.7336 0.8030 0.0 0.0 0.5445 0.6045 0.7326 0.8224 0.0 0.0 0.3016 0.3129 0.0 0.0 0.2118 0.2253 0.4509 0.5097 0.7097 0.7853 0.5292 0.5938 0.0 0.0 0.0297 0.0299 0.0998 0.1036 0.3123 0.3692 0.1398 0.1453 0.0000 0.0000 0.7189 0.8138 0.9222 0.9453 0.0 0.0 0.0 0.0 0.2938 0.3435 0.0 0.0 0.3864 0.4797 0.0082 0.0082 0.0 0.0 0.5603 0.6100 0.0918 0.0925 0.8343 0.9162 0.0 0.0 0.2328 0.2346 0.2893 0.2946 0.7171 0.8185 0.2034 0.2094 0.3567 0.4905 267.0715 36.1774 32.728 8.182
239.0256 10.2273 450 0.3383 0.3755 0.7218 0.0175 0.0177 0.0970 0.1027 0.4031 0.4319 0.4332 0.4707 0.6502 0.6834 0.7271 0.7624 0.6117 0.7097 0.4543 0.4716 0.0 0.0 0.0872 0.0940 0.0 0.0 0.0 0.0 0.7793 0.8516 0.0 0.0 0.0 0.0 0.8402 0.9010 0.7756 0.8417 0.8023 0.8929 0.0 0.0 0.5819 0.6892 0.7348 0.8024 0.0 0.0 0.4064 0.4350 0.0 0.0 0.2974 0.3458 0.4461 0.4729 0.7118 0.8044 0.5731 0.6437 0.0 0.0 0.1389 0.1436 0.1583 0.1797 0.2511 0.2724 0.2062 0.2215 0.0121 0.0121 0.7291 0.8328 0.9191 0.9364 0.0608 0.0608 0.0 0.0 0.3750 0.5687 0.0 0.0 0.3882 0.4414 0.0206 0.0206 0.0 0.0 0.5869 0.6600 0.1148 0.1163 0.8138 0.8746 0.0192 0.0193 0.5996 0.7349 0.3965 0.4299 0.7229 0.8147 0.3473 0.4139 0.3010 0.3459 267.4137 35.7406 33.128 8.282
222.0586 13.6364 600 0.3698 0.4166 0.7196 0.0861 0.0934 0.0360 0.0896 0.5652 0.6291 0.4632 0.5218 0.7187 0.7899 0.7206 0.7589 0.6431 0.7102 0.4651 0.5623 0.0 0.0 0.2383 0.2929 0.0 0.0 0.0 0.0 0.7820 0.8664 0.0 0.0 0.0175 0.0209 0.8470 0.9256 0.7711 0.8989 0.7846 0.8931 0.0 0.0 0.5551 0.6232 0.7423 0.8347 0.0 0.0 0.3623 0.3839 0.0 0.0 0.3158 0.3872 0.4609 0.5365 0.7046 0.7662 0.5729 0.6426 0.0 0.0 0.2096 0.2288 0.1805 0.2238 0.3135 0.3662 0.2329 0.2539 0.2801 0.2984 0.7432 0.8178 0.9200 0.9356 0.4617 0.5381 0.0 0.0 0.4072 0.5424 0.0 0.0 0.4545 0.5667 0.0835 0.0849 0.0 0.0 0.6026 0.6611 0.3280 0.3462 0.8343 0.9188 0.0601 0.0620 0.4934 0.5363 0.3332 0.3892 0.6804 0.7272 0.3693 0.4344 0.3891 0.5055 266.5801 38.0796 31.093 7.773
206.5963 17.0455 750 0.3955 0.4451 0.7415 0.1809 0.1993 0.1401 0.1529 0.5513 0.5990 0.4418 0.4746 0.7173 0.8341 0.7474 0.8369 0.6442 0.7204 0.5513 0.5735 0.0 0.0 0.2801 0.3337 0.0 0.0 0.0 0.0 0.7832 0.8633 0.0 0.0 0.0207 0.0207 0.8465 0.9249 0.7911 0.8821 0.8109 0.9107 0.0 0.0 0.5829 0.6868 0.7461 0.8428 0.0 0.0 0.4063 0.4521 0.0007 0.0007 0.3139 0.3716 0.4426 0.4710 0.7191 0.8007 0.5785 0.6441 0.0 0.0 0.2647 0.3080 0.2321 0.2820 0.2952 0.3358 0.2362 0.2581 0.2754 0.2931 0.7447 0.8239 0.9270 0.9498 0.4887 0.5510 0.0 0.0 0.4002 0.5975 0.0 0.0 0.4574 0.5504 0.0919 0.0939 0.0 0.0 0.6200 0.6975 0.3868 0.4270 0.8199 0.9360 0.4531 0.4773 0.6314 0.6898 0.4100 0.4456 0.6974 0.7542 0.4040 0.4736 0.4340 0.6047 271.1522 38.0305 31.133 7.783
193.3702 20.4545 900 0.3967 0.4483 0.7399 0.1524 0.1694 0.1103 0.1139 0.5465 0.6060 0.4627 0.5054 0.7191 0.7981 0.7483 0.8114 0.6407 0.7360 0.5216 0.5561 0.0 0.0 0.3076 0.3667 0.0 0.0 0.0 0.0 0.7798 0.8437 0.0 0.0 0.0254 0.0292 0.8480 0.9116 0.8056 0.8972 0.8055 0.9046 0.0 0.0 0.5787 0.6535 0.7495 0.8437 0.0 0.0 0.4007 0.4610 0.0112 0.0113 0.3172 0.3811 0.4648 0.5211 0.7213 0.8010 0.5872 0.6782 0.0 0.0 0.2742 0.3268 0.1982 0.2305 0.3129 0.3650 0.2635 0.3019 0.4267 0.4793 0.7579 0.8575 0.9318 0.9566 0.5354 0.6342 0.0 0.0 0.4134 0.6399 0.0 0.0 0.4724 0.5782 0.1126 0.1160 0.0 0.0 0.6013 0.6618 0.3741 0.4238 0.8299 0.9158 0.2934 0.3186 0.6068 0.6560 0.4164 0.4620 0.7014 0.7663 0.3946 0.5005 0.4057 0.5231 271.1607 39.0896 30.289 7.572
183.5936 23.8636 1050 0.4069 0.4618 0.7354 0.2576 0.2970 0.1262 0.1346 0.5613 0.6330 0.4349 0.4705 0.7276 0.8065 0.7327 0.7982 0.6408 0.7402 0.6337 0.6707 0.0 0.0 0.3091 0.3823 0.0 0.0 0.0 0.0 0.7847 0.8541 0.0 0.0 0.0452 0.0474 0.8452 0.9112 0.8090 0.8999 0.8066 0.9083 0.0000 0.0000 0.5808 0.6554 0.7493 0.8395 0.0 0.0 0.4007 0.4618 0.0044 0.0045 0.3328 0.4191 0.4676 0.5116 0.7034 0.7648 0.5888 0.6803 0.0 0.0 0.2512 0.2936 0.2021 0.2331 0.3276 0.3952 0.2244 0.2472 0.3855 0.4641 0.7585 0.8530 0.9263 0.9485 0.5453 0.6536 0.0 0.0 0.4139 0.6280 0.0 0.0 0.4845 0.6076 0.1648 0.1725 0.0 0.0 0.6241 0.7026 0.4330 0.5357 0.8294 0.9230 0.5167 0.5681 0.6474 0.7164 0.4152 0.4517 0.7004 0.7620 0.3544 0.4362 0.4133 0.5285 273.2441 38.3911 30.84 7.71
177.4213 27.2727 1200 0.4055 0.4571 0.7261 0.2844 0.3396 0.1287 0.1353 0.5506 0.6007 0.4753 0.5163 0.7248 0.7993 0.7318 0.7822 0.6421 0.7324 0.6229 0.6613 0.0 0.0 0.2897 0.3533 0.0 0.0 0.0 0.0 0.7774 0.8431 0.0 0.0 0.0462 0.0485 0.8463 0.9090 0.8038 0.8915 0.8120 0.9058 0.0 0.0 0.5701 0.6385 0.7468 0.8342 0.0 0.0 0.4000 0.4492 0.0044 0.0045 0.3217 0.3957 0.4412 0.4830 0.6995 0.7585 0.5897 0.6750 0.0 0.0 0.2360 0.2656 0.2166 0.2648 0.3274 0.3924 0.2258 0.2529 0.4271 0.5089 0.7549 0.8461 0.9267 0.9488 0.5493 0.6568 0.0 0.0 0.4185 0.6073 0.0 0.0 0.4712 0.5925 0.1409 0.1469 0.0 0.0 0.6098 0.6803 0.4314 0.5003 0.8358 0.9263 0.5110 0.5657 0.6395 0.7026 0.4063 0.4495 0.6898 0.7487 0.3589 0.4629 0.4000 0.4936 275.7852 36.0974 32.8 8.2

Framework versions

  • Transformers 5.0.0
  • Pytorch 2.9.1+cu128
  • Datasets 4.5.0
  • Tokenizers 0.22.2