
exceptions_exp2_swap_0.7_cost_to_carry_5039

This model is a fine-tuned version of an unspecified base model, trained on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.5813
  • Accuracy: 0.3660
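
If the reported loss is a mean per-token cross-entropy (an assumption; the training objective is not stated in this card), it corresponds to a perplexity of roughly exp(3.5813) ≈ 35.9:

```python
import math

# Perplexity from cross-entropy loss. This assumes the reported loss is the
# mean negative log-likelihood per token, which the card does not confirm.
eval_loss = 3.5813
print(math.exp(eval_loss))  # ≈ 35.93
```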

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a configuration sketch reproducing them follows the list:

  • learning_rate: 0.0006
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 5039
  • gradient_accumulation_steps: 5
  • total_train_batch_size: 80
  • optimizer: adamw_torch_fused (betas=(0.9, 0.98), epsilon=1e-08); no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 50.0
  • mixed_precision_training: Native AMP
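
A minimal sketch (not the original training script, which the card does not include) of how these values map onto `transformers.TrainingArguments`, assuming the standard Trainer API was used. The `output_dir` is taken from the model name, and `fp16` stands in for "Native AMP":

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="exceptions_exp2_swap_0.7_cost_to_carry_5039",
    learning_rate=6e-4,
    per_device_train_batch_size=16,  # train_batch_size: 16
    per_device_eval_batch_size=16,   # eval_batch_size: 16
    seed=5039,
    gradient_accumulation_steps=5,   # 16 * 5 = 80 total train batch size
    optim="adamw_torch_fused",       # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=50.0,
    fp16=True,                       # "Native AMP"; fp16 assumed, could be bf16
)
```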

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.8495 | 0.2917 | 1000 | 4.7681 | 0.2531 |
| 4.3448 | 0.5834 | 2000 | 4.2971 | 0.2978 |
| 4.1576 | 0.8750 | 3000 | 4.1055 | 0.3141 |
| 4.0179 | 1.1665 | 4000 | 4.0037 | 0.3232 |
| 3.9363 | 1.4582 | 5000 | 3.9245 | 0.3301 |
| 3.8954 | 1.7499 | 6000 | 3.8654 | 0.3356 |
| 3.7561 | 2.0414 | 7000 | 3.8211 | 0.3404 |
| 3.7679 | 2.3331 | 8000 | 3.7919 | 0.3434 |
| 3.7508 | 2.6248 | 9000 | 3.7634 | 0.3460 |
| 3.7392 | 2.9165 | 10000 | 3.7330 | 0.3483 |
| 3.6454 | 3.2080 | 11000 | 3.7238 | 0.3501 |
| 3.6489 | 3.4996 | 12000 | 3.7063 | 0.3518 |
| 3.654 | 3.7913 | 13000 | 3.6838 | 0.3539 |
| 3.5456 | 4.0828 | 14000 | 3.6791 | 0.3552 |
| 3.5855 | 4.3745 | 15000 | 3.6678 | 0.3561 |
| 3.5876 | 4.6662 | 16000 | 3.6548 | 0.3573 |
| 3.5934 | 4.9579 | 17000 | 3.6400 | 0.3587 |
| 3.5238 | 5.2494 | 18000 | 3.6425 | 0.3591 |
| 3.5289 | 5.5411 | 19000 | 3.6325 | 0.3603 |
| 3.5296 | 5.8327 | 20000 | 3.6223 | 0.3610 |
| 3.4496 | 6.1243 | 21000 | 3.6246 | 0.3614 |
| 3.4715 | 6.4159 | 22000 | 3.6187 | 0.3620 |
| 3.5 | 6.7076 | 23000 | 3.6083 | 0.3626 |
| 3.4973 | 6.9993 | 24000 | 3.5957 | 0.3636 |
| 3.44 | 7.2908 | 25000 | 3.6067 | 0.3634 |
| 3.4574 | 7.5825 | 26000 | 3.5990 | 0.3643 |
| 3.4748 | 7.8742 | 27000 | 3.5894 | 0.3649 |
| 3.3905 | 8.1657 | 28000 | 3.5968 | 0.3648 |
| 3.4085 | 8.4574 | 29000 | 3.5889 | 0.3656 |
| 3.4401 | 8.7490 | 30000 | 3.5813 | 0.3660 |
| 3.3334 | 9.0405 | 31000 | 3.5882 | 0.3657 |
| 3.3827 | 9.3322 | 32000 | 3.5863 | 0.3661 |
| 3.4003 | 9.6239 | 33000 | 3.5744 | 0.3669 |
| 3.417 | 9.9156 | 34000 | 3.5681 | 0.3674 |
| 3.3425 | 10.2071 | 35000 | 3.5823 | 0.3671 |
| 3.389 | 10.4988 | 36000 | 3.5734 | 0.3675 |
| 3.4016 | 10.7905 | 37000 | 3.5658 | 0.3682 |
| 3.3073 | 11.0820 | 38000 | 3.5728 | 0.3680 |
| 3.3501 | 11.3736 | 39000 | 3.5740 | 0.3680 |
| 3.3746 | 11.6653 | 40000 | 3.5662 | 0.3688 |
| 3.3773 | 11.9570 | 41000 | 3.5572 | 0.3695 |
| 3.3141 | 12.2485 | 42000 | 3.5711 | 0.3684 |
| 3.3499 | 12.5402 | 43000 | 3.5616 | 0.3692 |
| 3.345 | 12.8319 | 44000 | 3.5553 | 0.3696 |
| 3.2648 | 13.1234 | 45000 | 3.5678 | 0.3693 |
| 3.3126 | 13.4151 | 46000 | 3.5607 | 0.3694 |
| 3.3294 | 13.7067 | 47000 | 3.5539 | 0.3703 |
| 3.3524 | 13.9984 | 48000 | 3.5477 | 0.3706 |
| 3.2961 | 14.2899 | 49000 | 3.5666 | 0.3698 |
| 3.3065 | 14.5816 | 50000 | 3.5565 | 0.3700 |
| 3.3238 | 14.8733 | 51000 | 3.5494 | 0.3708 |
| 3.2633 | 15.1648 | 52000 | 3.5653 | 0.3701 |
| 3.2915 | 15.4565 | 53000 | 3.5572 | 0.3710 |
| 3.3027 | 15.7482 | 54000 | 3.5493 | 0.3710 |
| 3.2114 | 16.0397 | 55000 | 3.5605 | 0.3707 |
| 3.2694 | 16.3313 | 56000 | 3.5598 | 0.3709 |
| 3.3015 | 16.6230 | 57000 | 3.5504 | 0.3713 |
| 3.2965 | 16.9147 | 58000 | 3.5434 | 0.3719 |
| 3.2518 | 17.2062 | 59000 | 3.5610 | 0.3710 |
| 3.2699 | 17.4979 | 60000 | 3.5537 | 0.3715 |
| 3.2833 | 17.7896 | 61000 | 3.5451 | 0.3720 |
| 3.205 | 18.0811 | 62000 | 3.5603 | 0.3713 |
| 3.2405 | 18.3728 | 63000 | 3.5543 | 0.3716 |
| 3.2622 | 18.6644 | 64000 | 3.5477 | 0.3721 |
| 3.2853 | 18.9561 | 65000 | 3.5399 | 0.3728 |
| 3.2136 | 19.2476 | 66000 | 3.5571 | 0.3717 |
| 3.2421 | 19.5393 | 67000 | 3.5518 | 0.3718 |
| 3.247 | 19.8310 | 68000 | 3.5424 | 0.3725 |
| 3.1915 | 20.1225 | 69000 | 3.5618 | 0.3720 |
| 3.2315 | 20.4142 | 70000 | 3.5545 | 0.3723 |
| 3.2387 | 20.7059 | 71000 | 3.5445 | 0.3727 |
| 3.2681 | 20.9975 | 72000 | 3.5392 | 0.3732 |
| 3.1985 | 21.2891 | 73000 | 3.5578 | 0.3719 |
| 3.2263 | 21.5807 | 74000 | 3.5477 | 0.3727 |
| 3.241 | 21.8724 | 75000 | 3.5404 | 0.3730 |
| 3.1623 | 22.1639 | 76000 | 3.5533 | 0.3726 |
| 3.2073 | 22.4556 | 77000 | 3.5501 | 0.3726 |
| 3.2213 | 22.7473 | 78000 | 3.5449 | 0.3731 |
| 3.1494 | 23.0388 | 79000 | 3.5543 | 0.3727 |
| 3.1892 | 23.3305 | 80000 | 3.5524 | 0.3726 |
| 3.211 | 23.6222 | 81000 | 3.5505 | 0.3730 |
| 3.2331 | 23.9138 | 82000 | 3.5388 | 0.3735 |
| 3.1621 | 24.2053 | 83000 | 3.5566 | 0.3725 |
| 3.2148 | 24.4970 | 84000 | 3.5493 | 0.3731 |
| 3.206 | 24.7887 | 85000 | 3.5413 | 0.3736 |
| 3.12 | 25.0802 | 86000 | 3.5574 | 0.3730 |
| 3.1795 | 25.3719 | 87000 | 3.5528 | 0.3731 |
| 3.1956 | 25.6636 | 88000 | 3.5424 | 0.3738 |
| 3.2272 | 25.9553 | 89000 | 3.5368 | 0.3739 |
| 3.1544 | 26.2468 | 90000 | 3.5519 | 0.3732 |
| 3.1801 | 26.5384 | 91000 | 3.5500 | 0.3734 |
| 3.2106 | 26.8301 | 92000 | 3.5401 | 0.3741 |
| 3.1296 | 27.1216 | 93000 | 3.5584 | 0.3732 |
| 3.1701 | 27.4133 | 94000 | 3.5518 | 0.3735 |
| 3.1742 | 27.7050 | 95000 | 3.5419 | 0.3741 |
| 3.1993 | 27.9967 | 96000 | 3.5368 | 0.3742 |
| 3.1572 | 28.2882 | 97000 | 3.5559 | 0.3732 |
| 3.169 | 28.5799 | 98000 | 3.5508 | 0.3736 |
| 3.189 | 28.8715 | 99000 | 3.5381 | 0.3743 |
| 3.1151 | 29.1630 | 100000 | 3.5557 | 0.3736 |
| 3.1515 | 29.4547 | 101000 | 3.5521 | 0.3739 |
| 3.1737 | 29.7464 | 102000 | 3.5451 | 0.3743 |
| 3.0872 | 30.0379 | 103000 | 3.5587 | 0.3737 |
| 3.1245 | 30.3296 | 104000 | 3.5546 | 0.3737 |
| 3.15 | 30.6213 | 105000 | 3.5496 | 0.3740 |
| 3.1683 | 30.9130 | 106000 | 3.5438 | 0.3744 |
| 3.0999 | 31.2045 | 107000 | 3.5577 | 0.3737 |
| 3.1229 | 31.4961 | 108000 | 3.5498 | 0.3743 |
| 3.1502 | 31.7878 | 109000 | 3.5434 | 0.3747 |

Framework versions

  • Transformers 4.55.2
  • Pytorch 2.8.0+cu128
  • Datasets 4.0.0
  • Tokenizers 0.21.4
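
As a convenience (not part of the original card), the pinned versions can be checked against the installed environment:

```python
# Confirms the installed library versions match those used for training.
import datasets, tokenizers, torch, transformers

expected = {
    "transformers": "4.55.2",
    "torch": "2.8.0+cu128",
    "datasets": "4.0.0",
    "tokenizers": "0.21.4",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    status = "OK" if installed[name] == want else f"mismatch (found {installed[name]})"
    print(f"{name} {want}: {status}")
```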
Model size

  • 0.1B params (tensor type F32, safetensors format)