exceptions_exp2_swap_0.7_cost_to_drop_5039

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set (a loading sketch follows the list):

  • Loss: 3.5645
  • Accuracy: 0.3686
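
A checkpoint like this can presumably be loaded with the transformers Auto classes. A minimal sketch follows; the Hub namespace is a placeholder (the card does not state one), and treating the model as a causal language model is an assumption based on the reported loss and accuracy metrics.

```python
# Minimal loading sketch. "your-namespace" is a placeholder; the card does not
# state a Hub namespace, so substitute the real repo id or a local path.
# Treating this as a causal LM is an assumption, not stated on the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/exceptions_exp2_swap_0.7_cost_to_drop_5039"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)
model.eval()

# Score a sample; passing labels makes the model return mean cross-entropy loss.
inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs, labels=inputs["input_ids"])
print(f"cross-entropy loss: {out.loss.item():.4f}")
```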

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 0.0006
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 5039
  • gradient_accumulation_steps: 5
  • total_train_batch_size: 80
  • optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 50.0
  • mixed_precision_training: Native AMP
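
These settings map one-to-one onto transformers TrainingArguments; a minimal sketch of an equivalent configuration is below. The output directory and the surrounding Trainer, model, and dataset setup are assumptions, and the total train batch size of 80 follows from 16 per device × 5 accumulation steps on a single device.

```python
# Sketch of a TrainingArguments configuration matching the list above.
# output_dir is an assumption; everything else mirrors the reported values.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="exceptions_exp2_swap_0.7_cost_to_drop_5039",
    learning_rate=6e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=5,  # 16 x 5 = 80 total train batch size
    seed=5039,
    optim="adamw_torch_fused",      # fused AdamW (OptimizerNames.ADAMW_TORCH_FUSED)
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=50.0,
    fp16=True,                      # "Native AMP" mixed-precision training
)
```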

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|
| 4.8235        | 0.2917  | 1000  | 4.7495          | 0.2552   |
| 4.3462        | 0.5834  | 2000  | 4.2916          | 0.2988   |
| 4.1645        | 0.8750  | 3000  | 4.1072          | 0.3144   |
| 4.0081        | 1.1665  | 4000  | 3.9962          | 0.3242   |
| 3.9471        | 1.4582  | 5000  | 3.9234          | 0.3309   |
| 3.8937        | 1.7499  | 6000  | 3.8664          | 0.3353   |
| 3.7596        | 2.0414  | 7000  | 3.8213          | 0.3403   |
| 3.762         | 2.3331  | 8000  | 3.7930          | 0.3430   |
| 3.7331        | 2.6248  | 9000  | 3.7621          | 0.3462   |
| 3.7349        | 2.9165  | 10000 | 3.7355          | 0.3487   |
| 3.6362        | 3.2080  | 11000 | 3.7221          | 0.3505   |
| 3.6457        | 3.4996  | 12000 | 3.7049          | 0.3522   |
| 3.6514        | 3.7913  | 13000 | 3.6858          | 0.3539   |
| 3.5526        | 4.0828  | 14000 | 3.6795          | 0.3549   |
| 3.5965        | 4.3745  | 15000 | 3.6687          | 0.3559   |
| 3.5821        | 4.6662  | 16000 | 3.6562          | 0.3573   |
| 3.579         | 4.9579  | 17000 | 3.6405          | 0.3585   |
| 3.5144        | 5.2494  | 18000 | 3.6428          | 0.3592   |
| 3.5321        | 5.5411  | 19000 | 3.6311          | 0.3600   |
| 3.5373        | 5.8327  | 20000 | 3.6227          | 0.3609   |
| 3.4516        | 6.1243  | 21000 | 3.6268          | 0.3612   |
| 3.4813        | 6.4159  | 22000 | 3.6168          | 0.3618   |
| 3.4953        | 6.7076  | 23000 | 3.6082          | 0.3627   |
| 3.4995        | 6.9993  | 24000 | 3.5997          | 0.3632   |
| 3.4379        | 7.2908  | 25000 | 3.6082          | 0.3634   |
| 3.4676        | 7.5825  | 26000 | 3.5976          | 0.3641   |
| 3.4706        | 7.8742  | 27000 | 3.5889          | 0.3651   |
| 3.3905        | 8.1657  | 28000 | 3.5985          | 0.3648   |
| 3.4272        | 8.4574  | 29000 | 3.5909          | 0.3652   |
| 3.4349        | 8.7490  | 30000 | 3.5824          | 0.3661   |
| 3.3349        | 9.0405  | 31000 | 3.5871          | 0.3661   |
| 3.3855        | 9.3322  | 32000 | 3.5864          | 0.3661   |
| 3.4069        | 9.6239  | 33000 | 3.5773          | 0.3665   |
| 3.429         | 9.9156  | 34000 | 3.5690          | 0.3676   |
| 3.3396        | 10.2071 | 35000 | 3.5845          | 0.3668   |
| 3.3802        | 10.4988 | 36000 | 3.5740          | 0.3674   |
| 3.3967        | 10.7905 | 37000 | 3.5667          | 0.3680   |
| 3.3041        | 11.0820 | 38000 | 3.5735          | 0.3681   |
| 3.3322        | 11.3736 | 39000 | 3.5746          | 0.3679   |
| 3.3694        | 11.6653 | 40000 | 3.5645          | 0.3686   |
| 3.3702        | 11.9570 | 41000 | 3.5579          | 0.3689   |
| 3.3186        | 12.2485 | 42000 | 3.5710          | 0.3687   |
| 3.3503        | 12.5402 | 43000 | 3.5659          | 0.3690   |
| 3.3611        | 12.8319 | 44000 | 3.5579          | 0.3696   |
| 3.2771        | 13.1234 | 45000 | 3.5718          | 0.3690   |
| 3.3066        | 13.4151 | 46000 | 3.5658          | 0.3692   |
| 3.3455        | 13.7067 | 47000 | 3.5570          | 0.3700   |
| 3.3451        | 13.9984 | 48000 | 3.5483          | 0.3702   |
| 3.2904        | 14.2899 | 49000 | 3.5660          | 0.3694   |
| 3.3167        | 14.5816 | 50000 | 3.5563          | 0.3700   |
| 3.3306        | 14.8733 | 51000 | 3.5501          | 0.3707   |
| 3.2528        | 15.1648 | 52000 | 3.5641          | 0.3699   |
| 3.2853        | 15.4565 | 53000 | 3.5608          | 0.3704   |
| 3.3115        | 15.7482 | 54000 | 3.5489          | 0.3711   |
| 3.2167        | 16.0397 | 55000 | 3.5650          | 0.3704   |
| 3.2616        | 16.3313 | 56000 | 3.5638          | 0.3707   |
| 3.279         | 16.6230 | 57000 | 3.5539          | 0.3711   |
| 3.3055        | 16.9147 | 58000 | 3.5442          | 0.3714   |
| 3.2311        | 17.2062 | 59000 | 3.5607          | 0.3706   |
| 3.2687        | 17.4979 | 60000 | 3.5537          | 0.3712   |
| 3.2897        | 17.7896 | 61000 | 3.5465          | 0.3717   |
| 3.1981        | 18.0811 | 62000 | 3.5624          | 0.3711   |
| 3.2564        | 18.3728 | 63000 | 3.5579          | 0.3714   |
| 3.2645        | 18.6644 | 64000 | 3.5481          | 0.3715   |
| 3.2763        | 18.9561 | 65000 | 3.5416          | 0.3723   |
| 3.2251        | 19.2476 | 66000 | 3.5598          | 0.3715   |
| 3.2513        | 19.5393 | 67000 | 3.5499          | 0.3719   |
| 3.2806        | 19.8310 | 68000 | 3.5451          | 0.3723   |
| 3.1922        | 20.1225 | 69000 | 3.5614          | 0.3715   |
| 3.226         | 20.4142 | 70000 | 3.5556          | 0.3719   |
| 3.2528        | 20.7059 | 71000 | 3.5446          | 0.3723   |
| 3.2603        | 20.9975 | 72000 | 3.5390          | 0.3727   |
| 3.207         | 21.2891 | 73000 | 3.5593          | 0.3719   |
| 3.2368        | 21.5807 | 74000 | 3.5521          | 0.3722   |
| 3.2494        | 21.8724 | 75000 | 3.5417          | 0.3730   |
| 3.1753        | 22.1639 | 76000 | 3.5601          | 0.3718   |
| 3.2161        | 22.4556 | 77000 | 3.5512          | 0.3725   |
| 3.227         | 22.7473 | 78000 | 3.5419          | 0.3729   |
| 3.1477        | 23.0388 | 79000 | 3.5599          | 0.3723   |
| 3.1914        | 23.3305 | 80000 | 3.5553          | 0.3725   |
| 3.2168        | 23.6222 | 81000 | 3.5480          | 0.3728   |
| 3.2271        | 23.9138 | 82000 | 3.5429          | 0.3731   |
| 3.1566        | 24.2053 | 83000 | 3.5582          | 0.3724   |
| 3.2024        | 24.4970 | 84000 | 3.5524          | 0.3726   |
| 3.2178        | 24.7887 | 85000 | 3.5462          | 0.3732   |
| 3.1374        | 25.0802 | 86000 | 3.5614          | 0.3724   |
| 3.1895        | 25.3719 | 87000 | 3.5549          | 0.3729   |
| 3.1943        | 25.6636 | 88000 | 3.5485          | 0.3732   |
| 3.2212        | 25.9553 | 89000 | 3.5405          | 0.3736   |
| 3.1513        | 26.2468 | 90000 | 3.5602          | 0.3727   |
| 3.1832        | 26.5384 | 91000 | 3.5531          | 0.3733   |
| 3.2104        | 26.8301 | 92000 | 3.5431          | 0.3737   |
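
For interpretation: assuming the validation loss is mean next-token cross-entropy in nats (the usual Trainer convention, not stated explicitly on this card), it converts to perplexity via exp(loss), so the headline loss of 3.5645 corresponds to a perplexity of roughly 35.3 and the table's best loss of 3.5390 (step 72000) to roughly 34.4. Note also that the log ends near epoch 26.8 of the scheduled 50, so the run appears to have stopped before the full epoch budget.

```python
# Convert reported losses to perplexity, assuming they are mean next-token
# cross-entropy in nats (the usual Trainer convention; an assumption here).
import math

headline_loss = 3.5645  # evaluation loss reported at the top of the card
best_val_loss = 3.5390  # lowest validation loss in the table (step 72000)

print(f"perplexity (headline): {math.exp(headline_loss):.1f}")  # ~35.3
print(f"perplexity (best):     {math.exp(best_val_loss):.1f}")  # ~34.4
```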

Framework versions

  • Transformers 4.55.2
  • PyTorch 2.8.0+cu128
  • Datasets 4.0.0
  • Tokenizers 0.21.4

Model details

  • Model size: 0.1B params
  • Tensor type: F32 (safetensors)