exceptions_exp2_swap_0.7_cost_to_push_40817

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.5674
  • Accuracy: 0.3684
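
For reference, if this loss is the usual per-token cross-entropy of a language model (an assumption; the card does not say), it corresponds to a perplexity of exp(3.5674) ≈ 35.4.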

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent configuration sketch follows the list):

  • learning_rate: 0.0006
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 40817
  • gradient_accumulation_steps: 5
  • total_train_batch_size: 80
  • optimizer: adamw_torch_fused (PyTorch's fused AdamW implementation) with betas=(0.9, 0.98), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 50.0
  • mixed_precision_training: Native AMP
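
As a rough sketch, these settings correspond to a Hugging Face `TrainingArguments` configuration like the one below. The output directory is hypothetical, `fp16=True` is an assumption based on "Native AMP", and the total train batch size of 80 is taken to come from a single device (16 × 5 gradient accumulation steps):

```python
from transformers import TrainingArguments

# A minimal sketch reproducing the reported hyperparameters.
# output_dir is hypothetical; fp16=True is an assumption ("Native AMP"
# could also have meant bf16 on newer GPUs).
training_args = TrainingArguments(
    output_dir="exceptions_exp2_swap_0.7_cost_to_push_40817",
    learning_rate=6e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=40817,
    gradient_accumulation_steps=5,  # effective train batch: 16 * 5 = 80
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=50.0,
    fp16=True,                      # "Native AMP" mixed precision
)
```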

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:--------:|
| 4.8317 | 0.2917 | 1000 | 4.7685 | 0.2521 |
| 4.3567 | 0.5834 | 2000 | 4.2898 | 0.2977 |
| 4.1606 | 0.8750 | 3000 | 4.1113 | 0.3140 |
| 4.0129 | 1.1665 | 4000 | 3.9987 | 0.3240 |
| 3.9405 | 1.4582 | 5000 | 3.9221 | 0.3309 |
| 3.8841 | 1.7499 | 6000 | 3.8658 | 0.3359 |
| 3.7634 | 2.0414 | 7000 | 3.8191 | 0.3401 |
| 3.7623 | 2.3331 | 8000 | 3.7927 | 0.3433 |
| 3.7502 | 2.6248 | 9000 | 3.7612 | 0.3459 |
| 3.7213 | 2.9165 | 10000 | 3.7356 | 0.3486 |
| 3.6448 | 3.2080 | 11000 | 3.7245 | 0.3500 |
| 3.647 | 3.4996 | 12000 | 3.7068 | 0.3520 |
| 3.6607 | 3.7913 | 13000 | 3.6865 | 0.3537 |
| 3.5566 | 4.0828 | 14000 | 3.6816 | 0.3546 |
| 3.5605 | 4.3745 | 15000 | 3.6713 | 0.3559 |
| 3.5792 | 4.6662 | 16000 | 3.6548 | 0.3572 |
| 3.5696 | 4.9579 | 17000 | 3.6423 | 0.3584 |
| 3.5114 | 5.2494 | 18000 | 3.6454 | 0.3586 |
| 3.5205 | 5.5411 | 19000 | 3.6344 | 0.3596 |
| 3.5466 | 5.8327 | 20000 | 3.6221 | 0.3605 |
| 3.4599 | 6.1243 | 21000 | 3.6298 | 0.3610 |
| 3.4778 | 6.4159 | 22000 | 3.6188 | 0.3618 |
| 3.4998 | 6.7076 | 23000 | 3.6094 | 0.3622 |
| 3.5083 | 6.9993 | 24000 | 3.5975 | 0.3632 |
| 3.4246 | 7.2908 | 25000 | 3.6066 | 0.3633 |
| 3.4786 | 7.5825 | 26000 | 3.6000 | 0.3640 |
| 3.4675 | 7.8742 | 27000 | 3.5897 | 0.3647 |
| 3.3767 | 8.1657 | 28000 | 3.5985 | 0.3644 |
| 3.4312 | 8.4574 | 29000 | 3.5922 | 0.3650 |
| 3.442 | 8.7490 | 30000 | 3.5836 | 0.3658 |
| 3.3449 | 9.0405 | 31000 | 3.5919 | 0.3653 |
| 3.3924 | 9.3322 | 32000 | 3.5859 | 0.3661 |
| 3.4126 | 9.6239 | 33000 | 3.5795 | 0.3666 |
| 3.4162 | 9.9156 | 34000 | 3.5690 | 0.3673 |
| 3.3491 | 10.2071 | 35000 | 3.5839 | 0.3665 |
| 3.3755 | 10.4988 | 36000 | 3.5739 | 0.3672 |
| 3.397 | 10.7905 | 37000 | 3.5678 | 0.3680 |
| 3.3174 | 11.0820 | 38000 | 3.5801 | 0.3672 |
| 3.3318 | 11.3736 | 39000 | 3.5742 | 0.3674 |
| 3.3597 | 11.6653 | 40000 | 3.5674 | 0.3684 |
| 3.3895 | 11.9570 | 41000 | 3.5598 | 0.3689 |
| 3.334 | 12.2485 | 42000 | 3.5707 | 0.3684 |
| 3.3507 | 12.5402 | 43000 | 3.5640 | 0.3690 |
| 3.3587 | 12.8319 | 44000 | 3.5582 | 0.3692 |
| 3.2946 | 13.1234 | 45000 | 3.5682 | 0.3690 |
| 3.3207 | 13.4151 | 46000 | 3.5656 | 0.3692 |
| 3.3373 | 13.7067 | 47000 | 3.5574 | 0.3697 |
| 3.3516 | 13.9984 | 48000 | 3.5520 | 0.3703 |
| 3.301 | 14.2899 | 49000 | 3.5653 | 0.3693 |
| 3.3261 | 14.5816 | 50000 | 3.5611 | 0.3699 |
| 3.3368 | 14.8733 | 51000 | 3.5524 | 0.3705 |
| 3.2546 | 15.1648 | 52000 | 3.5666 | 0.3697 |
| 3.2967 | 15.4565 | 53000 | 3.5616 | 0.3702 |
| 3.3185 | 15.7482 | 54000 | 3.5524 | 0.3704 |
| 3.2039 | 16.0397 | 55000 | 3.5613 | 0.3704 |
| 3.2676 | 16.3313 | 56000 | 3.5596 | 0.3705 |
| 3.2911 | 16.6230 | 57000 | 3.5556 | 0.3706 |
| 3.3108 | 16.9147 | 58000 | 3.5478 | 0.3711 |
| 3.2358 | 17.2062 | 59000 | 3.5619 | 0.3707 |
| 3.2703 | 17.4979 | 60000 | 3.5540 | 0.3712 |
| 3.2906 | 17.7896 | 61000 | 3.5512 | 0.3713 |
| 3.1964 | 18.0811 | 62000 | 3.5649 | 0.3708 |
| 3.2472 | 18.3728 | 63000 | 3.5604 | 0.3710 |
| 3.2495 | 18.6644 | 64000 | 3.5500 | 0.3717 |
| 3.2898 | 18.9561 | 65000 | 3.5430 | 0.3720 |
| 3.2198 | 19.2476 | 66000 | 3.5600 | 0.3713 |
| 3.2476 | 19.5393 | 67000 | 3.5538 | 0.3715 |
| 3.2666 | 19.8310 | 68000 | 3.5440 | 0.3721 |
| 3.1929 | 20.1225 | 69000 | 3.5635 | 0.3714 |
| 3.2104 | 20.4142 | 70000 | 3.5590 | 0.3713 |
| 3.2562 | 20.7059 | 71000 | 3.5455 | 0.3720 |
| 3.2735 | 20.9975 | 72000 | 3.5410 | 0.3727 |
| 3.2084 | 21.2891 | 73000 | 3.5614 | 0.3716 |
| 3.2305 | 21.5807 | 74000 | 3.5505 | 0.3722 |
| 3.2621 | 21.8724 | 75000 | 3.5427 | 0.3725 |
| 3.1855 | 22.1639 | 76000 | 3.5604 | 0.3720 |
| 3.218 | 22.4556 | 77000 | 3.5500 | 0.3723 |
| 3.2207 | 22.7473 | 78000 | 3.5457 | 0.3730 |
| 3.1523 | 23.0388 | 79000 | 3.5590 | 0.3721 |
| 3.2082 | 23.3305 | 80000 | 3.5572 | 0.3722 |
| 3.2118 | 23.6222 | 81000 | 3.5476 | 0.3727 |
| 3.219 | 23.9138 | 82000 | 3.5416 | 0.3732 |
| 3.1723 | 24.2053 | 83000 | 3.5599 | 0.3722 |
| 3.1917 | 24.4970 | 84000 | 3.5534 | 0.3730 |
| 3.2127 | 24.7887 | 85000 | 3.5454 | 0.3731 |
| 3.1432 | 25.0802 | 86000 | 3.5624 | 0.3723 |
| 3.1864 | 25.3719 | 87000 | 3.5579 | 0.3726 |
| 3.2011 | 25.6636 | 88000 | 3.5502 | 0.3730 |
| 3.2208 | 25.9553 | 89000 | 3.5391 | 0.3736 |
| 3.1548 | 26.2468 | 90000 | 3.5585 | 0.3727 |
| 3.1786 | 26.5384 | 91000 | 3.5479 | 0.3731 |
| 3.1971 | 26.8301 | 92000 | 3.5435 | 0.3734 |
| 3.1445 | 27.1216 | 93000 | 3.5622 | 0.3724 |
| 3.1676 | 27.4133 | 94000 | 3.5564 | 0.3731 |
| 3.1797 | 27.7050 | 95000 | 3.5484 | 0.3734 |
| 3.2029 | 27.9967 | 96000 | 3.5410 | 0.3735 |
| 3.1359 | 28.2882 | 97000 | 3.5566 | 0.3731 |
| 3.1758 | 28.5799 | 98000 | 3.5510 | 0.3733 |
| 3.1902 | 28.8715 | 99000 | 3.5432 | 0.3737 |
| 3.1161 | 29.1630 | 100000 | 3.5599 | 0.3728 |
| 3.1439 | 29.4547 | 101000 | 3.5518 | 0.3734 |
| 3.1657 | 29.7464 | 102000 | 3.5451 | 0.3738 |
| 3.0772 | 30.0379 | 103000 | 3.5603 | 0.3732 |
| 3.1417 | 30.3296 | 104000 | 3.5578 | 0.3732 |
| 3.1599 | 30.6213 | 105000 | 3.5503 | 0.3738 |
| 3.1691 | 30.9130 | 106000 | 3.5438 | 0.3740 |
| 3.0977 | 31.2045 | 107000 | 3.5578 | 0.3731 |
| 3.1382 | 31.4961 | 108000 | 3.5545 | 0.3737 |
| 3.1528 | 31.7878 | 109000 | 3.5468 | 0.3739 |
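
The card does not define the Accuracy column. For a causal language model it is typically next-token prediction accuracy over non-padded positions, which a user-supplied Trainer metric hook along the following lines would compute (this function is an illustrative assumption, not part of this repository):

```python
import numpy as np

def compute_metrics(eval_pred):
    """Hypothetical metric hook, assuming "Accuracy" means next-token
    prediction accuracy for a causal language model."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Shift so the prediction at position i is scored against token i+1.
    preds, labels = preds[:, :-1], labels[:, 1:]
    mask = labels != -100  # skip padding / ignored positions
    return {"accuracy": float((preds[mask] == labels[mask]).mean())}
```

(In practice the argmax is often taken in `preprocess_logits_for_metrics` to avoid accumulating full logits in memory.)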

Framework versions

  • Transformers 4.55.2
  • Pytorch 2.8.0+cu128
  • Datasets 4.0.0
  • Tokenizers 0.21.4
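
Assuming the checkpoint is hosted on the Hub under the name above and is a causal language model (both assumptions; the card does not state the architecture), it would be loaded in the usual way:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id: "<user>" is a placeholder, and AutoModelForCausalLM
# assumes a causal LM architecture, which the card does not confirm.
repo_id = "<user>/exceptions_exp2_swap_0.7_cost_to_push_40817"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)
```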
Model weights

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32