# ARC-Challenge_Llama-3.2-1B-zfgomj43
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unknown dataset (the model name suggests ARC-Challenge). It achieves the following results on the evaluation set:
- Loss: 1.5531
- Model Preparation Time: 0.0057
- Mdl: 669.9700
- Accumulated Loss: 464.3878
- Correct Preds: 132.0
- Total Preds: 299.0
- Accuracy: 0.4415
- Correct Gen Preds: 114.0
- Gen Accuracy: 0.3813
- Correct Gen Preds 32: 12.0
- Correct Preds 32: 20.0
- Total Labels 32: 64.0
- Accuracy 32: 0.3125
- Gen Accuracy 32: 0.1875
- Correct Gen Preds 33: 23.0
- Correct Preds 33: 24.0
- Total Labels 33: 73.0
- Accuracy 33: 0.3288
- Gen Accuracy 33: 0.3151
- Correct Gen Preds 34: 42.0
- Correct Preds 34: 47.0
- Total Labels 34: 78.0
- Accuracy 34: 0.6026
- Gen Accuracy 34: 0.5385
- Correct Gen Preds 35: 37.0
- Correct Preds 35: 41.0
- Total Labels 35: 83.0
- Accuracy 35: 0.4940
- Gen Accuracy 35: 0.4458
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 1.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
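
The per-class suffixes (32–36) appear to index the answer-choice labels, since their `Total Labels` counts sum to the 299 evaluation examples; this is an inference from the numbers, not documented in the training code. The headline metrics are also internally consistent: accuracy is correct over total predictions, the accumulated loss is roughly the mean loss times the number of predictions, and the MDL value is that accumulated loss converted from nats to bits. A minimal sketch of those checks (variable names are illustrative only):

```python
import math

# Reported evaluation metrics (taken from the list above)
loss = 1.5531            # mean eval loss, in nats per example
total_preds = 299
correct_preds = 132

# Accuracy = correct predictions / total predictions
accuracy = correct_preds / total_preds
print(round(accuracy, 4))            # 0.4415

# Accumulated loss ~= mean loss * number of predictions
accumulated_loss = loss * total_preds
print(round(accumulated_loss, 1))    # ~464.4 (reported: 464.3878)

# MDL in bits ~= accumulated loss in nats / ln(2)
mdl = accumulated_loss / math.log(2)
print(round(mdl, 1))                 # ~670.0 (reported: 669.9700)
```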
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
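
As a reference, a hedged `TrainingArguments` sketch matching the values above (argument names follow the Transformers 4.51 API; the actual training script is not included in this repository, and unlisted settings such as `output_dir` are placeholders):

```python
from transformers import TrainingArguments

# Approximate reconstruction of the listed hyperparameters.
# The card reports a total train batch size of 64; mapping it to
# per_device_train_batch_size assumes a single device.
training_args = TrainingArguments(
    output_dir="ARC-Challenge_Llama-3.2-1B-zfgomj43",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
)
```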
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.6389 | 0.0057 | 706.9523 | 490.0220 | 66.0 | 299.0 | 0.2207 | 66.0 | 0.2207 | 62.0 | 62.0 | 64.0 | 0.9688 | 0.9688 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 4.0 | 4.0 | 78.0 | 0.0513 | 0.0513 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.4865 | 1.0 | 6 | 1.4608 | 0.0057 | 630.1492 | 436.7861 | 79.0 | 299.0 | 0.2642 | 77.0 | 0.2575 | 56.0 | 57.0 | 64.0 | 0.8906 | 0.875 | 11.0 | 12.0 | 73.0 | 0.1644 | 0.1507 | 1.0 | 1.0 | 78.0 | 0.0128 | 0.0128 | 9.0 | 9.0 | 83.0 | 0.1084 | 0.1084 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.0972 | 2.0 | 12 | 1.4324 | 0.0057 | 617.8780 | 428.2804 | 120.0 | 299.0 | 0.4013 | 119.0 | 0.3980 | 36.0 | 36.0 | 64.0 | 0.5625 | 0.5625 | 24.0 | 24.0 | 73.0 | 0.3288 | 0.3288 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 34.0 | 35.0 | 83.0 | 0.4217 | 0.4096 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.4836 | 3.0 | 18 | 1.5531 | 0.0057 | 669.9700 | 464.3878 | 132.0 | 299.0 | 0.4415 | 114.0 | 0.3813 | 12.0 | 20.0 | 64.0 | 0.3125 | 0.1875 | 23.0 | 24.0 | 73.0 | 0.3288 | 0.3151 | 42.0 | 47.0 | 78.0 | 0.6026 | 0.5385 | 37.0 | 41.0 | 83.0 | 0.4940 | 0.4458 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.1277 | 4.0 | 24 | 2.8384 | 0.0057 | 1224.4029 | 848.6914 | 125.0 | 299.0 | 0.4181 | 116.0 | 0.3880 | 18.0 | 23.0 | 64.0 | 0.3594 | 0.2812 | 28.0 | 30.0 | 73.0 | 0.4110 | 0.3836 | 35.0 | 35.0 | 78.0 | 0.4487 | 0.4487 | 35.0 | 37.0 | 83.0 | 0.4458 | 0.4217 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0009 | 5.0 | 30 | 4.5527 | 0.0057 | 1963.8919 | 1361.2661 | 121.0 | 299.0 | 0.4047 | 112.0 | 0.3746 | 19.0 | 24.0 | 64.0 | 0.375 | 0.2969 | 32.0 | 34.0 | 73.0 | 0.4658 | 0.4384 | 35.0 | 35.0 | 78.0 | 0.4487 | 0.4487 | 26.0 | 28.0 | 83.0 | 0.3373 | 0.3133 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0005 | 6.0 | 36 | 7.0651 | 0.0057 | 3047.6390 | 2112.4624 | 120.0 | 299.0 | 0.4013 | 115.0 | 0.3846 | 25.0 | 29.0 | 64.0 | 0.4531 | 0.3906 | 33.0 | 33.0 | 73.0 | 0.4521 | 0.4521 | 32.0 | 32.0 | 78.0 | 0.4103 | 0.4103 | 25.0 | 26.0 | 83.0 | 0.3133 | 0.3012 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 42 | 7.6423 | 0.0057 | 3296.6319 | 2285.0511 | 127.0 | 299.0 | 0.4247 | 118.0 | 0.3946 | 26.0 | 29.0 | 64.0 | 0.4531 | 0.4062 | 34.0 | 34.0 | 73.0 | 0.4658 | 0.4658 | 28.0 | 31.0 | 78.0 | 0.3974 | 0.3590 | 30.0 | 33.0 | 83.0 | 0.3976 | 0.3614 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 48 | 7.8574 | 0.0057 | 3389.4254 | 2349.3707 | 127.0 | 299.0 | 0.4247 | 117.0 | 0.3913 | 19.0 | 23.0 | 64.0 | 0.3594 | 0.2969 | 34.0 | 36.0 | 73.0 | 0.4932 | 0.4658 | 30.0 | 31.0 | 78.0 | 0.3974 | 0.3846 | 34.0 | 36.0 | 83.0 | 0.4337 | 0.4096 | 0.0 | 1.0 | 1.0 | 1.0 | 0.0 |
| 0.0 | 9.0 | 54 | 7.5145 | 0.0057 | 3241.4943 | 2246.8326 | 126.0 | 299.0 | 0.4214 | 111.0 | 0.3712 | 18.0 | 25.0 | 64.0 | 0.3906 | 0.2812 | 36.0 | 38.0 | 73.0 | 0.5205 | 0.4932 | 28.0 | 30.0 | 78.0 | 0.3846 | 0.3590 | 29.0 | 32.0 | 83.0 | 0.3855 | 0.3494 | 0.0 | 1.0 | 1.0 | 1.0 | 0.0 |
| 0.0 | 10.0 | 60 | 7.0254 | 0.0057 | 3030.5166 | 2100.5941 | 127.0 | 299.0 | 0.4247 | 98.0 | 0.3278 | 14.0 | 30.0 | 64.0 | 0.4688 | 0.2188 | 32.0 | 37.0 | 73.0 | 0.5068 | 0.4384 | 28.0 | 33.0 | 78.0 | 0.4231 | 0.3590 | 24.0 | 27.0 | 83.0 | 0.3253 | 0.2892 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 66 | 7.2467 | 0.0057 | 3125.9883 | 2166.7700 | 127.0 | 299.0 | 0.4247 | 113.0 | 0.3779 | 17.0 | 26.0 | 64.0 | 0.4062 | 0.2656 | 37.0 | 39.0 | 73.0 | 0.5342 | 0.5068 | 34.0 | 36.0 | 78.0 | 0.4615 | 0.4359 | 25.0 | 26.0 | 83.0 | 0.3133 | 0.3012 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 72 | 7.4089 | 0.0057 | 3195.9474 | 2215.2619 | 123.0 | 299.0 | 0.4114 | 116.0 | 0.3880 | 20.0 | 24.0 | 64.0 | 0.375 | 0.3125 | 35.0 | 37.0 | 73.0 | 0.5068 | 0.4795 | 33.0 | 34.0 | 78.0 | 0.4359 | 0.4231 | 28.0 | 28.0 | 83.0 | 0.3373 | 0.3373 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 78 | 7.2529 | 0.0057 | 3128.6606 | 2168.6223 | 126.0 | 299.0 | 0.4214 | 121.0 | 0.4047 | 21.0 | 24.0 | 64.0 | 0.375 | 0.3281 | 37.0 | 38.0 | 73.0 | 0.5205 | 0.5068 | 37.0 | 38.0 | 78.0 | 0.4872 | 0.4744 | 26.0 | 26.0 | 83.0 | 0.3133 | 0.3133 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 84 | 7.2595 | 0.0057 | 3131.5101 | 2170.5974 | 127.0 | 299.0 | 0.4247 | 120.0 | 0.4013 | 21.0 | 24.0 | 64.0 | 0.375 | 0.3281 | 38.0 | 40.0 | 73.0 | 0.5479 | 0.5205 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 26.0 | 26.0 | 83.0 | 0.3133 | 0.3133 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 90 | 7.2704 | 0.0057 | 3136.2190 | 2173.8614 | 125.0 | 299.0 | 0.4181 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 39.0 | 73.0 | 0.5342 | 0.5068 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 27.0 | 27.0 | 83.0 | 0.3253 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 96 | 7.2516 | 0.0057 | 3128.0812 | 2168.2206 | 124.0 | 299.0 | 0.4147 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 38.0 | 73.0 | 0.5205 | 0.5068 | 35.0 | 36.0 | 78.0 | 0.4615 | 0.4487 | 27.0 | 28.0 | 83.0 | 0.3373 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 102 | 7.2158 | 0.0057 | 3112.6486 | 2157.5236 | 123.0 | 299.0 | 0.4114 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 37.0 | 73.0 | 0.5068 | 0.5068 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 27.0 | 27.0 | 83.0 | 0.3253 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 108 | 7.2719 | 0.0057 | 3136.8490 | 2174.2980 | 124.0 | 299.0 | 0.4147 | 117.0 | 0.3913 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 36.0 | 37.0 | 73.0 | 0.5068 | 0.4932 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 27.0 | 28.0 | 83.0 | 0.3373 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 114 | 7.3003 | 0.0057 | 3149.1089 | 2182.7960 | 123.0 | 299.0 | 0.4114 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 37.0 | 73.0 | 0.5068 | 0.5068 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 27.0 | 27.0 | 83.0 | 0.3253 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 120 | 7.2762 | 0.0057 | 3138.7201 | 2175.5950 | 125.0 | 299.0 | 0.4181 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 38.0 | 73.0 | 0.5205 | 0.5068 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 27.0 | 28.0 | 83.0 | 0.3373 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 126 | 7.2258 | 0.0057 | 3116.9751 | 2160.5225 | 123.0 | 299.0 | 0.4114 | 119.0 | 0.3980 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 38.0 | 73.0 | 0.5205 | 0.5068 | 35.0 | 35.0 | 78.0 | 0.4487 | 0.4487 | 28.0 | 28.0 | 83.0 | 0.3373 | 0.3373 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 132 | 7.2761 | 0.0057 | 3138.6438 | 2175.5421 | 123.0 | 299.0 | 0.4114 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 38.0 | 73.0 | 0.5205 | 0.5068 | 35.0 | 36.0 | 78.0 | 0.4615 | 0.4487 | 27.0 | 27.0 | 83.0 | 0.3253 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 138 | 7.2421 | 0.0057 | 3123.9891 | 2165.3842 | 125.0 | 299.0 | 0.4181 | 119.0 | 0.3980 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 38.0 | 39.0 | 73.0 | 0.5342 | 0.5205 | 35.0 | 36.0 | 78.0 | 0.4615 | 0.4487 | 27.0 | 28.0 | 83.0 | 0.3373 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 144 | 7.2574 | 0.0057 | 3130.5871 | 2169.9576 | 125.0 | 299.0 | 0.4181 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 36.0 | 38.0 | 73.0 | 0.5205 | 0.4932 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 28.0 | 28.0 | 83.0 | 0.3373 | 0.3373 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 150 | 7.2338 | 0.0057 | 3120.4105 | 2162.9037 | 123.0 | 299.0 | 0.4114 | 117.0 | 0.3913 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 36.0 | 38.0 | 73.0 | 0.5205 | 0.4932 | 34.0 | 35.0 | 78.0 | 0.4487 | 0.4359 | 28.0 | 28.0 | 83.0 | 0.3373 | 0.3373 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 156 | 7.2551 | 0.0057 | 3129.6043 | 2169.2764 | 123.0 | 299.0 | 0.4114 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 38.0 | 73.0 | 0.5205 | 0.5068 | 35.0 | 36.0 | 78.0 | 0.4615 | 0.4487 | 27.0 | 27.0 | 83.0 | 0.3253 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 162 | 7.2376 | 0.0057 | 3122.0548 | 2164.0435 | 125.0 | 299.0 | 0.4181 | 117.0 | 0.3913 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 36.0 | 38.0 | 73.0 | 0.5205 | 0.4932 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 27.0 | 28.0 | 83.0 | 0.3373 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 168 | 7.2457 | 0.0057 | 3125.5609 | 2166.4737 | 125.0 | 299.0 | 0.4181 | 118.0 | 0.3946 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 36.0 | 38.0 | 73.0 | 0.5205 | 0.4932 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 28.0 | 28.0 | 83.0 | 0.3373 | 0.3373 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 174 | 7.2451 | 0.0057 | 3125.2939 | 2166.2886 | 125.0 | 299.0 | 0.4181 | 119.0 | 0.3980 | 20.0 | 24.0 | 64.0 | 0.375 | 0.3125 | 37.0 | 37.0 | 73.0 | 0.5068 | 0.5068 | 35.0 | 37.0 | 78.0 | 0.4744 | 0.4487 | 27.0 | 27.0 | 83.0 | 0.3253 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 180 | 7.2520 | 0.0057 | 3128.2546 | 2168.3409 | 124.0 | 299.0 | 0.4147 | 118.0 | 0.3946 | 20.0 | 23.0 | 64.0 | 0.3594 | 0.3125 | 36.0 | 37.0 | 73.0 | 0.5068 | 0.4932 | 35.0 | 36.0 | 78.0 | 0.4615 | 0.4487 | 27.0 | 28.0 | 83.0 | 0.3373 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 186 | 7.2720 | 0.0057 | 3136.8961 | 2174.3307 | 124.0 | 299.0 | 0.4147 | 117.0 | 0.3913 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 38.0 | 73.0 | 0.5205 | 0.5068 | 34.0 | 36.0 | 78.0 | 0.4615 | 0.4359 | 27.0 | 28.0 | 83.0 | 0.3373 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 192 | 7.2555 | 0.0057 | 3129.7943 | 2169.4081 | 123.0 | 299.0 | 0.4114 | 117.0 | 0.3913 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 36.0 | 37.0 | 73.0 | 0.5068 | 0.4932 | 35.0 | 36.0 | 78.0 | 0.4615 | 0.4487 | 27.0 | 28.0 | 83.0 | 0.3373 | 0.3253 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 198 | 7.2496 | 0.0057 | 3127.2402 | 2167.6377 | 124.0 | 299.0 | 0.4147 | 119.0 | 0.3980 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 37.0 | 38.0 | 73.0 | 0.5205 | 0.5068 | 35.0 | 36.0 | 78.0 | 0.4615 | 0.4487 | 28.0 | 28.0 | 83.0 | 0.3373 | 0.3373 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
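
With the versions above installed, a minimal loading sketch for this checkpoint using the standard Transformers API; the exact prompt template used during fine-tuning is not documented here, so the ARC-style prompt below is only an assumption:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "donoway/ARC-Challenge_Llama-3.2-1B-zfgomj43"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Illustrative ARC-style multiple-choice prompt (format is an assumption).
prompt = (
    "Question: Which property of a mineral can be determined just by looking at it?\n"
    "A. luster\nB. mass\nC. weight\nD. hardness\nAnswer:"
)
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=5)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```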