# ARC-Challenge_Llama-3.2-1B-hifbxhfd
This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unspecified dataset (presumably ARC-Challenge, given the model name). It achieves the following results on the evaluation set (a short sketch of how the headline numbers relate follows the list):
- Loss: 4.9794
- Model Preparation Time: 0.0063
- Mdl: 2147.9346
- Accumulated Loss: 1488.8348
- Correct Preds: 120.0
- Total Preds: 299.0
- Accuracy: 0.4013
- Correct Gen Preds: 116.0
- Gen Accuracy: 0.3880
- Correct Gen Preds 32: 15.0
- Correct Preds 32: 15.0
- Total Labels 32: 64.0
- Accuracy 32: 0.2344
- Gen Accuracy 32: 0.2344
- Correct Gen Preds 33: 41.0
- Correct Preds 33: 43.0
- Total Labels 33: 73.0
- Accuracy 33: 0.5890
- Gen Accuracy 33: 0.5616
- Correct Gen Preds 34: 31.0
- Correct Preds 34: 31.0
- Total Labels 34: 78.0
- Accuracy 34: 0.3974
- Gen Accuracy 34: 0.3974
- Correct Gen Preds 35: 29.0
- Correct Preds 35: 31.0
- Total Labels 35: 83.0
- Accuracy 35: 0.3735
- Gen Accuracy 35: 0.3494
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 1.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
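The headline metrics are related by simple arithmetic: accuracy is correct predictions over total predictions, the accumulated loss is the mean evaluation loss summed over all 299 examples, and the MDL figure is that accumulated loss converted from nats to bits. The per-label suffixes (32–36) appear to be internal label ids for the five answer choices; note that label 36 covers only a single example. A minimal sketch of the arithmetic, using the values reported above (small discrepancies come from rounding in the reported loss):

```python
import math

# Figures reported above for the final checkpoint (epoch 6).
eval_loss = 4.9794          # mean per-example loss, in nats
total_preds = 299
correct_preds = 120
correct_gen_preds = 116

accuracy = correct_preds / total_preds            # ~0.4013
gen_accuracy = correct_gen_preds / total_preds    # ~0.3880

# Accumulated loss is the per-example loss summed over all examples,
# and MDL is that total converted from nats to bits.
accumulated_loss = eval_loss * total_preds        # ~1488.8 nats
mdl_bits = accumulated_loss / math.log(2)         # ~2147.9 bits

print(round(accuracy, 4), round(gen_accuracy, 4))
print(round(accumulated_loss, 1), round(mdl_bits, 1))
```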
## Model description
More information needed
## Intended uses & limitations
More information needed
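No usage guidance is provided. As a starting point, the checkpoint can be loaded like any other causal language model with the transformers library. The sketch below assumes the repository id matches the model name; the prompt and answer-scoring format used during evaluation are not documented, so the example prompt is purely illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository id assumed from the model name; adjust if the model lives elsewhere.
model_id = "donoway/ARC-Challenge_Llama-3.2-1B-hifbxhfd"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Hypothetical multiple-choice style prompt; the training prompt format is not documented.
prompt = "Question: Which gas do plants absorb from the atmosphere?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```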
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
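For reproduction, these hyperparameters map onto a transformers TrainingArguments configuration roughly as follows. This is a sketch only: the dataset, data collator, precision, and any settings not listed above are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; output_dir and anything not listed
# in the card (precision, gradient accumulation, etc.) are assumptions.
training_args = TrainingArguments(
    output_dir="ARC-Challenge_Llama-3.2-1B-hifbxhfd",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
)
```

Note that although num_epochs is 100, the results table below stops at epoch 36, which suggests the run ended early (for example via early stopping on an evaluation metric); this is not documented.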
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.6389 | 0.0063 | 706.9523 | 490.0220 | 66.0 | 299.0 | 0.2207 | 66.0 | 0.2207 | 62.0 | 62.0 | 64.0 | 0.9688 | 0.9688 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 4.0 | 4.0 | 78.0 | 0.0513 | 0.0513 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.4701 | 1.0 | 3 | 1.5924 | 0.0063 | 686.8892 | 476.1153 | 76.0 | 299.0 | 0.2542 | 75.0 | 0.2508 | 7.0 | 7.0 | 64.0 | 0.1094 | 0.1094 | 68.0 | 69.0 | 73.0 | 0.9452 | 0.9315 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.4771 | 2.0 | 6 | 1.4077 | 0.0063 | 607.2453 | 420.9104 | 77.0 | 299.0 | 0.2575 | 74.0 | 0.2475 | 36.0 | 37.0 | 64.0 | 0.5781 | 0.5625 | 7.0 | 7.0 | 73.0 | 0.0959 | 0.0959 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 31.0 | 33.0 | 83.0 | 0.3976 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.0843 | 3.0 | 9 | 1.4049 | 0.0063 | 606.0456 | 420.0788 | 99.0 | 299.0 | 0.3311 | 98.0 | 0.3278 | 24.0 | 25.0 | 64.0 | 0.3906 | 0.375 | 40.0 | 40.0 | 73.0 | 0.5479 | 0.5479 | 21.0 | 21.0 | 78.0 | 0.2692 | 0.2692 | 13.0 | 13.0 | 83.0 | 0.1566 | 0.1566 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.4069 | 4.0 | 12 | 2.1331 | 0.0063 | 920.1642 | 637.8092 | 114.0 | 299.0 | 0.3813 | 105.0 | 0.3512 | 24.0 | 26.0 | 64.0 | 0.4062 | 0.375 | 24.0 | 26.0 | 73.0 | 0.3562 | 0.3288 | 22.0 | 25.0 | 78.0 | 0.3205 | 0.2821 | 35.0 | 37.0 | 83.0 | 0.4458 | 0.4217 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0958 | 5.0 | 15 | 3.1166 | 0.0063 | 1344.3795 | 931.8528 | 106.0 | 299.0 | 0.3545 | 91.0 | 0.3043 | 11.0 | 15.0 | 64.0 | 0.2344 | 0.1719 | 28.0 | 30.0 | 73.0 | 0.4110 | 0.3836 | 18.0 | 22.0 | 78.0 | 0.2821 | 0.2308 | 34.0 | 39.0 | 83.0 | 0.4699 | 0.4096 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0596 | 6.0 | 18 | 4.9794 | 0.0063 | 2147.9346 | 1488.8348 | 120.0 | 299.0 | 0.4013 | 116.0 | 0.3880 | 15.0 | 15.0 | 64.0 | 0.2344 | 0.2344 | 41.0 | 43.0 | 73.0 | 0.5890 | 0.5616 | 31.0 | 31.0 | 78.0 | 0.3974 | 0.3974 | 29.0 | 31.0 | 83.0 | 0.3735 | 0.3494 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0006 | 7.0 | 21 | 5.7281 | 0.0063 | 2470.9158 | 1712.7083 | 118.0 | 299.0 | 0.3946 | 116.0 | 0.3880 | 20.0 | 20.0 | 64.0 | 0.3125 | 0.3125 | 35.0 | 36.0 | 73.0 | 0.4932 | 0.4795 | 31.0 | 31.0 | 78.0 | 0.3974 | 0.3974 | 30.0 | 31.0 | 83.0 | 0.3735 | 0.3614 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0001 | 8.0 | 24 | 6.5268 | 0.0063 | 2815.4597 | 1951.5280 | 116.0 | 299.0 | 0.3880 | 115.0 | 0.3846 | 25.0 | 25.0 | 64.0 | 0.3906 | 0.3906 | 31.0 | 31.0 | 73.0 | 0.4247 | 0.4247 | 28.0 | 28.0 | 78.0 | 0.3590 | 0.3590 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 27 | 6.9695 | 0.0063 | 3006.4105 | 2083.8849 | 113.0 | 299.0 | 0.3779 | 112.0 | 0.3746 | 27.0 | 27.0 | 64.0 | 0.4219 | 0.4219 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 30 | 7.2697 | 0.0063 | 3135.8946 | 2173.6365 | 113.0 | 299.0 | 0.3779 | 112.0 | 0.3746 | 28.0 | 28.0 | 64.0 | 0.4375 | 0.4375 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 33 | 7.4778 | 0.0063 | 3225.6685 | 2235.8630 | 113.0 | 299.0 | 0.3779 | 112.0 | 0.3746 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 36 | 7.5700 | 0.0063 | 3265.4444 | 2263.4336 | 113.0 | 299.0 | 0.3779 | 113.0 | 0.3779 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 39 | 7.6337 | 0.0063 | 3292.8992 | 2282.4638 | 115.0 | 299.0 | 0.3846 | 115.0 | 0.3846 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 42 | 7.7177 | 0.0063 | 3329.1652 | 2307.6014 | 115.0 | 299.0 | 0.3846 | 114.0 | 0.3813 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 45 | 7.7144 | 0.0063 | 3327.7168 | 2306.5975 | 113.0 | 299.0 | 0.3779 | 112.0 | 0.3746 | 28.0 | 28.0 | 64.0 | 0.4375 | 0.4375 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 48 | 7.7456 | 0.0063 | 3341.2047 | 2315.9466 | 114.0 | 299.0 | 0.3813 | 114.0 | 0.3813 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 51 | 7.7246 | 0.0063 | 3332.1346 | 2309.6597 | 113.0 | 299.0 | 0.3779 | 112.0 | 0.3746 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 54 | 7.7910 | 0.0063 | 3360.7844 | 2329.5182 | 113.0 | 299.0 | 0.3779 | 112.0 | 0.3746 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 24.0 | 24.0 | 78.0 | 0.3077 | 0.3077 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 57 | 7.8124 | 0.0063 | 3369.9946 | 2335.9022 | 114.0 | 299.0 | 0.3813 | 114.0 | 0.3813 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 60 | 7.7234 | 0.0063 | 3331.5996 | 2309.2889 | 115.0 | 299.0 | 0.3846 | 115.0 | 0.3846 | 30.0 | 30.0 | 64.0 | 0.4688 | 0.4688 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 63 | 7.7679 | 0.0063 | 3350.8072 | 2322.6025 | 115.0 | 299.0 | 0.3846 | 115.0 | 0.3846 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 66 | 7.7526 | 0.0063 | 3344.1986 | 2318.0219 | 115.0 | 299.0 | 0.3846 | 115.0 | 0.3846 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 69 | 7.7910 | 0.0063 | 3360.7764 | 2329.5127 | 112.0 | 299.0 | 0.3746 | 112.0 | 0.3746 | 28.0 | 28.0 | 64.0 | 0.4375 | 0.4375 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 72 | 7.7183 | 0.0063 | 3329.3986 | 2307.7632 | 115.0 | 299.0 | 0.3846 | 115.0 | 0.3846 | 30.0 | 30.0 | 64.0 | 0.4688 | 0.4688 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 75 | 7.7304 | 0.0063 | 3334.6225 | 2311.3842 | 114.0 | 299.0 | 0.3813 | 114.0 | 0.3813 | 30.0 | 30.0 | 64.0 | 0.4688 | 0.4688 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 78 | 7.7551 | 0.0063 | 3345.2652 | 2318.7612 | 114.0 | 299.0 | 0.3813 | 113.0 | 0.3779 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 81 | 7.7737 | 0.0063 | 3353.3180 | 2324.3429 | 116.0 | 299.0 | 0.3880 | 115.0 | 0.3846 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 84 | 7.7507 | 0.0063 | 3343.3752 | 2317.4511 | 116.0 | 299.0 | 0.3880 | 115.0 | 0.3846 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 87 | 7.7632 | 0.0063 | 3348.7890 | 2321.2037 | 116.0 | 299.0 | 0.3880 | 115.0 | 0.3846 | 30.0 | 30.0 | 64.0 | 0.4688 | 0.4688 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 90 | 7.7401 | 0.0063 | 3338.8227 | 2314.2955 | 116.0 | 299.0 | 0.3880 | 115.0 | 0.3846 | 30.0 | 30.0 | 64.0 | 0.4688 | 0.4688 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 93 | 7.7578 | 0.0063 | 3346.4502 | 2319.5825 | 113.0 | 299.0 | 0.3779 | 113.0 | 0.3779 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 96 | 7.7958 | 0.0063 | 3362.8384 | 2330.9420 | 112.0 | 299.0 | 0.3746 | 112.0 | 0.3746 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 99 | 7.7635 | 0.0063 | 3348.8919 | 2321.2750 | 114.0 | 299.0 | 0.3813 | 114.0 | 0.3813 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 102 | 7.7796 | 0.0063 | 3355.8332 | 2326.0863 | 114.0 | 299.0 | 0.3813 | 113.0 | 0.3779 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 32.0 | 83.0 | 0.3855 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 105 | 7.7913 | 0.0063 | 3360.8969 | 2329.5962 | 113.0 | 299.0 | 0.3779 | 113.0 | 0.3779 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 25.0 | 25.0 | 78.0 | 0.3205 | 0.3205 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 108 | 7.7978 | 0.0063 | 3363.7162 | 2331.5504 | 112.0 | 299.0 | 0.3746 | 112.0 | 0.3746 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 24.0 | 24.0 | 78.0 | 0.3077 | 0.3077 | 31.0 | 31.0 | 83.0 | 0.3735 | 0.3735 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1