# ARC-Easy_Llama-3.2-1B-qba6fe5a
This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset (the model name suggests ARC-Easy). It achieves the following results on the evaluation set:
- Loss: 2.1998
- Model Preparation Time: 0.006
- Mdl: 1808.9895
- Accumulated Loss: 1253.8960
- Correct Preds: 346.0
- Total Preds: 570.0
- Accuracy: 0.6070
- Correct Gen Preds: 337.0
- Gen Accuracy: 0.5912
Per-label breakdown (label IDs 32–36; label 36 has no examples in the evaluation set):

| Label | Total Labels | Correct Preds | Accuracy | Correct Gen Preds | Gen Accuracy |
|---|---|---|---|---|---|
| 32 | 158 | 131 | 0.8291 | 123 | 0.7785 |
| 33 | 152 | 106 | 0.6974 | 106 | 0.6974 |
| 34 | 142 | 75 | 0.5282 | 74 | 0.5211 |
| 35 | 118 | 34 | 0.2881 | 34 | 0.2881 |
| 36 | 0 | 0 | 0.0 | 0 | 0.0 |
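The headline numbers above are internally consistent: both accuracies are simply correct predictions over total predictions, and the reported `Mdl` matches the accumulated loss converted from nats to bits. A minimal sketch of these relations (assuming the accumulated loss is summed cross-entropy in nats):

```python
import math

# Headline metrics reported above (from the evaluation set).
correct_preds, correct_gen_preds, total_preds = 346, 337, 570
accumulated_loss_nats = 1253.8960  # summed cross-entropy, assumed to be in nats

# Accuracy is correct predictions over total predictions.
accuracy = correct_preds / total_preds          # ~0.6070
gen_accuracy = correct_gen_preds / total_preds  # ~0.5912

# The reported "Mdl" is consistent with the accumulated loss expressed
# in bits (minimum description length): nats / ln(2).
mdl_bits = accumulated_loss_nats / math.log(2)  # ~1808.99

print(f"accuracy={accuracy:.4f} gen_accuracy={gen_accuracy:.4f} mdl={mdl_bits:.4f}")
```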
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.006 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4642 | 1.0 | 1 | 1.5354 | 0.006 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4642 | 2.0 | 2 | 2.4299 | 0.006 | 1998.1608 | 1385.0195 | 210.0 | 570.0 | 0.3684 | 210.0 | 0.3684 | 0.0 | 0.0 | 158.0 | 0.0 | 0.0 | 144.0 | 144.0 | 152.0 | 0.9474 | 0.9474 | 66.0 | 66.0 | 142.0 | 0.4648 | 0.4648 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7757 | 3.0 | 3 | 1.2974 | 0.006 | 1066.9296 | 739.5393 | 185.0 | 570.0 | 0.3246 | 185.0 | 0.3246 | 6.0 | 6.0 | 158.0 | 0.0380 | 0.0380 | 152.0 | 152.0 | 152.0 | 1.0 | 1.0 | 27.0 | 27.0 | 142.0 | 0.1901 | 0.1901 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6892 | 4.0 | 4 | 2.0158 | 0.006 | 1657.6402 | 1148.9886 | 279.0 | 570.0 | 0.4895 | 279.0 | 0.4895 | 148.0 | 148.0 | 158.0 | 0.9367 | 0.9367 | 48.0 | 48.0 | 152.0 | 0.3158 | 0.3158 | 57.0 | 57.0 | 142.0 | 0.4014 | 0.4014 | 26.0 | 26.0 | 118.0 | 0.2203 | 0.2203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1661 | 5.0 | 5 | 2.1998 | 0.006 | 1808.9895 | 1253.8960 | 346.0 | 570.0 | 0.6070 | 337.0 | 0.5912 | 123.0 | 131.0 | 158.0 | 0.8291 | 0.7785 | 106.0 | 106.0 | 152.0 | 0.6974 | 0.6974 | 74.0 | 75.0 | 142.0 | 0.5282 | 0.5211 | 34.0 | 34.0 | 118.0 | 0.2881 | 0.2881 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0079 | 6.0 | 6 | 2.8282 | 0.006 | 2325.6988 | 1612.0516 | 343.0 | 570.0 | 0.6018 | 296.0 | 0.5193 | 84.0 | 123.0 | 158.0 | 0.7785 | 0.5316 | 105.0 | 109.0 | 152.0 | 0.7171 | 0.6908 | 72.0 | 76.0 | 142.0 | 0.5352 | 0.5070 | 35.0 | 35.0 | 118.0 | 0.2966 | 0.2966 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 7.0 | 7 | 3.1565 | 0.006 | 2595.6829 | 1799.1903 | 339.0 | 570.0 | 0.5947 | 264.0 | 0.4632 | 60.0 | 117.0 | 158.0 | 0.7405 | 0.3797 | 104.0 | 111.0 | 152.0 | 0.7303 | 0.6842 | 69.0 | 76.0 | 142.0 | 0.5352 | 0.4859 | 31.0 | 35.0 | 118.0 | 0.2966 | 0.2627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 8 | 3.3429 | 0.006 | 2749.0232 | 1905.4777 | 331.0 | 570.0 | 0.5807 | 236.0 | 0.4140 | 40.0 | 112.0 | 158.0 | 0.7089 | 0.2532 | 101.0 | 110.0 | 152.0 | 0.7237 | 0.6645 | 68.0 | 77.0 | 142.0 | 0.5423 | 0.4789 | 27.0 | 32.0 | 118.0 | 0.2712 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 9 | 3.5286 | 0.006 | 2901.6844 | 2011.2944 | 327.0 | 570.0 | 0.5737 | 228.0 | 0.4 | 41.0 | 110.0 | 158.0 | 0.6962 | 0.2595 | 99.0 | 111.0 | 152.0 | 0.7303 | 0.6513 | 61.0 | 74.0 | 142.0 | 0.5211 | 0.4296 | 27.0 | 32.0 | 118.0 | 0.2712 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 10 | 3.6900 | 0.006 | 3034.4363 | 2103.3110 | 323.0 | 570.0 | 0.5667 | 227.0 | 0.3982 | 41.0 | 111.0 | 158.0 | 0.7025 | 0.2595 | 97.0 | 107.0 | 152.0 | 0.7039 | 0.6382 | 62.0 | 73.0 | 142.0 | 0.5141 | 0.4366 | 27.0 | 32.0 | 118.0 | 0.2712 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 11 | 3.7945 | 0.006 | 3120.3216 | 2162.8421 | 323.0 | 570.0 | 0.5667 | 230.0 | 0.4035 | 43.0 | 112.0 | 158.0 | 0.7089 | 0.2722 | 98.0 | 107.0 | 152.0 | 0.7039 | 0.6447 | 63.0 | 73.0 | 142.0 | 0.5141 | 0.4437 | 26.0 | 31.0 | 118.0 | 0.2627 | 0.2203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 12 | 3.8860 | 0.006 | 3195.5829 | 2215.0093 | 321.0 | 570.0 | 0.5632 | 227.0 | 0.3982 | 46.0 | 111.0 | 158.0 | 0.7025 | 0.2911 | 94.0 | 105.0 | 152.0 | 0.6908 | 0.6184 | 62.0 | 74.0 | 142.0 | 0.5211 | 0.4366 | 25.0 | 31.0 | 118.0 | 0.2627 | 0.2119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 13 | 3.9627 | 0.006 | 3258.6448 | 2258.7204 | 321.0 | 570.0 | 0.5632 | 226.0 | 0.3965 | 45.0 | 110.0 | 158.0 | 0.6962 | 0.2848 | 94.0 | 106.0 | 152.0 | 0.6974 | 0.6184 | 62.0 | 74.0 | 142.0 | 0.5211 | 0.4366 | 25.0 | 31.0 | 118.0 | 0.2627 | 0.2119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 14 | 4.0387 | 0.006 | 3321.1484 | 2302.0447 | 319.0 | 570.0 | 0.5596 | 227.0 | 0.3982 | 48.0 | 109.0 | 158.0 | 0.6899 | 0.3038 | 93.0 | 105.0 | 152.0 | 0.6908 | 0.6118 | 61.0 | 74.0 | 142.0 | 0.5211 | 0.4296 | 25.0 | 31.0 | 118.0 | 0.2627 | 0.2119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 15 | 4.0577 | 0.006 | 3336.7945 | 2312.8897 | 319.0 | 570.0 | 0.5596 | 226.0 | 0.3965 | 48.0 | 109.0 | 158.0 | 0.6899 | 0.3038 | 91.0 | 106.0 | 152.0 | 0.6974 | 0.5987 | 60.0 | 74.0 | 142.0 | 0.5211 | 0.4225 | 27.0 | 30.0 | 118.0 | 0.2542 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 16 | 4.0975 | 0.006 | 3369.4997 | 2335.5592 | 317.0 | 570.0 | 0.5561 | 224.0 | 0.3930 | 50.0 | 109.0 | 158.0 | 0.6899 | 0.3165 | 88.0 | 104.0 | 152.0 | 0.6842 | 0.5789 | 60.0 | 73.0 | 142.0 | 0.5141 | 0.4225 | 26.0 | 31.0 | 118.0 | 0.2627 | 0.2203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 17 | 4.1230 | 0.006 | 3390.5230 | 2350.1314 | 316.0 | 570.0 | 0.5544 | 229.0 | 0.4018 | 51.0 | 108.0 | 158.0 | 0.6835 | 0.3228 | 91.0 | 104.0 | 152.0 | 0.6842 | 0.5987 | 60.0 | 74.0 | 142.0 | 0.5211 | 0.4225 | 27.0 | 30.0 | 118.0 | 0.2542 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 18 | 4.1552 | 0.006 | 3416.9873 | 2368.4751 | 318.0 | 570.0 | 0.5579 | 229.0 | 0.4018 | 51.0 | 108.0 | 158.0 | 0.6835 | 0.3228 | 89.0 | 103.0 | 152.0 | 0.6776 | 0.5855 | 62.0 | 76.0 | 142.0 | 0.5352 | 0.4366 | 27.0 | 31.0 | 118.0 | 0.2627 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 19 | 4.1977 | 0.006 | 3451.8923 | 2392.6694 | 316.0 | 570.0 | 0.5544 | 227.0 | 0.3982 | 50.0 | 108.0 | 158.0 | 0.6835 | 0.3165 | 89.0 | 103.0 | 152.0 | 0.6776 | 0.5855 | 62.0 | 75.0 | 142.0 | 0.5282 | 0.4366 | 26.0 | 30.0 | 118.0 | 0.2542 | 0.2203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 20 | 4.1922 | 0.006 | 3447.4190 | 2389.5688 | 317.0 | 570.0 | 0.5561 | 228.0 | 0.4 | 51.0 | 109.0 | 158.0 | 0.6899 | 0.3228 | 89.0 | 104.0 | 152.0 | 0.6842 | 0.5855 | 60.0 | 74.0 | 142.0 | 0.5211 | 0.4225 | 28.0 | 30.0 | 118.0 | 0.2542 | 0.2373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 21 | 4.2154 | 0.006 | 3466.4538 | 2402.7627 | 317.0 | 570.0 | 0.5561 | 231.0 | 0.4053 | 53.0 | 109.0 | 158.0 | 0.6899 | 0.3354 | 89.0 | 102.0 | 152.0 | 0.6711 | 0.5855 | 62.0 | 76.0 | 142.0 | 0.5352 | 0.4366 | 27.0 | 30.0 | 118.0 | 0.2542 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 22 | 4.2255 | 0.006 | 3474.8213 | 2408.5626 | 319.0 | 570.0 | 0.5596 | 231.0 | 0.4053 | 51.0 | 108.0 | 158.0 | 0.6835 | 0.3228 | 90.0 | 103.0 | 152.0 | 0.6776 | 0.5921 | 63.0 | 78.0 | 142.0 | 0.5493 | 0.4437 | 27.0 | 30.0 | 118.0 | 0.2542 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 23 | 4.2222 | 0.006 | 3472.0563 | 2406.6461 | 323.0 | 570.0 | 0.5667 | 234.0 | 0.4105 | 53.0 | 111.0 | 158.0 | 0.7025 | 0.3354 | 89.0 | 104.0 | 152.0 | 0.6842 | 0.5855 | 64.0 | 77.0 | 142.0 | 0.5423 | 0.4507 | 28.0 | 31.0 | 118.0 | 0.2627 | 0.2373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 24 | 4.2449 | 0.006 | 3490.7282 | 2419.5884 | 318.0 | 570.0 | 0.5579 | 233.0 | 0.4088 | 53.0 | 108.0 | 158.0 | 0.6835 | 0.3354 | 89.0 | 103.0 | 152.0 | 0.6776 | 0.5855 | 63.0 | 76.0 | 142.0 | 0.5352 | 0.4437 | 28.0 | 31.0 | 118.0 | 0.2627 | 0.2373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 25 | 4.2439 | 0.006 | 3489.9021 | 2419.0158 | 317.0 | 570.0 | 0.5561 | 234.0 | 0.4105 | 53.0 | 107.0 | 158.0 | 0.6772 | 0.3354 | 89.0 | 103.0 | 152.0 | 0.6776 | 0.5855 | 64.0 | 76.0 | 142.0 | 0.5352 | 0.4507 | 28.0 | 31.0 | 118.0 | 0.2627 | 0.2373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 26 | 4.2465 | 0.006 | 3492.0437 | 2420.5002 | 316.0 | 570.0 | 0.5544 | 233.0 | 0.4088 | 55.0 | 109.0 | 158.0 | 0.6899 | 0.3481 | 89.0 | 101.0 | 152.0 | 0.6645 | 0.5855 | 62.0 | 76.0 | 142.0 | 0.5352 | 0.4366 | 27.0 | 30.0 | 118.0 | 0.2542 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 27 | 4.2626 | 0.006 | 3505.3292 | 2429.7091 | 317.0 | 570.0 | 0.5561 | 233.0 | 0.4088 | 54.0 | 109.0 | 158.0 | 0.6899 | 0.3418 | 88.0 | 102.0 | 152.0 | 0.6711 | 0.5789 | 62.0 | 75.0 | 142.0 | 0.5282 | 0.4366 | 29.0 | 31.0 | 118.0 | 0.2627 | 0.2458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 28 | 4.2468 | 0.006 | 3492.3048 | 2420.6812 | 320.0 | 570.0 | 0.5614 | 234.0 | 0.4105 | 53.0 | 108.0 | 158.0 | 0.6835 | 0.3354 | 89.0 | 103.0 | 152.0 | 0.6776 | 0.5855 | 63.0 | 78.0 | 142.0 | 0.5493 | 0.4437 | 29.0 | 31.0 | 118.0 | 0.2627 | 0.2458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 29 | 4.2713 | 0.006 | 3512.4807 | 2434.6661 | 318.0 | 570.0 | 0.5579 | 233.0 | 0.4088 | 54.0 | 109.0 | 158.0 | 0.6899 | 0.3418 | 89.0 | 102.0 | 152.0 | 0.6711 | 0.5855 | 62.0 | 76.0 | 142.0 | 0.5352 | 0.4366 | 28.0 | 31.0 | 118.0 | 0.2627 | 0.2373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 30 | 4.2732 | 0.006 | 3513.9739 | 2435.7011 | 317.0 | 570.0 | 0.5561 | 234.0 | 0.4105 | 54.0 | 108.0 | 158.0 | 0.6835 | 0.3418 | 89.0 | 102.0 | 152.0 | 0.6711 | 0.5855 | 62.0 | 76.0 | 142.0 | 0.5352 | 0.4366 | 29.0 | 31.0 | 118.0 | 0.2627 | 0.2458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 31 | 4.2507 | 0.006 | 3495.4848 | 2422.8854 | 319.0 | 570.0 | 0.5596 | 232.0 | 0.4070 | 53.0 | 109.0 | 158.0 | 0.6899 | 0.3354 | 89.0 | 102.0 | 152.0 | 0.6711 | 0.5855 | 62.0 | 77.0 | 142.0 | 0.5423 | 0.4366 | 28.0 | 31.0 | 118.0 | 0.2627 | 0.2373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 32 | 4.2647 | 0.006 | 3507.0566 | 2430.9064 | 321.0 | 570.0 | 0.5632 | 235.0 | 0.4123 | 54.0 | 109.0 | 158.0 | 0.6899 | 0.3418 | 89.0 | 104.0 | 152.0 | 0.6842 | 0.5855 | 64.0 | 78.0 | 142.0 | 0.5493 | 0.4507 | 28.0 | 30.0 | 118.0 | 0.2542 | 0.2373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 33 | 4.2689 | 0.006 | 3510.5114 | 2433.3011 | 315.0 | 570.0 | 0.5526 | 230.0 | 0.4035 | 52.0 | 106.0 | 158.0 | 0.6709 | 0.3291 | 88.0 | 102.0 | 152.0 | 0.6711 | 0.5789 | 63.0 | 77.0 | 142.0 | 0.5423 | 0.4437 | 27.0 | 30.0 | 118.0 | 0.2542 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 34 | 4.2978 | 0.006 | 3534.2027 | 2449.7226 | 318.0 | 570.0 | 0.5579 | 233.0 | 0.4088 | 55.0 | 109.0 | 158.0 | 0.6899 | 0.3481 | 89.0 | 103.0 | 152.0 | 0.6776 | 0.5855 | 62.0 | 76.0 | 142.0 | 0.5352 | 0.4366 | 27.0 | 30.0 | 118.0 | 0.2542 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 35 | 4.2874 | 0.006 | 3525.6484 | 2443.7932 | 319.0 | 570.0 | 0.5596 | 233.0 | 0.4088 | 53.0 | 110.0 | 158.0 | 0.6962 | 0.3354 | 89.0 | 102.0 | 152.0 | 0.6711 | 0.5855 | 62.0 | 76.0 | 142.0 | 0.5352 | 0.4366 | 29.0 | 31.0 | 118.0 | 0.2627 | 0.2458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1