# ARC-Easy_Llama-3.2-1B-xl28q3hn
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unknown dataset. It achieves the following results on the evaluation set (the numeric suffixes 32–36 on the per-label metrics appear to be the tokenizer ids of the answer-choice letters A–E):
- Loss: 1.2555
- Model Preparation Time: 0.006
- Mdl: 1032.4540
- Accumulated Loss: 715.6426
- Correct Preds: 291.0
- Total Preds: 570.0
- Accuracy: 0.5105
- Correct Gen Preds: 291.0
- Gen Accuracy: 0.5105
- Correct Gen Preds 32: 98.0
- Correct Preds 32: 98.0
- Total Labels 32: 158.0
- Accuracy 32: 0.6203
- Gen Accuracy 32: 0.6203
- Correct Gen Preds 33: 130.0
- Correct Preds 33: 130.0
- Total Labels 33: 152.0
- Accuracy 33: 0.8553
- Gen Accuracy 33: 0.8553
- Correct Gen Preds 34: 40.0
- Correct Preds 34: 40.0
- Total Labels 34: 142.0
- Accuracy 34: 0.2817
- Gen Accuracy 34: 0.2817
- Correct Gen Preds 35: 23.0
- Correct Preds 35: 23.0
- Total Labels 35: 118.0
- Accuracy 35: 0.1949
- Gen Accuracy 35: 0.1949
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 0.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
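The headline numbers above are internally consistent; a minimal sketch checking the aggregation, assuming `Mdl` is the accumulated cross-entropy (in nats) converted to bits:

```python
import math

# Per-label counts from the best checkpoint (epoch 3). The label ids
# 32-35 are assumed to be the token ids of the answer letters A-D.
correct = {32: 98, 33: 130, 34: 40, 35: 23}
totals = {32: 158, 33: 152, 34: 142, 35: 118}

overall_correct = sum(correct.values())      # 291
overall_total = sum(totals.values())         # 570
accuracy = overall_correct / overall_total   # ~0.5105

# Assumption: "Mdl" is the accumulated loss re-expressed in bits,
# i.e. divided by ln 2.
accumulated_loss_nats = 715.6426
mdl_bits = accumulated_loss_nats / math.log(2)  # ~1032.454
```

Under this reading, the reported `Mdl: 1032.4540` and `Accumulated Loss: 715.6426` differ only by the nats-to-bits factor ln 2.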
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
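With `lr_scheduler_type: cosine` and `lr_scheduler_warmup_ratio: 0.01`, the per-step learning rate follows linear warmup then cosine decay (as in `transformers`' `get_cosine_schedule_with_warmup`). A hedged sketch of that schedule, assuming one optimizer step per epoch as the results table suggests:

```python
import math

def cosine_lr(step: int, total_steps: int,
              base_lr: float = 2e-05, warmup_ratio: float = 0.01) -> float:
    """Linear warmup to base_lr, then cosine decay to 0.

    A reconstruction of the standard warmup+cosine schedule; the exact
    step count used in this run is not stated in the card.
    """
    warmup_steps = max(1, int(warmup_ratio * total_steps))
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, 0.5 * (1.0 + math.cos(math.pi * progress)))

# The learning rate peaks at 2e-05 right after warmup and decays to 0
# by the final step.
peak = cosine_lr(1, 100)
final = cosine_lr(100, 100)
```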
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.006 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3552 | 1.0 | 1 | 1.5354 | 0.006 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3552 | 2.0 | 2 | 2.4687 | 0.006 | 2030.1287 | 1407.1780 | 221.0 | 570.0 | 0.3877 | 221.0 | 0.3877 | 0.0 | 0.0 | 158.0 | 0.0 | 0.0 | 85.0 | 85.0 | 152.0 | 0.5592 | 0.5592 | 136.0 | 136.0 | 142.0 | 0.9577 | 0.9577 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7603 | 3.0 | 3 | 1.2555 | 0.006 | 1032.4540 | 715.6426 | 291.0 | 570.0 | 0.5105 | 291.0 | 0.5105 | 98.0 | 98.0 | 158.0 | 0.6203 | 0.6203 | 130.0 | 130.0 | 152.0 | 0.8553 | 0.8553 | 40.0 | 40.0 | 142.0 | 0.2817 | 0.2817 | 23.0 | 23.0 | 118.0 | 0.1949 | 0.1949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4267 | 4.0 | 4 | 2.5733 | 0.006 | 2116.1258 | 1466.7867 | 261.0 | 570.0 | 0.4579 | 260.0 | 0.4561 | 151.0 | 152.0 | 158.0 | 0.9620 | 0.9557 | 39.0 | 39.0 | 152.0 | 0.2566 | 0.2566 | 42.0 | 42.0 | 142.0 | 0.2958 | 0.2958 | 28.0 | 28.0 | 118.0 | 0.2373 | 0.2373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0491 | 5.0 | 5 | 3.1596 | 0.006 | 2598.2545 | 1800.9728 | 284.0 | 570.0 | 0.4982 | 284.0 | 0.4982 | 151.0 | 151.0 | 158.0 | 0.9557 | 0.9557 | 56.0 | 56.0 | 152.0 | 0.3684 | 0.3684 | 50.0 | 50.0 | 142.0 | 0.3521 | 0.3521 | 27.0 | 27.0 | 118.0 | 0.2288 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0044 | 6.0 | 6 | 4.0391 | 0.006 | 3321.5305 | 2302.3095 | 262.0 | 570.0 | 0.4596 | 259.0 | 0.4544 | 151.0 | 152.0 | 158.0 | 0.9620 | 0.9557 | 41.0 | 41.0 | 152.0 | 0.2697 | 0.2697 | 44.0 | 45.0 | 142.0 | 0.3169 | 0.3099 | 23.0 | 24.0 | 118.0 | 0.2034 | 0.1949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 7.0 | 7 | 4.4151 | 0.006 | 3630.7350 | 2516.6338 | 253.0 | 570.0 | 0.4439 | 239.0 | 0.4193 | 144.0 | 152.0 | 158.0 | 0.9620 | 0.9114 | 36.0 | 38.0 | 152.0 | 0.25 | 0.2368 | 38.0 | 41.0 | 142.0 | 0.2887 | 0.2676 | 21.0 | 22.0 | 118.0 | 0.1864 | 0.1780 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 8 | 4.5569 | 0.006 | 3747.3361 | 2597.4554 | 250.0 | 570.0 | 0.4386 | 223.0 | 0.3912 | 135.0 | 154.0 | 158.0 | 0.9747 | 0.8544 | 35.0 | 38.0 | 152.0 | 0.25 | 0.2303 | 35.0 | 39.0 | 142.0 | 0.2746 | 0.2465 | 18.0 | 19.0 | 118.0 | 0.1610 | 0.1525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 9 | 4.6453 | 0.006 | 3819.9784 | 2647.8072 | 247.0 | 570.0 | 0.4333 | 204.0 | 0.3579 | 123.0 | 152.0 | 158.0 | 0.9620 | 0.7785 | 33.0 | 39.0 | 152.0 | 0.2566 | 0.2171 | 31.0 | 37.0 | 142.0 | 0.2606 | 0.2183 | 17.0 | 19.0 | 118.0 | 0.1610 | 0.1441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 10.0 | 10 | 4.8047 | 0.006 | 3951.0414 | 2738.6532 | 242.0 | 570.0 | 0.4246 | 203.0 | 0.3561 | 123.0 | 152.0 | 158.0 | 0.9620 | 0.7785 | 35.0 | 39.0 | 152.0 | 0.2566 | 0.2303 | 30.0 | 33.0 | 142.0 | 0.2324 | 0.2113 | 15.0 | 18.0 | 118.0 | 0.1525 | 0.1271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 11 | 5.0241 | 0.006 | 4131.5031 | 2863.7397 | 236.0 | 570.0 | 0.4140 | 201.0 | 0.3526 | 125.0 | 153.0 | 158.0 | 0.9684 | 0.7911 | 34.0 | 37.0 | 152.0 | 0.2434 | 0.2237 | 28.0 | 29.0 | 142.0 | 0.2042 | 0.1972 | 14.0 | 17.0 | 118.0 | 0.1441 | 0.1186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 12 | 5.2229 | 0.006 | 4295.0154 | 2977.0778 | 235.0 | 570.0 | 0.4123 | 203.0 | 0.3561 | 129.0 | 154.0 | 158.0 | 0.9747 | 0.8165 | 32.0 | 36.0 | 152.0 | 0.2368 | 0.2105 | 28.0 | 29.0 | 142.0 | 0.2042 | 0.1972 | 14.0 | 16.0 | 118.0 | 0.1356 | 0.1186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 13 | 5.3741 | 0.006 | 4419.3154 | 3063.2360 | 235.0 | 570.0 | 0.4123 | 202.0 | 0.3544 | 129.0 | 155.0 | 158.0 | 0.9810 | 0.8165 | 31.0 | 35.0 | 152.0 | 0.2303 | 0.2039 | 28.0 | 29.0 | 142.0 | 0.2042 | 0.1972 | 14.0 | 16.0 | 118.0 | 0.1356 | 0.1186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 14 | 5.5052 | 0.006 | 4527.0926 | 3137.9415 | 235.0 | 570.0 | 0.4123 | 207.0 | 0.3632 | 135.0 | 156.0 | 158.0 | 0.9873 | 0.8544 | 31.0 | 35.0 | 152.0 | 0.2303 | 0.2039 | 27.0 | 28.0 | 142.0 | 0.1972 | 0.1901 | 14.0 | 16.0 | 118.0 | 0.1356 | 0.1186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 15 | 5.5976 | 0.006 | 4603.0781 | 3190.6106 | 234.0 | 570.0 | 0.4105 | 207.0 | 0.3632 | 135.0 | 156.0 | 158.0 | 0.9873 | 0.8544 | 32.0 | 35.0 | 152.0 | 0.2303 | 0.2105 | 26.0 | 28.0 | 142.0 | 0.1972 | 0.1831 | 14.0 | 15.0 | 118.0 | 0.1271 | 0.1186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 16 | 5.6853 | 0.006 | 4675.2022 | 3240.6032 | 228.0 | 570.0 | 0.4 | 206.0 | 0.3614 | 138.0 | 155.0 | 158.0 | 0.9810 | 0.8734 | 29.0 | 32.0 | 152.0 | 0.2105 | 0.1908 | 26.0 | 27.0 | 142.0 | 0.1901 | 0.1831 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 17 | 5.7800 | 0.006 | 4753.1165 | 3294.6093 | 228.0 | 570.0 | 0.4 | 207.0 | 0.3632 | 141.0 | 156.0 | 158.0 | 0.9873 | 0.8924 | 29.0 | 31.0 | 152.0 | 0.2039 | 0.1908 | 24.0 | 27.0 | 142.0 | 0.1901 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 18 | 5.8437 | 0.006 | 4805.4763 | 3330.9024 | 227.0 | 570.0 | 0.3982 | 207.0 | 0.3632 | 141.0 | 156.0 | 158.0 | 0.9873 | 0.8924 | 29.0 | 30.0 | 152.0 | 0.1974 | 0.1908 | 24.0 | 27.0 | 142.0 | 0.1901 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 19 | 5.9488 | 0.006 | 4891.9541 | 3390.8442 | 226.0 | 570.0 | 0.3965 | 206.0 | 0.3614 | 141.0 | 156.0 | 158.0 | 0.9873 | 0.8924 | 28.0 | 29.0 | 152.0 | 0.1908 | 0.1842 | 24.0 | 27.0 | 142.0 | 0.1901 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 20 | 5.9804 | 0.006 | 4917.8580 | 3408.7994 | 226.0 | 570.0 | 0.3965 | 206.0 | 0.3614 | 141.0 | 156.0 | 158.0 | 0.9873 | 0.8924 | 28.0 | 29.0 | 152.0 | 0.1908 | 0.1842 | 24.0 | 27.0 | 142.0 | 0.1901 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 21 | 6.0239 | 0.006 | 4953.6373 | 3433.5997 | 226.0 | 570.0 | 0.3965 | 206.0 | 0.3614 | 141.0 | 156.0 | 158.0 | 0.9873 | 0.8924 | 28.0 | 29.0 | 152.0 | 0.1908 | 0.1842 | 24.0 | 27.0 | 142.0 | 0.1901 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 22 | 6.0758 | 0.006 | 4996.3676 | 3463.2181 | 225.0 | 570.0 | 0.3947 | 206.0 | 0.3614 | 141.0 | 156.0 | 158.0 | 0.9873 | 0.8924 | 28.0 | 29.0 | 152.0 | 0.1908 | 0.1842 | 24.0 | 26.0 | 142.0 | 0.1831 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 23 | 6.0958 | 0.006 | 5012.8294 | 3474.6285 | 225.0 | 570.0 | 0.3947 | 207.0 | 0.3632 | 141.0 | 156.0 | 158.0 | 0.9873 | 0.8924 | 28.0 | 29.0 | 152.0 | 0.1908 | 0.1842 | 25.0 | 26.0 | 142.0 | 0.1831 | 0.1761 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 24 | 6.1508 | 0.006 | 5057.9994 | 3505.9380 | 225.0 | 570.0 | 0.3947 | 209.0 | 0.3667 | 144.0 | 156.0 | 158.0 | 0.9873 | 0.9114 | 28.0 | 29.0 | 152.0 | 0.1908 | 0.1842 | 24.0 | 26.0 | 142.0 | 0.1831 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 25 | 6.1477 | 0.006 | 5055.4455 | 3504.1678 | 224.0 | 570.0 | 0.3930 | 208.0 | 0.3649 | 143.0 | 156.0 | 158.0 | 0.9873 | 0.9051 | 27.0 | 28.0 | 152.0 | 0.1842 | 0.1776 | 25.0 | 26.0 | 142.0 | 0.1831 | 0.1761 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 26 | 6.1921 | 0.006 | 5092.0041 | 3529.5083 | 224.0 | 570.0 | 0.3930 | 208.0 | 0.3649 | 144.0 | 156.0 | 158.0 | 0.9873 | 0.9114 | 26.0 | 27.0 | 152.0 | 0.1776 | 0.1711 | 25.0 | 27.0 | 142.0 | 0.1901 | 0.1761 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 27 | 6.2041 | 0.006 | 5101.8523 | 3536.3346 | 224.0 | 570.0 | 0.3930 | 208.0 | 0.3649 | 144.0 | 156.0 | 158.0 | 0.9873 | 0.9114 | 26.0 | 27.0 | 152.0 | 0.1776 | 0.1711 | 25.0 | 27.0 | 142.0 | 0.1901 | 0.1761 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 28 | 6.2060 | 0.006 | 5103.4059 | 3537.4114 | 225.0 | 570.0 | 0.3947 | 208.0 | 0.3649 | 144.0 | 156.0 | 158.0 | 0.9873 | 0.9114 | 27.0 | 28.0 | 152.0 | 0.1842 | 0.1776 | 24.0 | 27.0 | 142.0 | 0.1901 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 29 | 6.2192 | 0.006 | 5114.2474 | 3544.9262 | 225.0 | 570.0 | 0.3947 | 209.0 | 0.3667 | 145.0 | 156.0 | 158.0 | 0.9873 | 0.9177 | 27.0 | 28.0 | 152.0 | 0.1842 | 0.1776 | 24.0 | 27.0 | 142.0 | 0.1901 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 30 | 6.2327 | 0.006 | 5125.3556 | 3552.6258 | 221.0 | 570.0 | 0.3877 | 206.0 | 0.3614 | 144.0 | 156.0 | 158.0 | 0.9873 | 0.9114 | 26.0 | 27.0 | 152.0 | 0.1776 | 0.1711 | 23.0 | 25.0 | 142.0 | 0.1761 | 0.1620 | 13.0 | 13.0 | 118.0 | 0.1102 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 31 | 6.2450 | 0.006 | 5135.5071 | 3559.6623 | 222.0 | 570.0 | 0.3895 | 206.0 | 0.3614 | 144.0 | 156.0 | 158.0 | 0.9873 | 0.9114 | 26.0 | 27.0 | 152.0 | 0.1776 | 0.1711 | 23.0 | 26.0 | 142.0 | 0.1831 | 0.1620 | 13.0 | 13.0 | 118.0 | 0.1102 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 32 | 6.2478 | 0.006 | 5137.7630 | 3561.2259 | 224.0 | 570.0 | 0.3930 | 210.0 | 0.3684 | 146.0 | 156.0 | 158.0 | 0.9873 | 0.9241 | 27.0 | 28.0 | 152.0 | 0.1842 | 0.1776 | 24.0 | 26.0 | 142.0 | 0.1831 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 33 | 6.2653 | 0.006 | 5152.1581 | 3571.2038 | 224.0 | 570.0 | 0.3930 | 209.0 | 0.3667 | 146.0 | 156.0 | 158.0 | 0.9873 | 0.9241 | 26.0 | 27.0 | 152.0 | 0.1776 | 0.1711 | 24.0 | 27.0 | 142.0 | 0.1901 | 0.1690 | 13.0 | 14.0 | 118.0 | 0.1186 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1