# ARC-Challenge_Llama-3.2-1B-kt9mvrzd
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unspecified dataset; the model name and the 299-example evaluation set suggest the ARC-Challenge benchmark. It achieves the following results on the evaluation set:
- Loss: 4.8039
- Model Preparation Time: 0.0059
- Mdl: 2072.2462
- Accumulated Loss: 1436.3716
- Correct Preds: 141
- Total Preds: 299
- Accuracy: 0.4716
- Correct Gen Preds: 135
- Gen Accuracy: 0.4515

The per-label breakdown below covers the same evaluation set. The label IDs 32–36 are presumably the token IDs of the answer letters "A"–"E" in the Llama 3 tokenizer; the per-label totals sum to the 299 evaluation examples.

| Label | Total Labels | Correct Preds | Accuracy | Correct Gen Preds | Gen Accuracy |
|---|---|---|---|---|---|
| 32 | 64 | 21 | 0.3281 | 18 | 0.2812 |
| 33 | 73 | 29 | 0.3973 | 29 | 0.3973 |
| 34 | 78 | 41 | 0.5256 | 41 | 0.5256 |
| 35 | 83 | 49 | 0.5904 | 46 | 0.5542 |
| 36 | 1 | 1 | 1.0000 | 1 | 1.0000 |
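The aggregate numbers are mutually consistent: the accumulated loss is the mean evaluation loss times the number of examples, and the Mdl figure appears to be that total converted from nats to bits. A quick sanity check in plain Python, using only the values reported above:

```python
import math

loss, total_preds = 4.8039, 299
correct_preds, correct_gen_preds = 141, 135

accumulated_loss = loss * total_preds      # summed NLL in nats
mdl_bits = accumulated_loss / math.log(2)  # nats -> bits

print(f"accumulated loss: {accumulated_loss:.1f}")            # ~1436.4
print(f"mdl: {mdl_bits:.1f}")                                 # ~2072.2
print(f"accuracy: {correct_preds / total_preds:.4f}")         # 0.4716
print(f"gen accuracy: {correct_gen_preds / total_preds:.4f}") # 0.4515
```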
## Model description
More information needed
## Intended uses & limitations
More information needed
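The card reports two accuracy figures, which presumably reflect two scoring modes: Accuracy scores each answer letter by the model's next-token likelihood and picks the argmax, while Gen Accuracy checks whether the model actually generates the correct letter. Below is a minimal sketch of the likelihood-based variant; the prompt template and scoring method are assumptions, not the exact evaluation code used for this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "donoway/ARC-Challenge_Llama-3.2-1B-kt9mvrzd"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

def score_options(question: str, options: list[str]) -> int:
    """Return the index of the answer letter the model considers most likely."""
    letters = ["A", "B", "C", "D", "E"][: len(options)]
    prompt = question + "\n" + "\n".join(
        f"{letter}. {option}" for letter, option in zip(letters, options)
    ) + "\nAnswer:"
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0, -1]  # next-token distribution
    # Token IDs for " A", " B", ... depend on the tokenizer, so look them up.
    letter_ids = [
        tokenizer.encode(" " + letter, add_special_tokens=False)[-1]
        for letter in letters
    ]
    return int(logits[letter_ids].argmax())
```

A generative variant would instead call `model.generate` and compare the first emitted letter; the small gap between the two headline accuracies (0.4716 vs. 0.4515) indicates the two modes mostly agree for this model.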
## Training and evaluation data
More information needed
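The dataset is not documented, but the model name and the evaluation-set size both point to ARC-Challenge: its validation split contains exactly 299 questions. A plausible way to load it with the Datasets version listed below; the dataset ID and split choice are assumptions:

```python
from datasets import load_dataset

# allenai/ai2_arc hosts both the ARC-Easy and ARC-Challenge configurations.
arc = load_dataset("allenai/ai2_arc", "ARC-Challenge")
print({split: len(ds) for split, ds in arc.items()})
# expected: {'train': 1119, 'validation': 299, 'test': 1172}
```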
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
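A hypothetical reconstruction of how these values map onto Hugging Face `TrainingArguments` (argument names follow the Transformers 4.51 API; the output directory is an illustrative assumption, and anything not listed above is left at its default):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ARC-Challenge_Llama-3.2-1B-kt9mvrzd",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
)
```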
### Training results

The headline metrics correspond to the epoch-6 checkpoint, which achieves the highest accuracy (0.4716) even though validation loss bottoms out at epoch 2, suggesting the best checkpoint was selected by accuracy rather than loss. Although `num_epochs` was set to 100, logging stops at epoch 32, presumably due to early stopping.
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.6389 | 0.0059 | 706.9523 | 490.0220 | 66.0 | 299.0 | 0.2207 | 66.0 | 0.2207 | 62.0 | 62.0 | 64.0 | 0.9688 | 0.9688 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 4.0 | 4.0 | 78.0 | 0.0513 | 0.0513 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.3273 | 1.0 | 7 | 1.4277 | 0.0059 | 615.8521 | 426.8761 | 100.0 | 299.0 | 0.3344 | 100.0 | 0.3344 | 37.0 | 37.0 | 64.0 | 0.5781 | 0.5781 | 26.0 | 26.0 | 73.0 | 0.3562 | 0.3562 | 35.0 | 35.0 | 78.0 | 0.4487 | 0.4487 | 2.0 | 2.0 | 83.0 | 0.0241 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.09 | 2.0 | 14 | 1.3289 | 0.0059 | 573.2344 | 397.3358 | 128.0 | 299.0 | 0.4281 | 103.0 | 0.3445 | 18.0 | 22.0 | 64.0 | 0.3438 | 0.2812 | 25.0 | 36.0 | 73.0 | 0.4932 | 0.3425 | 28.0 | 31.0 | 78.0 | 0.3974 | 0.3590 | 32.0 | 39.0 | 83.0 | 0.4699 | 0.3855 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.6867 | 3.0 | 21 | 1.6961 | 0.0059 | 731.6372 | 507.1322 | 130.0 | 299.0 | 0.4348 | 127.0 | 0.4247 | 36.0 | 38.0 | 64.0 | 0.5938 | 0.5625 | 25.0 | 25.0 | 73.0 | 0.3425 | 0.3425 | 30.0 | 31.0 | 78.0 | 0.3974 | 0.3846 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.4254 | 4.0 | 28 | 2.2419 | 0.0059 | 967.0717 | 670.3230 | 139.0 | 299.0 | 0.4649 | 139.0 | 0.4649 | 20.0 | 20.0 | 64.0 | 0.3125 | 0.3125 | 38.0 | 38.0 | 73.0 | 0.5205 | 0.5205 | 41.0 | 41.0 | 78.0 | 0.5256 | 0.5256 | 39.0 | 39.0 | 83.0 | 0.4699 | 0.4699 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0216 | 5.0 | 35 | 2.5700 | 0.0059 | 1108.6037 | 768.4255 | 139.0 | 299.0 | 0.4649 | 139.0 | 0.4649 | 22.0 | 22.0 | 64.0 | 0.3438 | 0.3438 | 30.0 | 30.0 | 73.0 | 0.4110 | 0.4110 | 42.0 | 42.0 | 78.0 | 0.5385 | 0.5385 | 44.0 | 44.0 | 83.0 | 0.5301 | 0.5301 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0001 | 6.0 | 42 | 4.8039 | 0.0059 | 2072.2462 | 1436.3716 | 141.0 | 299.0 | 0.4716 | 135.0 | 0.4515 | 18.0 | 21.0 | 64.0 | 0.3281 | 0.2812 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 41.0 | 41.0 | 78.0 | 0.5256 | 0.5256 | 46.0 | 49.0 | 83.0 | 0.5904 | 0.5542 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 7.0 | 49 | 5.7685 | 0.0059 | 2488.3380 | 1724.7845 | 138.0 | 299.0 | 0.4615 | 129.0 | 0.4314 | 17.0 | 23.0 | 64.0 | 0.3594 | 0.2656 | 30.0 | 30.0 | 73.0 | 0.4110 | 0.4110 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 44.0 | 47.0 | 83.0 | 0.5663 | 0.5301 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 56 | 6.1056 | 0.0059 | 2633.7308 | 1825.5631 | 136.0 | 299.0 | 0.4548 | 130.0 | 0.4348 | 18.0 | 21.0 | 64.0 | 0.3281 | 0.2812 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 40.0 | 40.0 | 78.0 | 0.5128 | 0.5128 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 63 | 6.1872 | 0.0059 | 2668.9666 | 1849.9867 | 136.0 | 299.0 | 0.4548 | 130.0 | 0.4348 | 21.0 | 24.0 | 64.0 | 0.375 | 0.3281 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 70 | 6.2393 | 0.0059 | 2691.4055 | 1865.5401 | 136.0 | 299.0 | 0.4548 | 130.0 | 0.4348 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 44.0 | 47.0 | 83.0 | 0.5663 | 0.5301 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 77 | 6.2807 | 0.0059 | 2709.2982 | 1877.9424 | 136.0 | 299.0 | 0.4548 | 130.0 | 0.4348 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 44.0 | 47.0 | 83.0 | 0.5663 | 0.5301 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 84 | 6.2496 | 0.0059 | 2695.8733 | 1868.6370 | 134.0 | 299.0 | 0.4482 | 128.0 | 0.4281 | 18.0 | 21.0 | 64.0 | 0.3281 | 0.2812 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 91 | 6.2153 | 0.0059 | 2681.0845 | 1858.3861 | 138.0 | 299.0 | 0.4615 | 132.0 | 0.4415 | 21.0 | 24.0 | 64.0 | 0.375 | 0.3281 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 98 | 6.2351 | 0.0059 | 2689.6093 | 1864.2951 | 137.0 | 299.0 | 0.4582 | 130.0 | 0.4348 | 19.0 | 23.0 | 64.0 | 0.3594 | 0.2969 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 105 | 6.3139 | 0.0059 | 2723.5808 | 1887.8423 | 136.0 | 299.0 | 0.4548 | 131.0 | 0.4381 | 20.0 | 22.0 | 64.0 | 0.3438 | 0.3125 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 112 | 6.2494 | 0.0059 | 2695.7664 | 1868.5629 | 136.0 | 299.0 | 0.4548 | 130.0 | 0.4348 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 44.0 | 47.0 | 83.0 | 0.5663 | 0.5301 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 119 | 6.2340 | 0.0059 | 2689.1309 | 1863.9635 | 135.0 | 299.0 | 0.4515 | 129.0 | 0.4314 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 42.0 | 45.0 | 83.0 | 0.5422 | 0.5060 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 126 | 6.2425 | 0.0059 | 2692.7814 | 1866.4938 | 136.0 | 299.0 | 0.4548 | 130.0 | 0.4348 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 133 | 6.2613 | 0.0059 | 2700.8908 | 1872.1149 | 137.0 | 299.0 | 0.4582 | 131.0 | 0.4381 | 20.0 | 23.0 | 64.0 | 0.3594 | 0.3125 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 140 | 6.2458 | 0.0059 | 2694.2261 | 1867.4952 | 136.0 | 299.0 | 0.4548 | 130.0 | 0.4348 | 20.0 | 23.0 | 64.0 | 0.3594 | 0.3125 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 147 | 6.2619 | 0.0059 | 2701.1605 | 1872.3018 | 133.0 | 299.0 | 0.4448 | 128.0 | 0.4281 | 19.0 | 21.0 | 64.0 | 0.3281 | 0.2969 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 154 | 6.2457 | 0.0059 | 2694.1620 | 1867.4508 | 137.0 | 299.0 | 0.4582 | 131.0 | 0.4381 | 20.0 | 23.0 | 64.0 | 0.3594 | 0.3125 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 161 | 6.2820 | 0.0059 | 2709.8302 | 1878.3112 | 134.0 | 299.0 | 0.4482 | 128.0 | 0.4281 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 168 | 6.2623 | 0.0059 | 2701.3554 | 1872.4369 | 137.0 | 299.0 | 0.4582 | 131.0 | 0.4381 | 20.0 | 23.0 | 64.0 | 0.3594 | 0.3125 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 44.0 | 47.0 | 83.0 | 0.5663 | 0.5301 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 175 | 6.3135 | 0.0059 | 2723.4079 | 1887.7225 | 135.0 | 299.0 | 0.4515 | 129.0 | 0.4314 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 182 | 6.2799 | 0.0059 | 2708.9304 | 1877.6875 | 132.0 | 299.0 | 0.4415 | 126.0 | 0.4214 | 18.0 | 21.0 | 64.0 | 0.3281 | 0.2812 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 42.0 | 45.0 | 83.0 | 0.5422 | 0.5060 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 189 | 6.2581 | 0.0059 | 2699.5354 | 1871.1753 | 136.0 | 299.0 | 0.4548 | 130.0 | 0.4348 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 30.0 | 30.0 | 73.0 | 0.4110 | 0.4110 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 196 | 6.2766 | 0.0059 | 2707.5092 | 1876.7024 | 138.0 | 299.0 | 0.4615 | 132.0 | 0.4415 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 30.0 | 30.0 | 73.0 | 0.4110 | 0.4110 | 40.0 | 40.0 | 78.0 | 0.5128 | 0.5128 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 203 | 6.3278 | 0.0059 | 2729.5958 | 1892.0117 | 134.0 | 299.0 | 0.4482 | 128.0 | 0.4281 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 210 | 6.3162 | 0.0059 | 2724.5867 | 1888.5396 | 133.0 | 299.0 | 0.4448 | 127.0 | 0.4247 | 18.0 | 21.0 | 64.0 | 0.3281 | 0.2812 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 217 | 6.3389 | 0.0059 | 2734.3720 | 1895.3222 | 135.0 | 299.0 | 0.4515 | 129.0 | 0.4314 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 38.0 | 38.0 | 78.0 | 0.4872 | 0.4872 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 224 | 6.2881 | 0.0059 | 2712.4858 | 1880.1519 | 134.0 | 299.0 | 0.4482 | 128.0 | 0.4281 | 19.0 | 22.0 | 64.0 | 0.3438 | 0.2969 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 43.0 | 46.0 | 83.0 | 0.5542 | 0.5181 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
## Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1