# ARC-Challenge_Llama-3.2-1B-mcj1x0k2
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B). The training dataset is not documented here, though the model name and the 299-example evaluation set point to ARC-Challenge. It achieves the following results on the evaluation set:
- Loss: 3.9982
- Model Preparation Time: 0.006
- Mdl: 1724.6663 (the accumulated loss converted from nats to bits, i.e. 1195.4476 / ln 2)
- Accumulated Loss: 1195.4476
- Correct Preds: 158
- Total Preds: 299
- Accuracy: 0.5284
- Correct Gen Preds: 158
- Gen Accuracy: 0.5284
Per-label breakdown (the numeric suffix in the raw metric names, 32 through 36, identifies each of the five answer labels):

| Label | Correct Preds | Correct Gen Preds | Total Labels | Accuracy | Gen Accuracy |
|---|---|---|---|---|---|
| 32 | 30 | 30 | 64 | 0.4688 | 0.4688 |
| 33 | 38 | 38 | 73 | 0.5205 | 0.5205 |
| 34 | 45 | 45 | 78 | 0.5769 | 0.5769 |
| 35 | 44 | 44 | 83 | 0.5301 | 0.5301 |
| 36 | 1 | 1 | 1 | 1.0 | 1.0 |
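A minimal inference sketch, assuming the checkpoint is hosted under the repo id implied by the model name; the prompt format and the likelihood-ranking scheme below are illustrative assumptions, since the card does not document the exact evaluation protocol:

```python
# Hypothetical usage sketch: rank answer choices by sequence likelihood.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "donoway/ARC-Challenge_Llama-3.2-1B-mcj1x0k2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)
model.eval()

prompt = "Which gas do plants absorb from the atmosphere? Answer:"
choices = [" A", " B", " C", " D"]  # answer-label format is an assumption

scores = []
with torch.no_grad():
    for choice in choices:
        ids = tokenizer(prompt + choice, return_tensors="pt").input_ids
        loss = model(ids, labels=ids).loss  # mean NLL over the sequence
        scores.append(-loss.item())
print("predicted label:", choices[scores.index(max(scores))].strip())
```

Scoring the mean negative log-likelihood of the whole prompt-plus-choice sequence is a rough proxy; scoring only the choice tokens is a common refinement.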
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
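A sketch of a `TrainingArguments` configuration matching the hyperparameters above, assuming the argument names of Transformers 4.51; all settings not listed above are left at their defaults:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ARC-Challenge_Llama-3.2-1B-mcj1x0k2",  # assumed output dir
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
)
```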
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.6389 | 0.006 | 706.9523 | 490.0220 | 66.0 | 299.0 | 0.2207 | 66.0 | 0.2207 | 62.0 | 62.0 | 64.0 | 0.9688 | 0.9688 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 4.0 | 4.0 | 78.0 | 0.0513 | 0.0513 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.2968 | 1.0 | 14 | 1.2857 | 0.006 | 554.6267 | 384.4379 | 127.0 | 299.0 | 0.4247 | 127.0 | 0.4247 | 15.0 | 15.0 | 64.0 | 0.2344 | 0.2344 | 54.0 | 54.0 | 73.0 | 0.7397 | 0.7397 | 33.0 | 33.0 | 78.0 | 0.4231 | 0.4231 | 25.0 | 25.0 | 83.0 | 0.3012 | 0.3012 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.0197 | 2.0 | 28 | 1.2629 | 0.006 | 544.7770 | 377.6107 | 142.0 | 299.0 | 0.4749 | 114.0 | 0.3813 | 22.0 | 27.0 | 64.0 | 0.4219 | 0.3438 | 28.0 | 33.0 | 73.0 | 0.4521 | 0.3836 | 26.0 | 33.0 | 78.0 | 0.4231 | 0.3333 | 38.0 | 49.0 | 83.0 | 0.5904 | 0.4578 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.4329 | 3.0 | 42 | 1.5725 | 0.006 | 678.3051 | 470.1653 | 139.0 | 299.0 | 0.4649 | 137.0 | 0.4582 | 41.0 | 41.0 | 64.0 | 0.6406 | 0.6406 | 33.0 | 33.0 | 73.0 | 0.4521 | 0.4521 | 30.0 | 30.0 | 78.0 | 0.3846 | 0.3846 | 33.0 | 35.0 | 83.0 | 0.4217 | 0.3976 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.3805 | 4.0 | 56 | 2.0751 | 0.006 | 895.1380 | 620.4624 | 147.0 | 299.0 | 0.4916 | 142.0 | 0.4749 | 35.0 | 36.0 | 64.0 | 0.5625 | 0.5469 | 46.0 | 47.0 | 73.0 | 0.6438 | 0.6301 | 31.0 | 31.0 | 78.0 | 0.3974 | 0.3974 | 30.0 | 33.0 | 83.0 | 0.3976 | 0.3614 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.1164 | 5.0 | 70 | 2.7052 | 0.006 | 1166.9296 | 808.8540 | 151.0 | 299.0 | 0.5050 | 151.0 | 0.5050 | 30.0 | 30.0 | 64.0 | 0.4688 | 0.4688 | 44.0 | 44.0 | 73.0 | 0.6027 | 0.6027 | 48.0 | 48.0 | 78.0 | 0.6154 | 0.6154 | 29.0 | 29.0 | 83.0 | 0.3494 | 0.3494 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.1753 | 6.0 | 84 | 4.8753 | 0.006 | 2103.0376 | 1457.7146 | 157.0 | 299.0 | 0.5251 | 156.0 | 0.5217 | 34.0 | 34.0 | 64.0 | 0.5312 | 0.5312 | 41.0 | 41.0 | 73.0 | 0.5616 | 0.5616 | 39.0 | 39.0 | 78.0 | 0.5 | 0.5 | 42.0 | 43.0 | 83.0 | 0.5181 | 0.5060 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 98 | 5.1286 | 0.006 | 2212.3174 | 1533.4615 | 142.0 | 299.0 | 0.4749 | 142.0 | 0.4749 | 24.0 | 24.0 | 64.0 | 0.375 | 0.375 | 40.0 | 40.0 | 73.0 | 0.5479 | 0.5479 | 43.0 | 43.0 | 78.0 | 0.5513 | 0.5513 | 34.0 | 34.0 | 83.0 | 0.4096 | 0.4096 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0001 | 8.0 | 112 | 3.9982 | 0.006 | 1724.6663 | 1195.4476 | 158.0 | 299.0 | 0.5284 | 158.0 | 0.5284 | 30.0 | 30.0 | 64.0 | 0.4688 | 0.4688 | 38.0 | 38.0 | 73.0 | 0.5205 | 0.5205 | 45.0 | 45.0 | 78.0 | 0.5769 | 0.5769 | 44.0 | 44.0 | 83.0 | 0.5301 | 0.5301 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0001 | 9.0 | 126 | 4.6491 | 0.006 | 2005.4651 | 1390.0825 | 143.0 | 299.0 | 0.4783 | 143.0 | 0.4783 | 17.0 | 17.0 | 64.0 | 0.2656 | 0.2656 | 42.0 | 42.0 | 73.0 | 0.5753 | 0.5753 | 49.0 | 49.0 | 78.0 | 0.6282 | 0.6282 | 34.0 | 34.0 | 83.0 | 0.4096 | 0.4096 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 10.0 | 140 | 5.2947 | 0.006 | 2283.9367 | 1583.1043 | 146.0 | 299.0 | 0.4883 | 146.0 | 0.4883 | 21.0 | 21.0 | 64.0 | 0.3281 | 0.3281 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 45.0 | 45.0 | 78.0 | 0.5769 | 0.5769 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 11.0 | 154 | 5.4408 | 0.006 | 2346.9894 | 1626.8091 | 146.0 | 299.0 | 0.4883 | 146.0 | 0.4883 | 22.0 | 22.0 | 64.0 | 0.3438 | 0.3438 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 44.0 | 44.0 | 78.0 | 0.5641 | 0.5641 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 12.0 | 168 | 5.4632 | 0.006 | 2356.6321 | 1633.4929 | 144.0 | 299.0 | 0.4816 | 144.0 | 0.4816 | 20.0 | 20.0 | 64.0 | 0.3125 | 0.3125 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 44.0 | 44.0 | 78.0 | 0.5641 | 0.5641 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 13.0 | 182 | 5.5015 | 0.006 | 2373.1372 | 1644.9333 | 143.0 | 299.0 | 0.4783 | 143.0 | 0.4783 | 19.0 | 19.0 | 64.0 | 0.2969 | 0.2969 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 44.0 | 44.0 | 78.0 | 0.5641 | 0.5641 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 14.0 | 196 | 5.5418 | 0.006 | 2390.5593 | 1657.0095 | 142.0 | 299.0 | 0.4749 | 142.0 | 0.4749 | 19.0 | 19.0 | 64.0 | 0.2969 | 0.2969 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 44.0 | 44.0 | 78.0 | 0.5641 | 0.5641 | 35.0 | 35.0 | 83.0 | 0.4217 | 0.4217 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 15.0 | 210 | 5.5438 | 0.006 | 2391.4120 | 1657.6005 | 144.0 | 299.0 | 0.4816 | 144.0 | 0.4816 | 19.0 | 19.0 | 64.0 | 0.2969 | 0.2969 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 45.0 | 45.0 | 78.0 | 0.5769 | 0.5769 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 16.0 | 224 | 5.5529 | 0.006 | 2395.3357 | 1660.3202 | 143.0 | 299.0 | 0.4783 | 143.0 | 0.4783 | 19.0 | 19.0 | 64.0 | 0.2969 | 0.2969 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 44.0 | 44.0 | 78.0 | 0.5641 | 0.5641 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 17.0 | 238 | 5.6177 | 0.006 | 2423.3017 | 1679.7047 | 142.0 | 299.0 | 0.4749 | 142.0 | 0.4749 | 18.0 | 18.0 | 64.0 | 0.2812 | 0.2812 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 44.0 | 44.0 | 78.0 | 0.5641 | 0.5641 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 18.0 | 252 | 5.6049 | 0.006 | 2417.7511 | 1675.8574 | 144.0 | 299.0 | 0.4816 | 144.0 | 0.4816 | 20.0 | 20.0 | 64.0 | 0.3125 | 0.3125 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 44.0 | 44.0 | 78.0 | 0.5641 | 0.5641 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 19.0 | 266 | 5.6022 | 0.006 | 2416.5775 | 1675.0439 | 145.0 | 299.0 | 0.4849 | 145.0 | 0.4849 | 21.0 | 21.0 | 64.0 | 0.3281 | 0.3281 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 45.0 | 45.0 | 78.0 | 0.5769 | 0.5769 | 35.0 | 35.0 | 83.0 | 0.4217 | 0.4217 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 20.0 | 280 | 5.6090 | 0.006 | 2419.5486 | 1677.1033 | 145.0 | 299.0 | 0.4849 | 145.0 | 0.4849 | 21.0 | 21.0 | 64.0 | 0.3281 | 0.3281 | 43.0 | 43.0 | 73.0 | 0.5890 | 0.5890 | 45.0 | 45.0 | 78.0 | 0.5769 | 0.5769 | 36.0 | 36.0 | 83.0 | 0.4337 | 0.4337 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
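Training was scheduled for 100 epochs but the log stops at epoch 20, and the headline metrics match the epoch-8 row (the one with the most correct predictions), which suggests early stopping with the best checkpoint restored at the end. A quick sanity check of the reported accuracy against the per-label counts from that row:

```python
# Per-label correct predictions and label totals from the epoch-8 row.
correct = {32: 30, 33: 38, 34: 45, 35: 44, 36: 1}
totals = {32: 64, 33: 73, 34: 78, 35: 83, 36: 1}
assert sum(correct.values()) == 158 and sum(totals.values()) == 299
print(round(sum(correct.values()) / sum(totals.values()), 4))  # 0.5284
```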
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
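A pinned install along these lines should reproduce the environment above (the `+cu124` PyTorch build comes from the PyTorch CUDA wheel index; plain `pip` installs the default build):

```
pip install "transformers==4.51.3" "torch==2.6.0" "datasets==3.5.0" "tokenizers==0.21.1"
```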