# ARC-Easy_Llama-3.2-1B-l3w1y2gt
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unspecified dataset (presumably ARC-Easy, given the model name). It achieves the following results on the evaluation set:
- Loss: 0.7675
- Model Preparation Time: 0.0056
- Mdl: 631.1500
- Accumulated Loss: 437.4799
- Correct Preds: 430.0
- Total Preds: 570.0
- Accuracy: 0.7544
- Correct Gen Preds: 430.0
- Gen Accuracy: 0.7544
- Correct Gen Preds 32: 120.0
- Correct Preds 32: 120.0
- Total Labels 32: 158.0
- Accuracy 32: 0.7595
- Gen Accuracy 32: 0.7595
- Correct Gen Preds 33: 118.0
- Correct Preds 33: 118.0
- Total Labels 33: 152.0
- Accuracy 33: 0.7763
- Gen Accuracy 33: 0.7763
- Correct Gen Preds 34: 110.0
- Correct Preds 34: 110.0
- Total Labels 34: 142.0
- Accuracy 34: 0.7746
- Gen Accuracy 34: 0.7746
- Correct Gen Preds 35: 82.0
- Correct Preds 35: 82.0
- Total Labels 35: 118.0
- Accuracy 35: 0.6949
- Gen Accuracy 35: 0.6949
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 0.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
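Several of these metrics are simple functions of one another: the accumulated loss is the summed cross-entropy over the evaluation set (in nats), dividing it by the number of predictions gives the reported loss, and dividing it by ln 2 converts it to the MDL in bits. The metrics suffixed 32–36 appear to be per-label breakdowns keyed by the token ID of the answer letter (in the Llama 3 tokenizer, tokens 32–36 correspond to "A" through "E"; label 36, i.e. "E", has no examples in this split). A minimal sketch verifying the arithmetic:

```python
import math

# Reported aggregate metrics (copied from the list above)
accumulated_loss = 437.4799   # summed cross-entropy over the eval set, in nats
total_preds = 570
correct_preds = 430

# Mean eval loss: accumulated loss averaged over predictions
print(f"Loss: {accumulated_loss / total_preds:.4f}")        # 0.7675

# MDL: the same accumulated loss expressed in bits (nats / ln 2)
print(f"MDL: {accumulated_loss / math.log(2):.4f}")         # 631.1500

# Accuracy: fraction of correct predictions
print(f"Accuracy: {correct_preds / total_preds:.4f}")       # 0.7544
```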
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.001
- num_epochs: 100
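The training script itself is not included in this card, but as a rough guide these values map onto `transformers.TrainingArguments` as follows (the output directory is a hypothetical placeholder, and the dataset/model wiring is assumed):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ARC-Easy_Llama-3.2-1B-l3w1y2gt",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    warmup_ratio=0.001,  # mirrors the card; a "constant" scheduler ignores warmup in practice
    num_train_epochs=100,
)
```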
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0056 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8193 | 1.0 | 17 | 0.8399 | 0.0056 | 690.6870 | 478.7477 | 401.0 | 570.0 | 0.7035 | 401.0 | 0.7035 | 106.0 | 106.0 | 158.0 | 0.6709 | 0.6709 | 106.0 | 106.0 | 152.0 | 0.6974 | 0.6974 | 103.0 | 103.0 | 142.0 | 0.7254 | 0.7254 | 86.0 | 86.0 | 118.0 | 0.7288 | 0.7288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3755 | 2.0 | 34 | 0.7675 | 0.0056 | 631.1500 | 437.4799 | 430.0 | 570.0 | 0.7544 | 430.0 | 0.7544 | 120.0 | 120.0 | 158.0 | 0.7595 | 0.7595 | 118.0 | 118.0 | 152.0 | 0.7763 | 0.7763 | 110.0 | 110.0 | 142.0 | 0.7746 | 0.7746 | 82.0 | 82.0 | 118.0 | 0.6949 | 0.6949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0673 | 3.0 | 51 | 0.9258 | 0.0056 | 761.2801 | 527.6791 | 425.0 | 570.0 | 0.7456 | 424.0 | 0.7439 | 113.0 | 114.0 | 158.0 | 0.7215 | 0.7152 | 123.0 | 123.0 | 152.0 | 0.8092 | 0.8092 | 113.0 | 113.0 | 142.0 | 0.7958 | 0.7958 | 75.0 | 75.0 | 118.0 | 0.6356 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0163 | 4.0 | 68 | 1.1686 | 0.0056 | 961.0022 | 666.1160 | 410.0 | 570.0 | 0.7193 | 410.0 | 0.7193 | 125.0 | 125.0 | 158.0 | 0.7911 | 0.7911 | 113.0 | 113.0 | 152.0 | 0.7434 | 0.7434 | 105.0 | 105.0 | 142.0 | 0.7394 | 0.7394 | 67.0 | 67.0 | 118.0 | 0.5678 | 0.5678 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 5.0 | 85 | 2.5405 | 0.0056 | 2089.1473 | 1448.0865 | 406.0 | 570.0 | 0.7123 | 406.0 | 0.7123 | 99.0 | 99.0 | 158.0 | 0.6266 | 0.6266 | 129.0 | 129.0 | 152.0 | 0.8487 | 0.8487 | 102.0 | 102.0 | 142.0 | 0.7183 | 0.7183 | 76.0 | 76.0 | 118.0 | 0.6441 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0219 | 6.0 | 102 | 2.1967 | 0.0056 | 1806.4444 | 1252.1318 | 418.0 | 570.0 | 0.7333 | 418.0 | 0.7333 | 127.0 | 127.0 | 158.0 | 0.8038 | 0.8038 | 105.0 | 105.0 | 152.0 | 0.6908 | 0.6908 | 110.0 | 110.0 | 142.0 | 0.7746 | 0.7746 | 76.0 | 76.0 | 118.0 | 0.6441 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 119 | 2.6483 | 0.0056 | 2177.7596 | 1509.5079 | 414.0 | 570.0 | 0.7263 | 410.0 | 0.7193 | 101.0 | 103.0 | 158.0 | 0.6519 | 0.6392 | 125.0 | 125.0 | 152.0 | 0.8224 | 0.8224 | 106.0 | 107.0 | 142.0 | 0.7535 | 0.7465 | 78.0 | 79.0 | 118.0 | 0.6695 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.185 | 8.0 | 136 | 2.2471 | 0.0056 | 1847.8903 | 1280.8600 | 415.0 | 570.0 | 0.7281 | 415.0 | 0.7281 | 129.0 | 129.0 | 158.0 | 0.8165 | 0.8165 | 123.0 | 123.0 | 152.0 | 0.8092 | 0.8092 | 101.0 | 101.0 | 142.0 | 0.7113 | 0.7113 | 62.0 | 62.0 | 118.0 | 0.5254 | 0.5254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 153 | 2.7019 | 0.0056 | 2221.8581 | 1540.0747 | 418.0 | 570.0 | 0.7333 | 417.0 | 0.7316 | 112.0 | 113.0 | 158.0 | 0.7152 | 0.7089 | 131.0 | 131.0 | 152.0 | 0.8618 | 0.8618 | 103.0 | 103.0 | 142.0 | 0.7254 | 0.7254 | 71.0 | 71.0 | 118.0 | 0.6017 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 170 | 2.7311 | 0.0056 | 2245.8859 | 1556.7295 | 418.0 | 570.0 | 0.7333 | 418.0 | 0.7333 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 122.0 | 122.0 | 152.0 | 0.8026 | 0.8026 | 106.0 | 106.0 | 142.0 | 0.7465 | 0.7465 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 187 | 2.7509 | 0.0056 | 2262.1718 | 1568.0180 | 419.0 | 570.0 | 0.7351 | 419.0 | 0.7351 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 121.0 | 121.0 | 152.0 | 0.7961 | 0.7961 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 204 | 2.7518 | 0.0056 | 2262.9076 | 1568.5280 | 419.0 | 570.0 | 0.7351 | 419.0 | 0.7351 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 121.0 | 121.0 | 152.0 | 0.7961 | 0.7961 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 221 | 2.7606 | 0.0056 | 2270.1255 | 1573.5311 | 419.0 | 570.0 | 0.7351 | 419.0 | 0.7351 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 121.0 | 121.0 | 152.0 | 0.7961 | 0.7961 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 238 | 2.7827 | 0.0056 | 2288.3338 | 1586.1521 | 420.0 | 570.0 | 0.7368 | 420.0 | 0.7368 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 122.0 | 122.0 | 152.0 | 0.8026 | 0.8026 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 255 | 2.7809 | 0.0056 | 2286.8709 | 1585.1381 | 419.0 | 570.0 | 0.7351 | 418.0 | 0.7333 | 115.0 | 116.0 | 158.0 | 0.7342 | 0.7278 | 121.0 | 121.0 | 152.0 | 0.7961 | 0.7961 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 272 | 2.7799 | 0.0056 | 2286.0110 | 1584.5421 | 419.0 | 570.0 | 0.7351 | 419.0 | 0.7351 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 121.0 | 121.0 | 152.0 | 0.7961 | 0.7961 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 289 | 2.7880 | 0.0056 | 2292.6855 | 1589.1685 | 420.0 | 570.0 | 0.7368 | 420.0 | 0.7368 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 122.0 | 122.0 | 152.0 | 0.8026 | 0.8026 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 306 | 2.8077 | 0.0056 | 2308.8692 | 1600.3861 | 420.0 | 570.0 | 0.7368 | 420.0 | 0.7368 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 122.0 | 122.0 | 152.0 | 0.8026 | 0.8026 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 323 | 2.8043 | 0.0056 | 2306.1110 | 1598.4744 | 419.0 | 570.0 | 0.7351 | 419.0 | 0.7351 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 121.0 | 121.0 | 152.0 | 0.7961 | 0.7961 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 340 | 2.8029 | 0.0056 | 2304.8923 | 1597.6296 | 419.0 | 570.0 | 0.7351 | 418.0 | 0.7333 | 115.0 | 116.0 | 158.0 | 0.7342 | 0.7278 | 121.0 | 121.0 | 152.0 | 0.7961 | 0.7961 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 357 | 2.8202 | 0.0056 | 2319.1354 | 1607.5022 | 419.0 | 570.0 | 0.7351 | 419.0 | 0.7351 | 116.0 | 116.0 | 158.0 | 0.7342 | 0.7342 | 121.0 | 121.0 | 152.0 | 0.7961 | 0.7961 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 374 | 2.8085 | 0.0056 | 2309.5375 | 1600.8494 | 420.0 | 570.0 | 0.7368 | 419.0 | 0.7351 | 115.0 | 116.0 | 158.0 | 0.7342 | 0.7278 | 122.0 | 122.0 | 152.0 | 0.8026 | 0.8026 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 74.0 | 118.0 | 0.6271 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
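The headline metrics at the top of this card correspond to the epoch-2 checkpoint (step 34), which has the best validation loss in the table; from epoch 5 onward the training loss collapses to ~0 while validation loss climbs above 2.5, indicating overfitting.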
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
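A minimal usage sketch for loading the checkpoint from the Hub (assuming the repo id `donoway/ARC-Easy_Llama-3.2-1B-l3w1y2gt`; the prompt format is an assumption, since the card does not document the training template):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "donoway/ARC-Easy_Llama-3.2-1B-l3w1y2gt"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# ARC-Easy-style multiple-choice prompt (illustrative only)
prompt = (
    "Question: Which gas do plants absorb from the atmosphere?\n"
    "A. oxygen\nB. carbon dioxide\nC. nitrogen\nD. helium\nAnswer:"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=1)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:]))
```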