# ARC-Easy_Llama-3.2-1B-6jgnsuv6
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unknown dataset (the model name and the 570-example evaluation set suggest the ARC-Easy validation split). It achieves the following results on the evaluation set:
- Loss: 0.8919
- Model Preparation Time: 0.0056
- MDL: 733.4736 bits
- Accumulated Loss: 508.4052 nats
- Correct Preds: 427.0
- Total Preds: 570.0
- Accuracy: 0.7491
- Correct Gen Preds: 427.0
- Gen Accuracy: 0.7491

Per-label breakdown (the numeric suffixes in the original metric names are label token IDs; in the Llama 3 tokenizer, IDs 32–36 correspond to the answer letters "A" through "E", and no "E"-labeled examples occur in this evaluation set):

| Label token | Total Labels | Correct Preds | Accuracy | Correct Gen Preds | Gen Accuracy |
|:---|---:|---:|---:|---:|---:|
| 32 | 158.0 | 129.0 | 0.8165 | 129.0 | 0.8165 |
| 33 | 152.0 | 108.0 | 0.7105 | 108.0 | 0.7105 |
| 34 | 142.0 | 115.0 | 0.8099 | 115.0 | 0.8099 |
| 35 | 118.0 | 75.0 | 0.6356 | 75.0 | 0.6356 |
| 36 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
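These figures are internally consistent: the accumulated loss is the per-example cross-entropy summed over the 570 evaluation examples (in nats), the reported Loss is its mean, and the MDL value appears to be the same sum converted to bits. A minimal arithmetic check using only the numbers above:

```python
import math

# Figures reported on the evaluation set above.
accumulated_loss_nats = 508.4052  # summed per-example cross-entropy
total_preds = 570
correct_preds = 427

# Mean loss per example -> the reported Loss of 0.8919.
print(accumulated_loss_nats / total_preds)    # 0.8919...

# Nats converted to bits -> the reported MDL of 733.4736.
print(accumulated_loss_nats / math.log(2))    # 733.4736...

# Correct over total -> the reported Accuracy of 0.7491.
print(correct_preds / total_preds)            # 0.7491...
```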
## Model description
More information needed
## Intended uses & limitations
More information needed
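No usage notes are documented, but the checkpoint loads like any other causal language model. A minimal inference sketch, assuming the standard transformers API; the multiple-choice prompt shown is purely illustrative, since the training prompt template is not recorded in this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "donoway/ARC-Easy_Llama-3.2-1B-6jgnsuv6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Hypothetical prompt format; the actual training template is undocumented.
prompt = (
    "Question: Which gas do plants absorb from the atmosphere?\n"
    "A. oxygen\nB. carbon dioxide\nC. nitrogen\nD. hydrogen\n"
    "Answer:"
)
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=2)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:]))
```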
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.001
- num_epochs: 100
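A sketch of how these settings map onto transformers `TrainingArguments`; `output_dir` is a placeholder, and anything not listed above, such as the early stopping that presumably ended training at epoch 17 despite num_epochs=100, is an assumption:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ARC-Easy_Llama-3.2-1B-6jgnsuv6",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=64,   # assumes a single device
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    warmup_ratio=0.001,
    num_train_epochs=100,
)
```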
### Training results

The headline metrics at the top of this card correspond to the epoch-3 checkpoint (validation loss 0.8919, accuracy 0.7491); beyond that point the training loss collapses toward zero while the validation loss climbs steadily, i.e. the model overfits.
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0056 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4726 | 1.0 | 25 | 0.8475 | 0.0056 | 696.9144 | 483.0642 | 394.0 | 570.0 | 0.6912 | 391.0 | 0.6860 | 87.0 | 90.0 | 158.0 | 0.5696 | 0.5506 | 104.0 | 104.0 | 152.0 | 0.6842 | 0.6842 | 109.0 | 109.0 | 142.0 | 0.7676 | 0.7676 | 91.0 | 91.0 | 118.0 | 0.7712 | 0.7712 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7886 | 2.0 | 50 | 0.7247 | 0.0056 | 595.9247 | 413.0635 | 415.0 | 570.0 | 0.7281 | 415.0 | 0.7281 | 133.0 | 133.0 | 158.0 | 0.8418 | 0.8418 | 107.0 | 107.0 | 152.0 | 0.7039 | 0.7039 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 82.0 | 82.0 | 118.0 | 0.6949 | 0.6949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1428 | 3.0 | 75 | 0.8919 | 0.0056 | 733.4736 | 508.4052 | 427.0 | 570.0 | 0.7491 | 427.0 | 0.7491 | 129.0 | 129.0 | 158.0 | 0.8165 | 0.8165 | 108.0 | 108.0 | 152.0 | 0.7105 | 0.7105 | 115.0 | 115.0 | 142.0 | 0.8099 | 0.8099 | 75.0 | 75.0 | 118.0 | 0.6356 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0066 | 4.0 | 100 | 1.4142 | 0.0056 | 1162.9830 | 806.1184 | 420.0 | 570.0 | 0.7368 | 403.0 | 0.7070 | 119.0 | 125.0 | 158.0 | 0.7911 | 0.7532 | 119.0 | 123.0 | 152.0 | 0.8092 | 0.7829 | 100.0 | 103.0 | 142.0 | 0.7254 | 0.7042 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0066 | 5.0 | 125 | 1.6364 | 0.0056 | 1345.6457 | 932.7305 | 406.0 | 570.0 | 0.7123 | 399.0 | 0.7000 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 101.0 | 101.0 | 152.0 | 0.6645 | 0.6645 | 106.0 | 106.0 | 142.0 | 0.7465 | 0.7465 | 85.0 | 86.0 | 118.0 | 0.7288 | 0.7203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 6.0 | 150 | 2.3995 | 0.0056 | 1973.1559 | 1367.6875 | 407.0 | 570.0 | 0.7140 | 392.0 | 0.6877 | 93.0 | 104.0 | 158.0 | 0.6582 | 0.5886 | 113.0 | 114.0 | 152.0 | 0.7500 | 0.7434 | 102.0 | 104.0 | 142.0 | 0.7324 | 0.7183 | 84.0 | 85.0 | 118.0 | 0.7203 | 0.7119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 175 | 2.5540 | 0.0056 | 2100.2596 | 1455.7890 | 414.0 | 570.0 | 0.7263 | 408.0 | 0.7158 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 117.0 | 117.0 | 152.0 | 0.7697 | 0.7697 | 102.0 | 102.0 | 142.0 | 0.7183 | 0.7183 | 81.0 | 82.0 | 118.0 | 0.6949 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 8.0 | 200 | 2.5711 | 0.0056 | 2114.2895 | 1465.5138 | 418.0 | 570.0 | 0.7333 | 410.0 | 0.7193 | 106.0 | 113.0 | 158.0 | 0.7152 | 0.6709 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 102.0 | 102.0 | 142.0 | 0.7183 | 0.7183 | 80.0 | 80.0 | 118.0 | 0.6780 | 0.6780 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 9.0 | 225 | 2.5896 | 0.0056 | 2129.5119 | 1476.0652 | 419.0 | 570.0 | 0.7351 | 410.0 | 0.7193 | 104.0 | 112.0 | 158.0 | 0.7089 | 0.6582 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 103.0 | 103.0 | 142.0 | 0.7254 | 0.7254 | 81.0 | 81.0 | 118.0 | 0.6864 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 250 | 2.6097 | 0.0056 | 2146.0783 | 1487.5481 | 419.0 | 570.0 | 0.7351 | 411.0 | 0.7211 | 105.0 | 112.0 | 158.0 | 0.7089 | 0.6646 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 102.0 | 102.0 | 142.0 | 0.7183 | 0.7183 | 82.0 | 82.0 | 118.0 | 0.6949 | 0.6949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 275 | 2.6133 | 0.0056 | 2149.0502 | 1489.6081 | 419.0 | 570.0 | 0.7351 | 411.0 | 0.7211 | 105.0 | 112.0 | 158.0 | 0.7089 | 0.6646 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 103.0 | 103.0 | 142.0 | 0.7254 | 0.7254 | 81.0 | 81.0 | 118.0 | 0.6864 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 300 | 2.6221 | 0.0056 | 2156.2876 | 1494.6247 | 418.0 | 570.0 | 0.7333 | 410.0 | 0.7193 | 105.0 | 112.0 | 158.0 | 0.7089 | 0.6646 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 102.0 | 102.0 | 142.0 | 0.7183 | 0.7183 | 81.0 | 81.0 | 118.0 | 0.6864 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 325 | 2.6192 | 0.0056 | 2153.8311 | 1492.9219 | 418.0 | 570.0 | 0.7333 | 410.0 | 0.7193 | 104.0 | 111.0 | 158.0 | 0.7025 | 0.6582 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 102.0 | 102.0 | 142.0 | 0.7183 | 0.7183 | 82.0 | 82.0 | 118.0 | 0.6949 | 0.6949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 350 | 2.6335 | 0.0056 | 2165.6088 | 1501.0857 | 419.0 | 570.0 | 0.7351 | 411.0 | 0.7211 | 106.0 | 113.0 | 158.0 | 0.7152 | 0.6709 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 102.0 | 102.0 | 142.0 | 0.7183 | 0.7183 | 81.0 | 81.0 | 118.0 | 0.6864 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 375 | 2.6250 | 0.0056 | 2158.6426 | 1496.2570 | 420.0 | 570.0 | 0.7368 | 412.0 | 0.7228 | 106.0 | 113.0 | 158.0 | 0.7152 | 0.6709 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 103.0 | 103.0 | 142.0 | 0.7254 | 0.7254 | 81.0 | 81.0 | 118.0 | 0.6864 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 400 | 2.6439 | 0.0056 | 2174.2071 | 1507.0456 | 419.0 | 570.0 | 0.7351 | 411.0 | 0.7211 | 105.0 | 112.0 | 158.0 | 0.7089 | 0.6646 | 122.0 | 123.0 | 152.0 | 0.8092 | 0.8026 | 103.0 | 103.0 | 142.0 | 0.7254 | 0.7254 | 81.0 | 81.0 | 118.0 | 0.6864 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 425 | 2.6435 | 0.0056 | 2173.8519 | 1506.7993 | 421.0 | 570.0 | 0.7386 | 413.0 | 0.7246 | 105.0 | 112.0 | 158.0 | 0.7089 | 0.6646 | 123.0 | 124.0 | 152.0 | 0.8158 | 0.8092 | 103.0 | 103.0 | 142.0 | 0.7254 | 0.7254 | 82.0 | 82.0 | 118.0 | 0.6949 | 0.6949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
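The card tracks both "Correct Preds" and "Correct Gen Preds", which suggests two scoring modes: ranking the candidate answer tokens by logit versus checking the token the model actually generates. A sketch of the logit-ranking variant, under the assumption (consistent with the per-label totals above) that label tokens 32–35 are the answer letters "A" through "D":

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "donoway/ARC-Easy_Llama-3.2-1B-6jgnsuv6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

def predict_choice(prompt: str, choice_token_ids=(32, 33, 34, 35)) -> int:
    """Pick the answer-letter token with the highest next-token logit."""
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0, -1]   # next-token logits
    choice_logits = logits[list(choice_token_ids)]
    return choice_token_ids[int(choice_logits.argmax())]

# A prediction counts as correct if the top-ranked choice token matches the
# label; "Gen" accuracy would instead require greedy generation to emit it.
```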
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1