# ARC-Easy_Llama-3.2-1B-yn0mux6w

This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.7219
- Model Preparation Time: 0.0062
- Mdl: 2238.3307
- Accumulated Loss: 1551.4926
- Correct Preds: 386.0
- Total Preds: 570.0
- Accuracy: 0.6772
- Correct Gen Preds: 367.0
- Gen Accuracy: 0.6439
- Correct Gen Preds 32: 95.0
- Correct Preds 32: 106.0
- Total Labels 32: 158.0
- Accuracy 32: 0.6709
- Gen Accuracy 32: 0.6013
- Correct Gen Preds 33: 101.0
- Correct Preds 33: 103.0
- Total Labels 33: 152.0
- Accuracy 33: 0.6776
- Gen Accuracy 33: 0.6645
- Correct Gen Preds 34: 102.0
- Correct Preds 34: 106.0
- Total Labels 34: 142.0
- Accuracy 34: 0.7465
- Gen Accuracy 34: 0.7183
- Correct Gen Preds 35: 69.0
- Correct Preds 35: 71.0
- Total Labels 35: 118.0
- Accuracy 35: 0.6017
- Gen Accuracy 35: 0.5847
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 0.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
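The headline metrics above are internally consistent: accuracy is correct predictions over total predictions, the accumulated loss is the mean per-example loss times the number of evaluation examples (up to rounding), and the MDL figure appears to be the accumulated loss converted from nats to bits. A quick sketch that recomputes them from the reported counts (the per-label suffixes 32–36 are assumed to be internal label ids for the answer choices; that is a guess, not stated in the card):

```python
import math

# Reported evaluation numbers from the card.
correct_preds, total_preds = 386.0, 570.0
correct_gen_preds = 367.0
eval_loss = 2.7219            # mean cross-entropy per example, in nats
accumulated_loss = 1551.4926  # summed cross-entropy over the eval set
mdl = 2238.3307               # description length, in bits

# Accuracy is simply correct / total.
accuracy = correct_preds / total_preds          # ≈ 0.6772
gen_accuracy = correct_gen_preds / total_preds  # ≈ 0.6439

# Accumulated loss ≈ (mean loss) × (number of eval examples).
approx_accumulated = eval_loss * total_preds    # ≈ 1551.48

# MDL in bits ≈ accumulated loss in nats divided by ln 2.
approx_mdl = accumulated_loss / math.log(2)     # ≈ 2238.33

print(round(accuracy, 4), round(gen_accuracy, 4))
print(round(approx_accumulated, 1), round(approx_mdl, 1))
```

"Gen" accuracy counts only answers the model actually generates, which is why it trails the (likelihood-ranked) accuracy.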
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
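The cosine schedule with a 1% warmup ratio listed above can be sketched in plain Python. This is a hand-rolled approximation of what `transformers`' cosine-with-warmup scheduler computes, not the actual training code:

```python
import math

def lr_at_step(step: int, total_steps: int,
               base_lr: float = 2e-05, warmup_ratio: float = 0.01) -> float:
    """Linear warmup for the first warmup_ratio of steps, then cosine decay to 0."""
    warmup_steps = max(1, int(total_steps * warmup_ratio))
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# With num_epochs=100 and one optimizer step per epoch (as the step column in
# the results table suggests), total_steps would be 100:
print(lr_at_step(0, 100))    # 0.0 — start of warmup
print(lr_at_step(1, 100))    # 2e-05 — peak; warmup is a single step here
print(lr_at_step(100, 100))  # ~0.0 — fully decayed
```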
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0062 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3774 | 1.0 | 1 | 1.5354 | 0.0062 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3774 | 2.0 | 2 | 2.5968 | 0.0062 | 2135.4404 | 1480.1745 | 155.0 | 570.0 | 0.2719 | 155.0 | 0.2719 | 0.0 | 0.0 | 158.0 | 0.0 | 0.0 | 152.0 | 152.0 | 152.0 | 1.0 | 1.0 | 3.0 | 3.0 | 142.0 | 0.0211 | 0.0211 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9382 | 3.0 | 3 | 1.6496 | 0.0062 | 1356.5201 | 940.2681 | 222.0 | 570.0 | 0.3895 | 222.0 | 0.3895 | 109.0 | 109.0 | 158.0 | 0.6899 | 0.6899 | 112.0 | 112.0 | 152.0 | 0.7368 | 0.7368 | 1.0 | 1.0 | 142.0 | 0.0070 | 0.0070 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6757 | 4.0 | 4 | 1.5680 | 0.0062 | 1289.4396 | 893.7715 | 264.0 | 570.0 | 0.4632 | 263.0 | 0.4614 | 146.0 | 147.0 | 158.0 | 0.9304 | 0.9241 | 20.0 | 20.0 | 152.0 | 0.1316 | 0.1316 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 27.0 | 27.0 | 118.0 | 0.2288 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.2038 | 5.0 | 5 | 1.2887 | 0.0062 | 1059.7154 | 734.5388 | 383.0 | 570.0 | 0.6719 | 374.0 | 0.6561 | 104.0 | 108.0 | 158.0 | 0.6835 | 0.6582 | 93.0 | 96.0 | 152.0 | 0.6316 | 0.6118 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 69.0 | 71.0 | 118.0 | 0.6017 | 0.5847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0105 | 6.0 | 6 | 2.0876 | 0.0062 | 1716.6885 | 1189.9178 | 384.0 | 570.0 | 0.6737 | 369.0 | 0.6474 | 95.0 | 105.0 | 158.0 | 0.6646 | 0.6013 | 100.0 | 102.0 | 152.0 | 0.6711 | 0.6579 | 104.0 | 106.0 | 142.0 | 0.7465 | 0.7324 | 70.0 | 71.0 | 118.0 | 0.6017 | 0.5932 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 7 | 2.7219 | 0.0062 | 2238.3307 | 1551.4926 | 386.0 | 570.0 | 0.6772 | 367.0 | 0.6439 | 95.0 | 106.0 | 158.0 | 0.6709 | 0.6013 | 101.0 | 103.0 | 152.0 | 0.6776 | 0.6645 | 102.0 | 106.0 | 142.0 | 0.7465 | 0.7183 | 69.0 | 71.0 | 118.0 | 0.6017 | 0.5847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 8 | 3.1286 | 0.0062 | 2572.7573 | 1783.2995 | 386.0 | 570.0 | 0.6772 | 365.0 | 0.6404 | 94.0 | 104.0 | 158.0 | 0.6582 | 0.5949 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 99.0 | 104.0 | 142.0 | 0.7324 | 0.6972 | 68.0 | 71.0 | 118.0 | 0.6017 | 0.5763 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 9 | 3.4176 | 0.0062 | 2810.4409 | 1948.0492 | 383.0 | 570.0 | 0.6719 | 357.0 | 0.6263 | 86.0 | 101.0 | 158.0 | 0.6392 | 0.5443 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 98.0 | 103.0 | 142.0 | 0.7254 | 0.6901 | 69.0 | 72.0 | 118.0 | 0.6102 | 0.5847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 10 | 3.5967 | 0.0062 | 2957.6771 | 2050.1055 | 386.0 | 570.0 | 0.6772 | 356.0 | 0.6246 | 82.0 | 101.0 | 158.0 | 0.6392 | 0.5190 | 107.0 | 109.0 | 152.0 | 0.7171 | 0.7039 | 100.0 | 105.0 | 142.0 | 0.7394 | 0.7042 | 67.0 | 71.0 | 118.0 | 0.6017 | 0.5678 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 11 | 3.7561 | 0.0062 | 3088.7869 | 2140.9839 | 380.0 | 570.0 | 0.6667 | 350.0 | 0.6140 | 79.0 | 98.0 | 158.0 | 0.6203 | 0.5 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 99.0 | 104.0 | 142.0 | 0.7324 | 0.6972 | 66.0 | 70.0 | 118.0 | 0.5932 | 0.5593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 12 | 3.8571 | 0.0062 | 3171.8705 | 2198.5731 | 379.0 | 570.0 | 0.6649 | 347.0 | 0.6088 | 76.0 | 97.0 | 158.0 | 0.6139 | 0.4810 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 99.0 | 105.0 | 142.0 | 0.7394 | 0.6972 | 66.0 | 69.0 | 118.0 | 0.5847 | 0.5593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 13 | 3.9345 | 0.0062 | 3235.4696 | 2242.6566 | 378.0 | 570.0 | 0.6632 | 348.0 | 0.6105 | 77.0 | 95.0 | 158.0 | 0.6013 | 0.4873 | 106.0 | 109.0 | 152.0 | 0.7171 | 0.6974 | 99.0 | 105.0 | 142.0 | 0.7394 | 0.6972 | 66.0 | 69.0 | 118.0 | 0.5847 | 0.5593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 14 | 3.9977 | 0.0062 | 3287.4322 | 2278.6744 | 378.0 | 570.0 | 0.6632 | 345.0 | 0.6053 | 75.0 | 96.0 | 158.0 | 0.6076 | 0.4747 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 99.0 | 105.0 | 142.0 | 0.7394 | 0.6972 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 15 | 4.0354 | 0.0062 | 3318.4791 | 2300.1944 | 379.0 | 570.0 | 0.6649 | 345.0 | 0.6053 | 76.0 | 96.0 | 158.0 | 0.6076 | 0.4810 | 107.0 | 109.0 | 152.0 | 0.7171 | 0.7039 | 97.0 | 105.0 | 142.0 | 0.7394 | 0.6831 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 16 | 4.0486 | 0.0062 | 3329.3097 | 2307.7017 | 375.0 | 570.0 | 0.6579 | 339.0 | 0.5947 | 72.0 | 96.0 | 158.0 | 0.6076 | 0.4557 | 107.0 | 109.0 | 152.0 | 0.7171 | 0.7039 | 95.0 | 101.0 | 142.0 | 0.7113 | 0.6690 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 17 | 4.1223 | 0.0062 | 3389.9024 | 2349.7013 | 376.0 | 570.0 | 0.6596 | 340.0 | 0.5965 | 72.0 | 96.0 | 158.0 | 0.6076 | 0.4557 | 106.0 | 109.0 | 152.0 | 0.7171 | 0.6974 | 96.0 | 102.0 | 142.0 | 0.7183 | 0.6761 | 66.0 | 69.0 | 118.0 | 0.5847 | 0.5593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 18 | 4.0992 | 0.0062 | 3370.9264 | 2336.5481 | 375.0 | 570.0 | 0.6579 | 338.0 | 0.5930 | 72.0 | 96.0 | 158.0 | 0.6076 | 0.4557 | 105.0 | 108.0 | 152.0 | 0.7105 | 0.6908 | 97.0 | 103.0 | 142.0 | 0.7254 | 0.6831 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 19 | 4.1257 | 0.0062 | 3392.7407 | 2351.6686 | 378.0 | 570.0 | 0.6632 | 340.0 | 0.5965 | 71.0 | 95.0 | 158.0 | 0.6013 | 0.4494 | 107.0 | 109.0 | 152.0 | 0.7171 | 0.7039 | 97.0 | 105.0 | 142.0 | 0.7394 | 0.6831 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 20 | 4.1234 | 0.0062 | 3390.8098 | 2350.3302 | 378.0 | 570.0 | 0.6632 | 339.0 | 0.5947 | 70.0 | 96.0 | 158.0 | 0.6076 | 0.4430 | 107.0 | 109.0 | 152.0 | 0.7171 | 0.7039 | 97.0 | 104.0 | 142.0 | 0.7324 | 0.6831 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 21 | 4.1404 | 0.0062 | 3404.8216 | 2360.0425 | 376.0 | 570.0 | 0.6596 | 338.0 | 0.5930 | 71.0 | 96.0 | 158.0 | 0.6076 | 0.4494 | 107.0 | 109.0 | 152.0 | 0.7171 | 0.7039 | 96.0 | 103.0 | 142.0 | 0.7254 | 0.6761 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 22 | 4.1645 | 0.0062 | 3424.6124 | 2373.7604 | 376.0 | 570.0 | 0.6596 | 339.0 | 0.5947 | 70.0 | 95.0 | 158.0 | 0.6013 | 0.4430 | 108.0 | 109.0 | 152.0 | 0.7171 | 0.7105 | 96.0 | 103.0 | 142.0 | 0.7254 | 0.6761 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 23 | 4.1592 | 0.0062 | 3420.2355 | 2370.7266 | 379.0 | 570.0 | 0.6649 | 341.0 | 0.5982 | 71.0 | 96.0 | 158.0 | 0.6076 | 0.4494 | 106.0 | 109.0 | 152.0 | 0.7171 | 0.6974 | 98.0 | 105.0 | 142.0 | 0.7394 | 0.6901 | 66.0 | 69.0 | 118.0 | 0.5847 | 0.5593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 24 | 4.1565 | 0.0062 | 3418.0247 | 2369.1942 | 378.0 | 570.0 | 0.6632 | 340.0 | 0.5965 | 70.0 | 96.0 | 158.0 | 0.6076 | 0.4430 | 107.0 | 109.0 | 152.0 | 0.7171 | 0.7039 | 99.0 | 105.0 | 142.0 | 0.7394 | 0.6972 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 25 | 4.1931 | 0.0062 | 3448.1315 | 2390.0626 | 376.0 | 570.0 | 0.6596 | 341.0 | 0.5982 | 70.0 | 95.0 | 158.0 | 0.6013 | 0.4430 | 107.0 | 108.0 | 152.0 | 0.7105 | 0.7039 | 99.0 | 104.0 | 142.0 | 0.7324 | 0.6972 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 26 | 4.1936 | 0.0062 | 3448.5798 | 2390.3734 | 372.0 | 570.0 | 0.6526 | 336.0 | 0.5895 | 71.0 | 95.0 | 158.0 | 0.6013 | 0.4494 | 105.0 | 108.0 | 152.0 | 0.7105 | 0.6908 | 96.0 | 101.0 | 142.0 | 0.7113 | 0.6761 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 27 | 4.1744 | 0.0062 | 3432.7320 | 2379.3885 | 376.0 | 570.0 | 0.6596 | 338.0 | 0.5930 | 70.0 | 96.0 | 158.0 | 0.6076 | 0.4430 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 98.0 | 104.0 | 142.0 | 0.7324 | 0.6901 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 28 | 4.1920 | 0.0062 | 3447.2556 | 2389.4555 | 378.0 | 570.0 | 0.6632 | 341.0 | 0.5982 | 71.0 | 96.0 | 158.0 | 0.6076 | 0.4494 | 107.0 | 109.0 | 152.0 | 0.7171 | 0.7039 | 98.0 | 104.0 | 142.0 | 0.7324 | 0.6901 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 29 | 4.1905 | 0.0062 | 3446.0259 | 2388.6031 | 376.0 | 570.0 | 0.6596 | 339.0 | 0.5947 | 70.0 | 96.0 | 158.0 | 0.6076 | 0.4430 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 99.0 | 105.0 | 142.0 | 0.7394 | 0.6972 | 64.0 | 67.0 | 118.0 | 0.5678 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 30 | 4.1922 | 0.0062 | 3447.3786 | 2389.5408 | 377.0 | 570.0 | 0.6614 | 339.0 | 0.5947 | 70.0 | 95.0 | 158.0 | 0.6013 | 0.4430 | 106.0 | 109.0 | 152.0 | 0.7171 | 0.6974 | 99.0 | 104.0 | 142.0 | 0.7324 | 0.6972 | 64.0 | 69.0 | 118.0 | 0.5847 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 31 | 4.1992 | 0.0062 | 3453.1486 | 2393.5402 | 378.0 | 570.0 | 0.6632 | 339.0 | 0.5947 | 70.0 | 96.0 | 158.0 | 0.6076 | 0.4430 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 98.0 | 105.0 | 142.0 | 0.7394 | 0.6901 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 32 | 4.1634 | 0.0062 | 3423.6761 | 2373.1115 | 373.0 | 570.0 | 0.6544 | 338.0 | 0.5930 | 69.0 | 94.0 | 158.0 | 0.5949 | 0.4367 | 107.0 | 108.0 | 152.0 | 0.7105 | 0.7039 | 98.0 | 103.0 | 142.0 | 0.7254 | 0.6901 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 33 | 4.1932 | 0.0062 | 3448.1967 | 2390.1078 | 376.0 | 570.0 | 0.6596 | 338.0 | 0.5930 | 70.0 | 95.0 | 158.0 | 0.6013 | 0.4430 | 106.0 | 109.0 | 152.0 | 0.7171 | 0.6974 | 97.0 | 103.0 | 142.0 | 0.7254 | 0.6831 | 65.0 | 69.0 | 118.0 | 0.5847 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 34 | 4.2044 | 0.0062 | 3457.3908 | 2396.4807 | 376.0 | 570.0 | 0.6596 | 340.0 | 0.5965 | 70.0 | 95.0 | 158.0 | 0.6013 | 0.4430 | 107.0 | 108.0 | 152.0 | 0.7105 | 0.7039 | 99.0 | 105.0 | 142.0 | 0.7394 | 0.6972 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 35 | 4.1960 | 0.0062 | 3450.5312 | 2391.7260 | 376.0 | 570.0 | 0.6596 | 340.0 | 0.5965 | 71.0 | 95.0 | 158.0 | 0.6013 | 0.4494 | 106.0 | 109.0 | 152.0 | 0.7171 | 0.6974 | 99.0 | 104.0 | 142.0 | 0.7324 | 0.6972 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 36 | 4.2146 | 0.0062 | 3465.8272 | 2402.3283 | 376.0 | 570.0 | 0.6596 | 338.0 | 0.5930 | 70.0 | 94.0 | 158.0 | 0.5949 | 0.4430 | 106.0 | 109.0 | 152.0 | 0.7171 | 0.6974 | 98.0 | 105.0 | 142.0 | 0.7394 | 0.6901 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 37.0 | 37 | 4.2056 | 0.0062 | 3458.4119 | 2397.1884 | 374.0 | 570.0 | 0.6561 | 334.0 | 0.5860 | 69.0 | 94.0 | 158.0 | 0.5949 | 0.4367 | 105.0 | 108.0 | 152.0 | 0.7105 | 0.6908 | 96.0 | 104.0 | 142.0 | 0.7324 | 0.6761 | 64.0 | 68.0 | 118.0 | 0.5763 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
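To approximate this environment, the versions above can be pinned with pip (a sketch; the `+cu124` PyTorch build would additionally need the CUDA 12.4 wheel index, and a plain `torch==2.6.0` install may resolve to a different build):

```shell
pip install "transformers==4.51.3" "torch==2.6.0" "datasets==3.5.0" "tokenizers==0.21.1"
```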
## Model tree for donoway/ARC-Easy_Llama-3.2-1B-yn0mux6w

Base model: [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B)