# ARC-Easy_Llama-3.2-1B-jlf8qw0w
This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.6261
- Model Preparation Time: 0.0045
- Mdl: 2159.4969
- Accumulated Loss: 1496.8492
- Correct Preds: 372.0
- Total Preds: 570.0
- Accuracy: 0.6526
- Correct Gen Preds: 372.0
- Gen Accuracy: 0.6526
- Correct Gen Preds 32: 134.0
- Correct Preds 32: 134.0
- Total Labels 32: 158.0
- Accuracy 32: 0.8481
- Gen Accuracy 32: 0.8481
- Correct Gen Preds 33: 95.0
- Correct Preds 33: 95.0
- Total Labels 33: 152.0
- Accuracy 33: 0.625
- Gen Accuracy 33: 0.625
- Correct Gen Preds 34: 88.0
- Correct Preds 34: 88.0
- Total Labels 34: 142.0
- Accuracy 34: 0.6197
- Gen Accuracy 34: 0.6197
- Correct Gen Preds 35: 55.0
- Correct Preds 35: 55.0
- Total Labels 35: 118.0
- Accuracy 35: 0.4661
- Gen Accuracy 35: 0.4661
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 0.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
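The headline numbers above are internally consistent, which is a quick sanity check worth making explicit. A minimal sketch (variable names are ours; the reading of "Mdl" as the accumulated loss converted from nats to bits is an assumption inferred from the numbers, not documented by the training code):

```python
import math

# Eval-set numbers reported above (names are ours, not from the training code)
correct, total = 372, 570
accuracy = correct / total
assert round(accuracy, 4) == 0.6526

# Assumption: "Mdl" is the accumulated cross-entropy converted to bits,
# i.e. accumulated_loss (nats) = mdl (bits) * ln(2)
mdl_bits = 2159.4969
acc_loss_nats = 1496.8492
assert abs(mdl_bits * math.log(2) - acc_loss_nats) < 0.01

# The mean eval loss is the accumulated loss averaged over all 570 predictions
assert abs(acc_loss_nats / total - 2.6261) < 0.001
```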
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
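For reference, the hyperparameters above can be collected into a plain configuration dict. This is an illustrative sketch (the key names follow `transformers.TrainingArguments` conventions, but the exact training script is not published); the step arithmetic assumes one optimization step per epoch, as the results table logs:

```python
import math

# Hedged sketch of the reported hyperparameters as a config dict.
# Key names mirror transformers.TrainingArguments; this is illustrative only.
hparams = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 64,
    "per_device_eval_batch_size": 112,
    "seed": 42,
    "lr_scheduler_type": "cosine",
    "warmup_ratio": 0.01,
    "num_train_epochs": 100,
}

# The results table logs one optimization step per epoch (step == epoch),
# so a full run would be ~100 steps; warmup_ratio=0.01 then gives one warmup step.
total_steps = hparams["num_train_epochs"] * 1  # assumes 1 step/epoch, as logged
warmup_steps = math.ceil(hparams["warmup_ratio"] * total_steps)
assert warmup_steps == 1
```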
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0045 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.435 | 1.0 | 1 | 1.5354 | 0.0045 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.435 | 2.0 | 2 | 2.1175 | 0.0045 | 1741.3294 | 1206.9975 | 184.0 | 570.0 | 0.3228 | 184.0 | 0.3228 | 0.0 | 0.0 | 158.0 | 0.0 | 0.0 | 151.0 | 151.0 | 152.0 | 0.9934 | 0.9934 | 33.0 | 33.0 | 142.0 | 0.2324 | 0.2324 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0061 | 3.0 | 3 | 1.2944 | 0.0045 | 1064.4094 | 737.7924 | 208.0 | 570.0 | 0.3649 | 208.0 | 0.3649 | 55.0 | 55.0 | 158.0 | 0.3481 | 0.3481 | 145.0 | 145.0 | 152.0 | 0.9539 | 0.9539 | 8.0 | 8.0 | 142.0 | 0.0563 | 0.0563 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9135 | 4.0 | 4 | 2.0963 | 0.0045 | 1723.8711 | 1194.8964 | 203.0 | 570.0 | 0.3561 | 203.0 | 0.3561 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 3.0 | 3.0 | 152.0 | 0.0197 | 0.0197 | 35.0 | 35.0 | 142.0 | 0.2465 | 0.2465 | 11.0 | 11.0 | 118.0 | 0.0932 | 0.0932 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5128 | 5.0 | 5 | 1.6210 | 0.0045 | 1333.0135 | 923.9746 | 319.0 | 570.0 | 0.5596 | 319.0 | 0.5596 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 60.0 | 60.0 | 152.0 | 0.3947 | 0.3947 | 80.0 | 80.0 | 142.0 | 0.5634 | 0.5634 | 38.0 | 38.0 | 118.0 | 0.3220 | 0.3220 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1014 | 6.0 | 6 | 1.6826 | 0.0045 | 1383.6259 | 959.0564 | 368.0 | 570.0 | 0.6456 | 367.0 | 0.6439 | 106.0 | 107.0 | 158.0 | 0.6772 | 0.6709 | 113.0 | 113.0 | 152.0 | 0.7434 | 0.7434 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0091 | 7.0 | 7 | 2.6261 | 0.0045 | 2159.4969 | 1496.8492 | 372.0 | 570.0 | 0.6526 | 372.0 | 0.6526 | 134.0 | 134.0 | 158.0 | 0.8481 | 0.8481 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 88.0 | 88.0 | 142.0 | 0.6197 | 0.6197 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 8 | 3.7800 | 0.0045 | 3108.3964 | 2154.5762 | 358.0 | 570.0 | 0.6281 | 358.0 | 0.6281 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 82.0 | 82.0 | 152.0 | 0.5395 | 0.5395 | 80.0 | 80.0 | 142.0 | 0.5634 | 0.5634 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 9 | 4.7384 | 0.0045 | 3896.5734 | 2700.8989 | 354.0 | 570.0 | 0.6211 | 354.0 | 0.6211 | 146.0 | 146.0 | 158.0 | 0.9241 | 0.9241 | 79.0 | 79.0 | 152.0 | 0.5197 | 0.5197 | 76.0 | 76.0 | 142.0 | 0.5352 | 0.5352 | 53.0 | 53.0 | 118.0 | 0.4492 | 0.4492 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 10 | 5.4116 | 0.0045 | 4450.1384 | 3084.6009 | 348.0 | 570.0 | 0.6105 | 348.0 | 0.6105 | 145.0 | 145.0 | 158.0 | 0.9177 | 0.9177 | 75.0 | 75.0 | 152.0 | 0.4934 | 0.4934 | 75.0 | 75.0 | 142.0 | 0.5282 | 0.5282 | 53.0 | 53.0 | 118.0 | 0.4492 | 0.4492 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 11 | 5.9142 | 0.0045 | 4863.4871 | 3371.1124 | 344.0 | 570.0 | 0.6035 | 344.0 | 0.6035 | 147.0 | 147.0 | 158.0 | 0.9304 | 0.9304 | 72.0 | 72.0 | 152.0 | 0.4737 | 0.4737 | 73.0 | 73.0 | 142.0 | 0.5141 | 0.5141 | 52.0 | 52.0 | 118.0 | 0.4407 | 0.4407 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 12 | 6.3347 | 0.0045 | 5209.2659 | 3610.7880 | 336.0 | 570.0 | 0.5895 | 336.0 | 0.5895 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 49.0 | 49.0 | 118.0 | 0.4153 | 0.4153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 13 | 6.5582 | 0.0045 | 5393.0187 | 3738.1557 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 148.0 | 148.0 | 158.0 | 0.9367 | 0.9367 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 48.0 | 48.0 | 118.0 | 0.4068 | 0.4068 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 14 | 6.7692 | 0.0045 | 5566.5687 | 3858.4514 | 329.0 | 570.0 | 0.5772 | 329.0 | 0.5772 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 65.0 | 65.0 | 152.0 | 0.4276 | 0.4276 | 67.0 | 67.0 | 142.0 | 0.4718 | 0.4718 | 48.0 | 48.0 | 118.0 | 0.4068 | 0.4068 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 15 | 6.9534 | 0.0045 | 5718.0129 | 3963.4245 | 323.0 | 570.0 | 0.5667 | 323.0 | 0.5667 | 148.0 | 148.0 | 158.0 | 0.9367 | 0.9367 | 63.0 | 63.0 | 152.0 | 0.4145 | 0.4145 | 65.0 | 65.0 | 142.0 | 0.4577 | 0.4577 | 47.0 | 47.0 | 118.0 | 0.3983 | 0.3983 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 16 | 7.1324 | 0.0045 | 5865.2522 | 4065.4830 | 324.0 | 570.0 | 0.5684 | 324.0 | 0.5684 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 63.0 | 63.0 | 152.0 | 0.4145 | 0.4145 | 66.0 | 66.0 | 142.0 | 0.4648 | 0.4648 | 46.0 | 46.0 | 118.0 | 0.3898 | 0.3898 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 17 | 7.2239 | 0.0045 | 5940.4461 | 4117.6035 | 323.0 | 570.0 | 0.5667 | 323.0 | 0.5667 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 63.0 | 63.0 | 152.0 | 0.4145 | 0.4145 | 65.0 | 65.0 | 142.0 | 0.4577 | 0.4577 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 18 | 7.3212 | 0.0045 | 6020.4799 | 4173.0787 | 322.0 | 570.0 | 0.5649 | 322.0 | 0.5649 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 63.0 | 63.0 | 152.0 | 0.4145 | 0.4145 | 64.0 | 64.0 | 142.0 | 0.4507 | 0.4507 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 19 | 7.3806 | 0.0045 | 6069.3113 | 4206.9260 | 323.0 | 570.0 | 0.5667 | 323.0 | 0.5667 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 63.0 | 63.0 | 152.0 | 0.4145 | 0.4145 | 65.0 | 65.0 | 142.0 | 0.4577 | 0.4577 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 20 | 7.4461 | 0.0045 | 6123.1916 | 4244.2730 | 319.0 | 570.0 | 0.5596 | 319.0 | 0.5596 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 64.0 | 64.0 | 142.0 | 0.4507 | 0.4507 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 21 | 7.4628 | 0.0045 | 6136.9611 | 4253.8173 | 320.0 | 570.0 | 0.5614 | 320.0 | 0.5614 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 64.0 | 64.0 | 142.0 | 0.4507 | 0.4507 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 22 | 7.5301 | 0.0045 | 6192.2567 | 4292.1453 | 319.0 | 570.0 | 0.5596 | 319.0 | 0.5596 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 64.0 | 64.0 | 142.0 | 0.4507 | 0.4507 | 44.0 | 44.0 | 118.0 | 0.3729 | 0.3729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 23 | 7.5338 | 0.0045 | 6195.2843 | 4294.2438 | 319.0 | 570.0 | 0.5596 | 319.0 | 0.5596 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 64.0 | 64.0 | 142.0 | 0.4507 | 0.4507 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 24 | 7.5580 | 0.0045 | 6215.2572 | 4308.0880 | 320.0 | 570.0 | 0.5614 | 320.0 | 0.5614 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 64.0 | 64.0 | 142.0 | 0.4507 | 0.4507 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 25 | 7.5289 | 0.0045 | 6191.2736 | 4291.4638 | 319.0 | 570.0 | 0.5596 | 319.0 | 0.5596 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 64.0 | 64.0 | 142.0 | 0.4507 | 0.4507 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 26 | 7.6133 | 0.0045 | 6260.6716 | 4339.5669 | 319.0 | 570.0 | 0.5596 | 319.0 | 0.5596 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 27 | 7.6015 | 0.0045 | 6250.9622 | 4332.8368 | 318.0 | 570.0 | 0.5579 | 318.0 | 0.5579 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 28 | 7.5912 | 0.0045 | 6242.5494 | 4327.0055 | 318.0 | 570.0 | 0.5579 | 318.0 | 0.5579 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 29 | 7.6034 | 0.0045 | 6252.5795 | 4333.9578 | 318.0 | 570.0 | 0.5579 | 318.0 | 0.5579 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 30 | 7.6187 | 0.0045 | 6265.1093 | 4342.6429 | 316.0 | 570.0 | 0.5544 | 316.0 | 0.5544 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 60.0 | 60.0 | 152.0 | 0.3947 | 0.3947 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 44.0 | 44.0 | 118.0 | 0.3729 | 0.3729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 31 | 7.6249 | 0.0045 | 6270.2339 | 4346.1949 | 318.0 | 570.0 | 0.5579 | 318.0 | 0.5579 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 32 | 7.5821 | 0.0045 | 6235.0026 | 4321.7745 | 318.0 | 570.0 | 0.5579 | 318.0 | 0.5579 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 33 | 7.6294 | 0.0045 | 6273.9284 | 4348.7558 | 318.0 | 570.0 | 0.5579 | 318.0 | 0.5579 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 34 | 7.6133 | 0.0045 | 6260.6774 | 4339.5709 | 319.0 | 570.0 | 0.5596 | 319.0 | 0.5596 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 59.0 | 59.0 | 152.0 | 0.3882 | 0.3882 | 65.0 | 65.0 | 142.0 | 0.4577 | 0.4577 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 35 | 7.6097 | 0.0045 | 6257.7566 | 4337.5464 | 319.0 | 570.0 | 0.5596 | 319.0 | 0.5596 | 150.0 | 150.0 | 158.0 | 0.9494 | 0.9494 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 64.0 | 64.0 | 142.0 | 0.4507 | 0.4507 | 44.0 | 44.0 | 118.0 | 0.3729 | 0.3729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 36 | 7.6376 | 0.0045 | 6280.6414 | 4353.4089 | 316.0 | 570.0 | 0.5544 | 316.0 | 0.5544 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 60.0 | 60.0 | 152.0 | 0.3947 | 0.3947 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 44.0 | 44.0 | 118.0 | 0.3729 | 0.3729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 37.0 | 37 | 7.6398 | 0.0045 | 6282.5078 | 4354.7025 | 318.0 | 570.0 | 0.5579 | 318.0 | 0.5579 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 61.0 | 61.0 | 152.0 | 0.4013 | 0.4013 | 63.0 | 63.0 | 142.0 | 0.4437 | 0.4437 | 45.0 | 45.0 | 118.0 | 0.3814 | 0.3814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
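The reported checkpoint corresponds to the epoch-7 row, where overall accuracy peaks at 0.6526 before the validation loss diverges. Its per-label counts sum exactly to the overall totals, as this sketch checks (labels 32-36 appear to be internal answer-choice ids; that reading is our assumption, not documented in the card):

```python
# Epoch-7 (best-accuracy) per-label (correct, total) counts from the table above.
# Labels 32-36 look like internal answer-choice ids -- our reading, not documented.
per_label = {
    32: (134, 158),
    33: (95, 152),
    34: (88, 142),
    35: (55, 118),
    36: (0, 0),  # no examples with this label in the eval split
}

correct = sum(c for c, _ in per_label.values())
total = sum(t for _, t in per_label.values())
assert (correct, total) == (372, 570)  # matches the overall eval numbers

per_label_acc = {k: (c / t if t else 0.0) for k, (c, t) in per_label.items()}
assert round(per_label_acc[32], 4) == 0.8481
```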
## Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
Model tree for donoway/ARC-Easy_Llama-3.2-1B-jlf8qw0w:
- Base model: meta-llama/Llama-3.2-1B