# ARC-Easy_Llama-3.2-1B-arerciw5
This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset (the model name suggests ARC-Easy). It achieves the following results on the evaluation set:
- Loss: 3.6297
- Model Preparation Time: 0.0069
- Mdl: 2984.8081
- Accumulated Loss: 2068.9113
- Correct Preds: 381.0
- Total Preds: 570.0
- Accuracy: 0.6684
- Correct Gen Preds: 368.0
- Gen Accuracy: 0.6456
- Correct Gen Preds 32: 108.0
- Correct Preds 32: 117.0
- Total Labels 32: 158.0
- Accuracy 32: 0.7405
- Gen Accuracy 32: 0.6835
- Correct Gen Preds 33: 116.0
- Correct Preds 33: 116.0
- Total Labels 33: 152.0
- Accuracy 33: 0.7632
- Gen Accuracy 33: 0.7632
- Correct Gen Preds 34: 67.0
- Correct Preds 34: 70.0
- Total Labels 34: 142.0
- Accuracy 34: 0.4930
- Gen Accuracy 34: 0.4718
- Correct Gen Preds 35: 77.0
- Correct Preds 35: 78.0
- Total Labels 35: 118.0
- Accuracy 35: 0.6610
- Gen Accuracy 35: 0.6525
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 0.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
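The headline metrics above are internally consistent: the reported `Loss` is the accumulated loss averaged over all predictions, and `Mdl` is the accumulated loss (summed negative log-likelihood in nats) converted to bits. A minimal sketch checking this, using the numbers reported above:

```python
import math

# Final evaluation metrics as reported in this card
accumulated_loss = 2068.9113   # summed NLL over the eval set, in nats
total_preds = 570

# Mean per-example loss -> matches the reported Loss of 3.6297
mean_loss = accumulated_loss / total_preds

# Nats converted to bits -> matches the reported Mdl of 2984.8081
mdl_bits = accumulated_loss / math.log(2)
```

The same relation holds for every row of the training-results table below, so `Mdl` here is simply the description length of the labels under the model, in bits.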
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
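The learning-rate schedule these settings imply can be sketched as below. This is a hedged approximation of the cosine-with-warmup schedule in `transformers` (exact warmup-step rounding may differ from the library); the step counts are illustrative, based on the 2 optimizer steps per epoch visible in the log below:

```python
import math

def cosine_lr(step, total_steps, peak_lr=2e-05, warmup_ratio=0.01):
    """Linear warmup to peak_lr, then cosine decay toward zero.

    Approximates the transformers cosine scheduler configured above
    (lr_scheduler_type=cosine, lr_scheduler_warmup_ratio=0.01).
    """
    warmup_steps = max(1, int(total_steps * warmup_ratio))
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

At 2 steps per epoch, 100 epochs gives roughly 200 optimizer steps, so a 0.01 warmup ratio means warmup lasts only about 2 steps before the cosine decay begins.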
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0069 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3337 | 1.0 | 2 | 1.4204 | 0.0069 | 1168.0072 | 809.6009 | 243.0 | 570.0 | 0.4263 | 243.0 | 0.4263 | 0.0 | 0.0 | 158.0 | 0.0 | 0.0 | 91.0 | 91.0 | 152.0 | 0.5987 | 0.5987 | 134.0 | 134.0 | 142.0 | 0.9437 | 0.9437 | 18.0 | 18.0 | 118.0 | 0.1525 | 0.1525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9513 | 2.0 | 4 | 1.5711 | 0.0069 | 1291.9860 | 895.5365 | 297.0 | 570.0 | 0.5211 | 297.0 | 0.5211 | 149.0 | 149.0 | 158.0 | 0.9430 | 0.9430 | 72.0 | 72.0 | 152.0 | 0.4737 | 0.4737 | 47.0 | 47.0 | 142.0 | 0.3310 | 0.3310 | 29.0 | 29.0 | 118.0 | 0.2458 | 0.2458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5156 | 3.0 | 6 | 1.1942 | 0.0069 | 982.0490 | 680.7045 | 341.0 | 570.0 | 0.5982 | 341.0 | 0.5982 | 45.0 | 45.0 | 158.0 | 0.2848 | 0.2848 | 101.0 | 101.0 | 152.0 | 0.6645 | 0.6645 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 88.0 | 88.0 | 118.0 | 0.7458 | 0.7458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0236 | 4.0 | 8 | 1.9337 | 0.0069 | 1590.1228 | 1102.1891 | 377.0 | 570.0 | 0.6614 | 377.0 | 0.6614 | 93.0 | 93.0 | 158.0 | 0.5886 | 0.5886 | 100.0 | 100.0 | 152.0 | 0.6579 | 0.6579 | 100.0 | 100.0 | 142.0 | 0.7042 | 0.7042 | 84.0 | 84.0 | 118.0 | 0.7119 | 0.7119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.098 | 5.0 | 10 | 2.4705 | 0.0069 | 2031.5785 | 1408.1829 | 364.0 | 570.0 | 0.6386 | 363.0 | 0.6368 | 124.0 | 125.0 | 158.0 | 0.7911 | 0.7848 | 83.0 | 83.0 | 152.0 | 0.5461 | 0.5461 | 77.0 | 77.0 | 142.0 | 0.5423 | 0.5423 | 79.0 | 79.0 | 118.0 | 0.6695 | 0.6695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0699 | 6.0 | 12 | 3.2267 | 0.0069 | 2653.4710 | 1839.2459 | 375.0 | 570.0 | 0.6579 | 372.0 | 0.6526 | 114.0 | 117.0 | 158.0 | 0.7405 | 0.7215 | 103.0 | 103.0 | 152.0 | 0.6776 | 0.6776 | 78.0 | 78.0 | 142.0 | 0.5493 | 0.5493 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 14 | 3.6297 | 0.0069 | 2984.8081 | 2068.9113 | 381.0 | 570.0 | 0.6684 | 368.0 | 0.6456 | 108.0 | 117.0 | 158.0 | 0.7405 | 0.6835 | 116.0 | 116.0 | 152.0 | 0.7632 | 0.7632 | 67.0 | 70.0 | 142.0 | 0.4930 | 0.4718 | 77.0 | 78.0 | 118.0 | 0.6610 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 16 | 3.8663 | 0.0069 | 3179.3653 | 2203.7681 | 374.0 | 570.0 | 0.6561 | 341.0 | 0.5982 | 85.0 | 110.0 | 158.0 | 0.6962 | 0.5380 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 64.0 | 70.0 | 142.0 | 0.4930 | 0.4507 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 18 | 4.0266 | 0.0069 | 3311.2517 | 2295.1848 | 368.0 | 570.0 | 0.6456 | 327.0 | 0.5737 | 73.0 | 105.0 | 158.0 | 0.6646 | 0.4620 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 61.0 | 68.0 | 142.0 | 0.4789 | 0.4296 | 76.0 | 77.0 | 118.0 | 0.6525 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 20 | 4.1711 | 0.0069 | 3430.0548 | 2377.5328 | 368.0 | 570.0 | 0.6456 | 326.0 | 0.5719 | 72.0 | 103.0 | 158.0 | 0.6519 | 0.4557 | 117.0 | 119.0 | 152.0 | 0.7829 | 0.7697 | 61.0 | 69.0 | 142.0 | 0.4859 | 0.4296 | 76.0 | 77.0 | 118.0 | 0.6525 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 22 | 4.2348 | 0.0069 | 3482.4145 | 2413.8258 | 367.0 | 570.0 | 0.6439 | 327.0 | 0.5737 | 71.0 | 101.0 | 158.0 | 0.6392 | 0.4494 | 118.0 | 119.0 | 152.0 | 0.7829 | 0.7763 | 63.0 | 71.0 | 142.0 | 0.5 | 0.4437 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 24 | 4.3235 | 0.0069 | 3555.3314 | 2464.3680 | 366.0 | 570.0 | 0.6421 | 326.0 | 0.5719 | 73.0 | 104.0 | 158.0 | 0.6582 | 0.4620 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 61.0 | 68.0 | 142.0 | 0.4789 | 0.4296 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 26 | 4.3461 | 0.0069 | 3573.9396 | 2477.2662 | 370.0 | 570.0 | 0.6491 | 330.0 | 0.5789 | 74.0 | 105.0 | 158.0 | 0.6646 | 0.4684 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 28 | 4.3956 | 0.0069 | 3614.6647 | 2505.4946 | 372.0 | 570.0 | 0.6526 | 331.0 | 0.5807 | 74.0 | 105.0 | 158.0 | 0.6646 | 0.4684 | 118.0 | 119.0 | 152.0 | 0.7829 | 0.7763 | 63.0 | 71.0 | 142.0 | 0.5 | 0.4437 | 76.0 | 77.0 | 118.0 | 0.6525 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 30 | 4.4516 | 0.0069 | 3660.7412 | 2537.4324 | 366.0 | 570.0 | 0.6421 | 327.0 | 0.5737 | 72.0 | 102.0 | 158.0 | 0.6456 | 0.4557 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 32 | 4.4475 | 0.0069 | 3657.3704 | 2535.0960 | 369.0 | 570.0 | 0.6474 | 330.0 | 0.5789 | 74.0 | 104.0 | 158.0 | 0.6582 | 0.4684 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 34 | 4.4649 | 0.0069 | 3671.6638 | 2545.0034 | 369.0 | 570.0 | 0.6474 | 328.0 | 0.5754 | 73.0 | 104.0 | 158.0 | 0.6582 | 0.4620 | 116.0 | 118.0 | 152.0 | 0.7763 | 0.7632 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 36 | 4.4897 | 0.0069 | 3692.0343 | 2559.1232 | 367.0 | 570.0 | 0.6439 | 328.0 | 0.5754 | 73.0 | 103.0 | 158.0 | 0.6519 | 0.4620 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 63.0 | 70.0 | 142.0 | 0.4930 | 0.4437 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 38 | 4.4832 | 0.0069 | 3686.6707 | 2555.4054 | 370.0 | 570.0 | 0.6491 | 331.0 | 0.5807 | 75.0 | 105.0 | 158.0 | 0.6646 | 0.4747 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 40 | 4.4917 | 0.0069 | 3693.7120 | 2560.2860 | 370.0 | 570.0 | 0.6491 | 329.0 | 0.5772 | 73.0 | 105.0 | 158.0 | 0.6646 | 0.4620 | 118.0 | 119.0 | 152.0 | 0.7829 | 0.7763 | 63.0 | 70.0 | 142.0 | 0.4930 | 0.4437 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 42 | 4.4948 | 0.0069 | 3696.2223 | 2562.0261 | 370.0 | 570.0 | 0.6491 | 330.0 | 0.5789 | 73.0 | 104.0 | 158.0 | 0.6582 | 0.4620 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 76.0 | 77.0 | 118.0 | 0.6525 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 44 | 4.5041 | 0.0069 | 3703.9242 | 2567.3646 | 371.0 | 570.0 | 0.6509 | 332.0 | 0.5825 | 75.0 | 105.0 | 158.0 | 0.6646 | 0.4747 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 65.0 | 72.0 | 142.0 | 0.5070 | 0.4577 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 46 | 4.5023 | 0.0069 | 3702.4334 | 2566.3313 | 368.0 | 570.0 | 0.6456 | 331.0 | 0.5807 | 76.0 | 104.0 | 158.0 | 0.6582 | 0.4810 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 63.0 | 70.0 | 142.0 | 0.4930 | 0.4437 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 48 | 4.5165 | 0.0069 | 3714.0749 | 2574.4005 | 368.0 | 570.0 | 0.6456 | 331.0 | 0.5807 | 75.0 | 104.0 | 158.0 | 0.6582 | 0.4747 | 118.0 | 118.0 | 152.0 | 0.7763 | 0.7763 | 63.0 | 70.0 | 142.0 | 0.4930 | 0.4437 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 50 | 4.5122 | 0.0069 | 3710.5434 | 2571.9527 | 369.0 | 570.0 | 0.6474 | 329.0 | 0.5772 | 74.0 | 104.0 | 158.0 | 0.6582 | 0.4684 | 116.0 | 118.0 | 152.0 | 0.7763 | 0.7632 | 65.0 | 72.0 | 142.0 | 0.5070 | 0.4577 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 52 | 4.5475 | 0.0069 | 3739.5911 | 2592.0870 | 368.0 | 570.0 | 0.6456 | 329.0 | 0.5772 | 73.0 | 103.0 | 158.0 | 0.6519 | 0.4620 | 118.0 | 119.0 | 152.0 | 0.7829 | 0.7763 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 54 | 4.5286 | 0.0069 | 3724.0209 | 2581.2946 | 368.0 | 570.0 | 0.6456 | 329.0 | 0.5772 | 74.0 | 104.0 | 158.0 | 0.6582 | 0.4684 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 63.0 | 70.0 | 142.0 | 0.4930 | 0.4437 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 56 | 4.5322 | 0.0069 | 3726.9841 | 2583.3485 | 368.0 | 570.0 | 0.6456 | 331.0 | 0.5807 | 75.0 | 103.0 | 158.0 | 0.6519 | 0.4747 | 116.0 | 117.0 | 152.0 | 0.7697 | 0.7632 | 66.0 | 73.0 | 142.0 | 0.5141 | 0.4648 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 58 | 4.5429 | 0.0069 | 3735.7961 | 2589.4565 | 368.0 | 570.0 | 0.6456 | 329.0 | 0.5772 | 74.0 | 104.0 | 158.0 | 0.6582 | 0.4684 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 63.0 | 70.0 | 142.0 | 0.4930 | 0.4437 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 60 | 4.5244 | 0.0069 | 3720.6018 | 2578.9247 | 367.0 | 570.0 | 0.6439 | 329.0 | 0.5772 | 74.0 | 103.0 | 158.0 | 0.6519 | 0.4684 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 63.0 | 70.0 | 142.0 | 0.4930 | 0.4437 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 62 | 4.5184 | 0.0069 | 3715.6023 | 2575.4593 | 369.0 | 570.0 | 0.6474 | 330.0 | 0.5789 | 74.0 | 104.0 | 158.0 | 0.6582 | 0.4684 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 64 | 4.5386 | 0.0069 | 3732.2854 | 2587.0231 | 370.0 | 570.0 | 0.6491 | 332.0 | 0.5825 | 75.0 | 105.0 | 158.0 | 0.6646 | 0.4747 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 65.0 | 71.0 | 142.0 | 0.5 | 0.4577 | 75.0 | 76.0 | 118.0 | 0.6441 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 66 | 4.5417 | 0.0069 | 3734.7736 | 2588.7478 | 369.0 | 570.0 | 0.6474 | 330.0 | 0.5789 | 74.0 | 104.0 | 158.0 | 0.6582 | 0.4684 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 65.0 | 72.0 | 142.0 | 0.5070 | 0.4577 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 68 | 4.5473 | 0.0069 | 3739.4472 | 2591.9873 | 371.0 | 570.0 | 0.6509 | 330.0 | 0.5789 | 72.0 | 104.0 | 158.0 | 0.6582 | 0.4557 | 118.0 | 119.0 | 152.0 | 0.7829 | 0.7763 | 64.0 | 71.0 | 142.0 | 0.5 | 0.4507 | 76.0 | 77.0 | 118.0 | 0.6525 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 70 | 4.5106 | 0.0069 | 3709.2286 | 2571.0413 | 368.0 | 570.0 | 0.6456 | 330.0 | 0.5789 | 74.0 | 103.0 | 158.0 | 0.6519 | 0.4684 | 117.0 | 118.0 | 152.0 | 0.7763 | 0.7697 | 65.0 | 72.0 | 142.0 | 0.5070 | 0.4577 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 72 | 4.5517 | 0.0069 | 3743.0479 | 2594.4831 | 365.0 | 570.0 | 0.6404 | 327.0 | 0.5737 | 74.0 | 103.0 | 158.0 | 0.6519 | 0.4684 | 116.0 | 117.0 | 152.0 | 0.7697 | 0.7632 | 63.0 | 70.0 | 142.0 | 0.4930 | 0.4437 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 37.0 | 74 | 4.5277 | 0.0069 | 3723.2808 | 2580.7816 | 370.0 | 570.0 | 0.6491 | 331.0 | 0.5807 | 74.0 | 103.0 | 158.0 | 0.6519 | 0.4684 | 116.0 | 118.0 | 152.0 | 0.7763 | 0.7632 | 65.0 | 72.0 | 142.0 | 0.5070 | 0.4577 | 76.0 | 77.0 | 118.0 | 0.6525 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
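The overall accuracy can be reproduced from the per-label buckets (labels 32–35, which appear to be answer-option token ids; label 36 has no examples in this split). A quick consistency check against the final evaluation metrics reported above:

```python
# Per-label (correct, total) counts from the final evaluation above
per_label = {
    32: (117, 158),
    33: (116, 152),
    34: (70, 142),
    35: (78, 118),
}

correct = sum(c for c, _ in per_label.values())  # 381 correct preds
total = sum(t for _, t in per_label.values())    # 570 total preds
accuracy = correct / total                       # reported as 0.6684
```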
## Framework versions

- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1