ARC-Easy_Llama-3.2-1B-yy20ooxg

This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset (presumably ARC-Easy, given the model name). It achieves the following results on the evaluation set:

  • Loss: 3.0916
  • Model Preparation Time: 0.0057
  • Mdl: 2542.3237
  • Accumulated Loss: 1762.2045
  • Correct Preds: 378.0
  • Total Preds: 570.0
  • Accuracy: 0.6632
  • Correct Gen Preds: 378.0
  • Gen Accuracy: 0.6632
  • Correct Gen Preds 32: 131.0
  • Correct Preds 32: 131.0
  • Total Labels 32: 158.0
  • Accuracy 32: 0.8291
  • Gen Accuracy 32: 0.8291
  • Correct Gen Preds 33: 96.0
  • Correct Preds 33: 96.0
  • Total Labels 33: 152.0
  • Accuracy 33: 0.6316
  • Gen Accuracy 33: 0.6316
  • Correct Gen Preds 34: 85.0
  • Correct Preds 34: 85.0
  • Total Labels 34: 142.0
  • Accuracy 34: 0.5986
  • Gen Accuracy 34: 0.5986
  • Correct Gen Preds 35: 66.0
  • Correct Preds 35: 66.0
  • Total Labels 35: 118.0
  • Accuracy 35: 0.5593
  • Gen Accuracy 35: 0.5593
  • Correct Gen Preds 36: 0.0
  • Correct Preds 36: 0.0
  • Total Labels 36: 0.0
  • Accuracy 36: 0.0
  • Gen Accuracy 36: 0.0
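The headline numbers above are internally consistent, which makes their definitions easy to infer: accuracy is correct predictions over total predictions, the accumulated loss is the mean evaluation loss (in nats) times the number of predictions, and Mdl is that total converted to bits. A minimal check in Python (the metric definitions are inferred from the reported values, not from a published evaluation script):

```python
import math

# Reported evaluation results from the card above.
eval_loss = 3.0916        # mean loss per example, in nats
correct, total = 378, 570

accuracy = correct / total                 # 378 / 570
accumulated_loss = eval_loss * total       # total loss in nats
mdl_bits = accumulated_loss / math.log(2)  # nats -> bits (MDL-style)

print(round(accuracy, 4), round(accumulated_loss, 1), round(mdl_bits, 1))
```

The three derived values land within rounding error of the reported 0.6632, 1762.2045, and 2542.3237.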

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 112
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: constant
  • lr_scheduler_warmup_ratio: 0.001
  • num_epochs: 100
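For reference, a single AdamW update with the learning rate, betas, and epsilon listed above can be sketched in plain Python. This is a simplified scalar version of the update rule that `adamw_torch` implements; the parameter value and gradient below are made up for illustration, and the weight decay defaults to 0.0 because the card does not report one:

```python
import math

def adamw_step(param, grad, m, v, step,
               lr=2e-05, betas=(0.9, 0.999), eps=1e-08, weight_decay=0.0):
    """One scalar AdamW update: decoupled weight decay, then a
    bias-corrected Adam step on the gradient moments."""
    b1, b2 = betas
    param -= lr * weight_decay * param       # decoupled weight decay (here 0.0)
    m = b1 * m + (1 - b1) * grad             # first-moment EMA
    v = b2 * v + (1 - b2) * grad * grad      # second-moment EMA
    m_hat = m / (1 - b1 ** step)             # bias corrections
    v_hat = v / (1 - b2 ** step)
    param -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# Hypothetical parameter and gradient, just to show the step-size scale:
p, m, v = 0.5, 0.0, 0.0
p, m, v = adamw_step(p, grad=0.1, m=m, v=v, step=1)
```

On the first step the bias corrections cancel the gradient's magnitude, so the parameter moves by almost exactly one learning rate (2e-05), regardless of the gradient scale.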

Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0057 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.435 | 1.0 | 1 | 2.1188 | 0.0057 | 1742.3509 | 1207.7056 | 183.0 | 570.0 | 0.3211 | 183.0 | 0.3211 | 0.0 | 0.0 | 158.0 | 0.0 | 0.0 | 151.0 | 151.0 | 152.0 | 0.9934 | 0.9934 | 32.0 | 32.0 | 142.0 | 0.2254 | 0.2254 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.005 | 2.0 | 2 | 1.3228 | 0.0057 | 1087.7696 | 753.9844 | 166.0 | 570.0 | 0.2912 | 166.0 | 0.2912 | 157.0 | 157.0 | 158.0 | 0.9937 | 0.9937 | 7.0 | 7.0 | 152.0 | 0.0461 | 0.0461 | 2.0 | 2.0 | 142.0 | 0.0141 | 0.0141 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9558 | 3.0 | 3 | 2.0441 | 0.0057 | 1680.9188 | 1165.1241 | 219.0 | 570.0 | 0.3842 | 219.0 | 0.3842 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 3.0 | 3.0 | 152.0 | 0.0197 | 0.0197 | 44.0 | 44.0 | 142.0 | 0.3099 | 0.3099 | 18.0 | 18.0 | 118.0 | 0.1525 | 0.1525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4507 | 4.0 | 4 | 1.4535 | 0.0057 | 1195.2711 | 828.4988 | 345.0 | 570.0 | 0.6053 | 344.0 | 0.6035 | 132.0 | 133.0 | 158.0 | 0.8418 | 0.8354 | 77.0 | 77.0 | 152.0 | 0.5066 | 0.5066 | 86.0 | 86.0 | 142.0 | 0.6056 | 0.6056 | 49.0 | 49.0 | 118.0 | 0.4153 | 0.4153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0879 | 5.0 | 5 | 1.6615 | 0.0057 | 1366.3466 | 947.0793 | 377.0 | 570.0 | 0.6614 | 374.0 | 0.6561 | 115.0 | 117.0 | 158.0 | 0.7405 | 0.7278 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 92.0 | 93.0 | 142.0 | 0.6549 | 0.6479 | 57.0 | 57.0 | 118.0 | 0.4831 | 0.4831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0036 | 6.0 | 6 | 3.0916 | 0.0057 | 2542.3237 | 1762.2045 | 378.0 | 570.0 | 0.6632 | 378.0 | 0.6632 | 131.0 | 131.0 | 158.0 | 0.8291 | 0.8291 | 96.0 | 96.0 | 152.0 | 0.6316 | 0.6316 | 85.0 | 85.0 | 142.0 | 0.5986 | 0.5986 | 66.0 | 66.0 | 118.0 | 0.5593 | 0.5593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 7 | 4.4610 | 0.0057 | 3668.4395 | 2542.7685 | 369.0 | 570.0 | 0.6474 | 369.0 | 0.6474 | 136.0 | 136.0 | 158.0 | 0.8608 | 0.8608 | 83.0 | 83.0 | 152.0 | 0.5461 | 0.5461 | 85.0 | 85.0 | 142.0 | 0.5986 | 0.5986 | 65.0 | 65.0 | 118.0 | 0.5508 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 8 | 5.2835 | 0.0057 | 4344.8373 | 3011.6117 | 364.0 | 570.0 | 0.6386 | 364.0 | 0.6386 | 137.0 | 137.0 | 158.0 | 0.8671 | 0.8671 | 81.0 | 81.0 | 152.0 | 0.5329 | 0.5329 | 81.0 | 81.0 | 142.0 | 0.5704 | 0.5704 | 65.0 | 65.0 | 118.0 | 0.5508 | 0.5508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 9 | 5.7743 | 0.0057 | 4748.4422 | 3291.3693 | 360.0 | 570.0 | 0.6316 | 360.0 | 0.6316 | 138.0 | 138.0 | 158.0 | 0.8734 | 0.8734 | 79.0 | 79.0 | 152.0 | 0.5197 | 0.5197 | 80.0 | 80.0 | 142.0 | 0.5634 | 0.5634 | 63.0 | 63.0 | 118.0 | 0.5339 | 0.5339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 10 | 6.0608 | 0.0057 | 4984.0139 | 3454.6552 | 355.0 | 570.0 | 0.6228 | 355.0 | 0.6228 | 139.0 | 139.0 | 158.0 | 0.8797 | 0.8797 | 76.0 | 76.0 | 152.0 | 0.5 | 0.5 | 79.0 | 79.0 | 142.0 | 0.5563 | 0.5563 | 61.0 | 61.0 | 118.0 | 0.5169 | 0.5169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 11 | 6.2689 | 0.0057 | 5155.1789 | 3573.2977 | 353.0 | 570.0 | 0.6193 | 353.0 | 0.6193 | 137.0 | 137.0 | 158.0 | 0.8671 | 0.8671 | 77.0 | 77.0 | 152.0 | 0.5066 | 0.5066 | 79.0 | 79.0 | 142.0 | 0.5563 | 0.5563 | 60.0 | 60.0 | 118.0 | 0.5085 | 0.5085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 12 | 6.3825 | 0.0057 | 5248.5200 | 3637.9969 | 352.0 | 570.0 | 0.6175 | 352.0 | 0.6175 | 138.0 | 138.0 | 158.0 | 0.8734 | 0.8734 | 75.0 | 75.0 | 152.0 | 0.4934 | 0.4934 | 78.0 | 78.0 | 142.0 | 0.5493 | 0.5493 | 61.0 | 61.0 | 118.0 | 0.5169 | 0.5169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 13 | 6.4953 | 0.0057 | 5341.3420 | 3702.3361 | 349.0 | 570.0 | 0.6123 | 349.0 | 0.6123 | 139.0 | 139.0 | 158.0 | 0.8797 | 0.8797 | 74.0 | 74.0 | 152.0 | 0.4868 | 0.4868 | 78.0 | 78.0 | 142.0 | 0.5493 | 0.5493 | 58.0 | 58.0 | 118.0 | 0.4915 | 0.4915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 14 | 6.5606 | 0.0057 | 5395.0053 | 3739.5327 | 352.0 | 570.0 | 0.6175 | 352.0 | 0.6175 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 75.0 | 75.0 | 152.0 | 0.4934 | 0.4934 | 76.0 | 76.0 | 142.0 | 0.5352 | 0.5352 | 60.0 | 60.0 | 118.0 | 0.5085 | 0.5085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 15 | 6.6424 | 0.0057 | 5462.2665 | 3786.1546 | 345.0 | 570.0 | 0.6053 | 345.0 | 0.6053 | 139.0 | 139.0 | 158.0 | 0.8797 | 0.8797 | 73.0 | 73.0 | 152.0 | 0.4803 | 0.4803 | 75.0 | 75.0 | 142.0 | 0.5282 | 0.5282 | 58.0 | 58.0 | 118.0 | 0.4915 | 0.4915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 16 | 6.7109 | 0.0057 | 5518.6330 | 3825.2249 | 344.0 | 570.0 | 0.6035 | 344.0 | 0.6035 | 140.0 | 140.0 | 158.0 | 0.8861 | 0.8861 | 71.0 | 71.0 | 152.0 | 0.4671 | 0.4671 | 75.0 | 75.0 | 142.0 | 0.5282 | 0.5282 | 58.0 | 58.0 | 118.0 | 0.4915 | 0.4915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 17 | 6.7380 | 0.0057 | 5540.9314 | 3840.6810 | 343.0 | 570.0 | 0.6018 | 343.0 | 0.6018 | 140.0 | 140.0 | 158.0 | 0.8861 | 0.8861 | 70.0 | 70.0 | 152.0 | 0.4605 | 0.4605 | 75.0 | 75.0 | 142.0 | 0.5282 | 0.5282 | 58.0 | 58.0 | 118.0 | 0.4915 | 0.4915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 18 | 6.7358 | 0.0057 | 5539.1150 | 3839.4220 | 341.0 | 570.0 | 0.5982 | 341.0 | 0.5982 | 140.0 | 140.0 | 158.0 | 0.8861 | 0.8861 | 70.0 | 70.0 | 152.0 | 0.4605 | 0.4605 | 73.0 | 73.0 | 142.0 | 0.5141 | 0.5141 | 58.0 | 58.0 | 118.0 | 0.4915 | 0.4915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 19 | 6.7699 | 0.0057 | 5567.1183 | 3858.8323 | 342.0 | 570.0 | 0.6 | 342.0 | 0.6 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 70.0 | 70.0 | 152.0 | 0.4605 | 0.4605 | 73.0 | 73.0 | 142.0 | 0.5141 | 0.5141 | 58.0 | 58.0 | 118.0 | 0.4915 | 0.4915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 20 | 6.8234 | 0.0057 | 5611.1323 | 3889.3405 | 336.0 | 570.0 | 0.5895 | 336.0 | 0.5895 | 140.0 | 140.0 | 158.0 | 0.8861 | 0.8861 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 73.0 | 73.0 | 142.0 | 0.5141 | 0.5141 | 56.0 | 56.0 | 118.0 | 0.4746 | 0.4746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 21 | 6.7975 | 0.0057 | 5589.8180 | 3874.5666 | 338.0 | 570.0 | 0.5930 | 338.0 | 0.5930 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 73.0 | 73.0 | 142.0 | 0.5141 | 0.5141 | 56.0 | 56.0 | 118.0 | 0.4746 | 0.4746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 22 | 6.8647 | 0.0057 | 5645.0579 | 3912.8560 | 339.0 | 570.0 | 0.5947 | 339.0 | 0.5947 | 140.0 | 140.0 | 158.0 | 0.8861 | 0.8861 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 74.0 | 74.0 | 142.0 | 0.5211 | 0.5211 | 57.0 | 57.0 | 118.0 | 0.4831 | 0.4831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 23 | 6.8841 | 0.0057 | 5661.0481 | 3923.9395 | 332.0 | 570.0 | 0.5825 | 332.0 | 0.5825 | 140.0 | 140.0 | 158.0 | 0.8861 | 0.8861 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 24 | 6.8803 | 0.0057 | 5657.8912 | 3921.7514 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 56.0 | 56.0 | 118.0 | 0.4746 | 0.4746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 25 | 6.9288 | 0.0057 | 5697.7619 | 3949.3876 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 71.0 | 71.0 | 142.0 | 0.5 | 0.5 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 26 | 6.9217 | 0.0057 | 5691.9978 | 3945.3922 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 27 | 6.8740 | 0.0057 | 5652.7017 | 3918.1542 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 69.0 | 69.0 | 152.0 | 0.4539 | 0.4539 | 68.0 | 68.0 | 142.0 | 0.4789 | 0.4789 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 28 | 6.9428 | 0.0057 | 5709.3209 | 3957.3997 | 335.0 | 570.0 | 0.5877 | 335.0 | 0.5877 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 71.0 | 71.0 | 142.0 | 0.5 | 0.5 | 54.0 | 54.0 | 118.0 | 0.4576 | 0.4576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 29 | 6.9115 | 0.0057 | 5683.5462 | 3939.5340 | 332.0 | 570.0 | 0.5825 | 332.0 | 0.5825 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 30 | 6.9227 | 0.0057 | 5692.8032 | 3945.9505 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 56.0 | 56.0 | 118.0 | 0.4746 | 0.4746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 31 | 6.9407 | 0.0057 | 5707.5592 | 3956.1786 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 56.0 | 56.0 | 118.0 | 0.4746 | 0.4746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 32 | 6.9668 | 0.0057 | 5729.0557 | 3971.0788 | 332.0 | 570.0 | 0.5825 | 332.0 | 0.5825 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 33 | 6.9522 | 0.0057 | 5717.0407 | 3962.7506 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 34 | 6.9264 | 0.0057 | 5695.7994 | 3948.0273 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 35 | 6.9386 | 0.0057 | 5705.8397 | 3954.9867 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 36 | 6.9615 | 0.0057 | 5724.7186 | 3968.0726 | 332.0 | 570.0 | 0.5825 | 332.0 | 0.5825 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 54.0 | 54.0 | 118.0 | 0.4576 | 0.4576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 37.0 | 37 | 6.9819 | 0.0057 | 5741.4335 | 3979.6584 | 332.0 | 570.0 | 0.5825 | 332.0 | 0.5825 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 68.0 | 68.0 | 142.0 | 0.4789 | 0.4789 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 38.0 | 38 | 6.9427 | 0.0057 | 5709.2265 | 3957.3343 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 143.0 | 143.0 | 158.0 | 0.9051 | 0.9051 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 54.0 | 54.0 | 118.0 | 0.4576 | 0.4576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 39.0 | 39 | 6.8740 | 0.0057 | 5652.7332 | 3918.1761 | 331.0 | 570.0 | 0.5807 | 331.0 | 0.5807 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 68.0 | 68.0 | 142.0 | 0.4789 | 0.4789 | 54.0 | 54.0 | 118.0 | 0.4576 | 0.4576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 40.0 | 40 | 6.9448 | 0.0057 | 5710.9396 | 3958.5217 | 332.0 | 570.0 | 0.5825 | 332.0 | 0.5825 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 68.0 | 68.0 | 142.0 | 0.4789 | 0.4789 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 41.0 | 41 | 6.9466 | 0.0057 | 5712.4185 | 3959.5468 | 330.0 | 570.0 | 0.5789 | 330.0 | 0.5789 | 140.0 | 140.0 | 158.0 | 0.8861 | 0.8861 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 68.0 | 68.0 | 142.0 | 0.4789 | 0.4789 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 42.0 | 42 | 6.9234 | 0.0057 | 5693.3344 | 3946.3187 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 56.0 | 56.0 | 118.0 | 0.4746 | 0.4746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 43.0 | 43 | 6.9088 | 0.0057 | 5681.3730 | 3938.0277 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 44.0 | 44 | 6.9220 | 0.0057 | 5692.1990 | 3945.5317 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 45.0 | 45 | 6.9472 | 0.0057 | 5712.9266 | 3959.8990 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 46.0 | 46 | 6.9550 | 0.0057 | 5719.3551 | 3964.3548 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 143.0 | 143.0 | 158.0 | 0.9051 | 0.9051 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 47.0 | 47 | 6.9251 | 0.0057 | 5694.7518 | 3947.3012 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 48.0 | 48 | 6.9656 | 0.0057 | 5728.0317 | 3970.3690 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 68.0 | 68.0 | 142.0 | 0.4789 | 0.4789 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 49.0 | 49 | 6.9987 | 0.0057 | 5755.2488 | 3989.2345 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 50.0 | 50 | 6.9336 | 0.0057 | 5701.7125 | 3952.1260 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 51.0 | 51 | 6.9579 | 0.0057 | 5721.7273 | 3965.9991 | 331.0 | 570.0 | 0.5807 | 331.0 | 0.5807 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 68.0 | 68.0 | 142.0 | 0.4789 | 0.4789 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 52.0 | 52 | 6.9156 | 0.0057 | 5686.9695 | 3941.9068 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 54.0 | 54.0 | 118.0 | 0.4576 | 0.4576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 53.0 | 53 | 6.9638 | 0.0057 | 5726.5963 | 3969.3741 | 332.0 | 570.0 | 0.5825 | 332.0 | 0.5825 | 141.0 | 141.0 | 158.0 | 0.8924 | 0.8924 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 69.0 | 69.0 | 142.0 | 0.4859 | 0.4859 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 54.0 | 54 | 6.9732 | 0.0057 | 5734.3381 | 3974.7403 | 334.0 | 570.0 | 0.5860 | 334.0 | 0.5860 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 67.0 | 67.0 | 152.0 | 0.4408 | 0.4408 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 55.0 | 55 | 6.9059 | 0.0057 | 5678.9905 | 3936.3762 | 335.0 | 570.0 | 0.5877 | 335.0 | 0.5877 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 70.0 | 70.0 | 142.0 | 0.4930 | 0.4930 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 56.0 | 56 | 6.9887 | 0.0057 | 5747.0420 | 3983.5459 | 333.0 | 570.0 | 0.5842 | 333.0 | 0.5842 | 142.0 | 142.0 | 158.0 | 0.8987 | 0.8987 | 68.0 | 68.0 | 152.0 | 0.4474 | 0.4474 | 68.0 | 68.0 | 142.0 | 0.4789 | 0.4789 | 55.0 | 55.0 | 118.0 | 0.4661 | 0.4661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
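Note that the final checkpoint is not the strongest one: validation loss keeps climbing after epoch 6 while accuracy peaks there and then decays. Selecting the best epoch from the (epoch, accuracy) pairs in the table is a one-liner (the first eleven epochs are transcribed below; every later epoch scores lower):

```python
# (epoch, accuracy) pairs transcribed from the training-results table above.
history = [
    (0, 0.3018), (1, 0.3211), (2, 0.2912), (3, 0.3842), (4, 0.6053),
    (5, 0.6614), (6, 0.6632), (7, 0.6474), (8, 0.6386), (9, 0.6316),
    (10, 0.6228),
]

# Pick the epoch with the highest overall accuracy.
best_epoch, best_acc = max(history, key=lambda pair: pair[1])
print(best_epoch, best_acc)  # epoch 6 matches the headline 0.6632 accuracy
```

With `load_best_model_at_end` and an accuracy-based `metric_for_best_model` (standard Trainer options, not confirmed by this card), the epoch-6 checkpoint is the one you would want to keep.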

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1
  • Model size: 1B params (Safetensors)
  • Tensor type: BF16