ARC-Challenge_Llama-3.2-1B-kaj9v3a8

This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset (the model name suggests ARC-Challenge). It achieves the following results on the evaluation set:

  • Loss: 2.3781
  • Model Preparation Time: 0.0058
  • Mdl: 1025.8224
  • Accumulated Loss: 711.0459
  • Correct Preds: 94.0
  • Total Preds: 299.0
  • Accuracy: 0.3144
  • Correct Gen Preds: 8.0
  • Gen Accuracy: 0.0268
  • Correct Gen Preds 32: 0.0
  • Correct Preds 32: 22.0
  • Total Labels 32: 64.0
  • Accuracy 32: 0.3438
  • Gen Accuracy 32: 0.0
  • Correct Gen Preds 33: 0.0
  • Correct Preds 33: 31.0
  • Total Labels 33: 73.0
  • Accuracy 33: 0.4247
  • Gen Accuracy 33: 0.0
  • Correct Gen Preds 34: 5.0
  • Correct Preds 34: 22.0
  • Total Labels 34: 78.0
  • Accuracy 34: 0.2821
  • Gen Accuracy 34: 0.0641
  • Correct Gen Preds 35: 3.0
  • Correct Preds 35: 19.0
  • Total Labels 35: 83.0
  • Accuracy 35: 0.2289
  • Gen Accuracy 35: 0.0361
  • Correct Gen Preds 36: 0.0
  • Correct Preds 36: 0.0
  • Total Labels 36: 1.0
  • Accuracy 36: 0.0
  • Gen Accuracy 36: 0.0
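
The derived rates follow directly from the raw counts, and the per-label totals partition the evaluation set (the numeric suffixes 32-36 appear to index per-answer-label breakdowns; this is an inference from the numbers, not documented). A quick consistency check in plain Python, using the values listed above:

```python
# Values copied from the evaluation summary above.
correct_preds, total_preds = 94, 299
correct_gen_preds = 8

# Headline rates are correct predictions divided by total predictions.
print(round(correct_preds / total_preds, 4))      # 0.3144 (Accuracy)
print(round(correct_gen_preds / total_preds, 4))  # 0.0268 (Gen Accuracy)

# The per-label totals (suffixes 32-36) partition the 299 evaluation examples,
assert 64 + 73 + 78 + 83 + 1 == total_preds
# and the per-label correct counts sum to the overall counts.
assert 22 + 31 + 22 + 19 + 0 == correct_preds
assert 0 + 0 + 5 + 3 + 0 == correct_gen_preds
```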

Model description

More information needed

Intended uses & limitations

More information needed
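
Absent documented usage guidance, here is a minimal sketch of loading the checkpoint with the transformers library. The repository id is assumed from this card's name; adjust it if the weights live elsewhere, and treat the prompt format as a guess, since the training prompt template is not documented.

```python
# Minimal sketch: load the fine-tuned checkpoint with transformers.
# The repo id is an assumption based on this model card's name.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "donoway/ARC-Challenge_Llama-3.2-1B-kaj9v3a8"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="bfloat16")

# Hypothetical prompt; the actual training template is unknown.
prompt = "Question: Which gas do plants absorb from the atmosphere?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the generation-based metrics above (Gen Accuracy 0.0268) are far below the ranking-based accuracy (0.3144), so scoring the answer choices by likelihood is likely to work much better with this checkpoint than free-form generation.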

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 112
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: constant
  • lr_scheduler_warmup_ratio: 0.001
  • num_epochs: 100
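
These settings map onto a transformers TrainingArguments object roughly as follows (a sketch; only the values listed above are known, and everything else, including the output directory, is hypothetical):

```python
from transformers import TrainingArguments

# Sketch reconstructed from the reported hyperparameters; all other
# arguments (output_dir included) are unknown and merely illustrative.
args = TrainingArguments(
    output_dir="ARC-Challenge_Llama-3.2-1B-kaj9v3a8",  # hypothetical
    learning_rate=2e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="constant",
    warmup_ratio=0.001,
    num_train_epochs=100,
)
```

One oddity worth flagging: with lr_scheduler_type="constant" the warmup ratio has no effect (warmup requires the "constant_with_warmup" schedule), so the 0.001 value may be vestigial.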

Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.6389 | 0.0058 | 706.9523 | 490.0220 | 66.0 | 299.0 | 0.2207 | 66.0 | 0.2207 | 62.0 | 62.0 | 64.0 | 0.9688 | 0.9688 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 4.0 | 4.0 | 78.0 | 0.0513 | 0.0513 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.6217 | 1.0 | 2 | 3.0846 | 0.0058 | 1330.6119 | 922.3099 | 73.0 | 299.0 | 0.2441 | 73.0 | 0.2441 | 0.0 | 0.0 | 64.0 | 0.0 | 0.0 | 73.0 | 73.0 | 73.0 | 1.0 | 1.0 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.8042 | 2.0 | 4 | 1.6582 | 0.0058 | 715.3105 | 495.8155 | 64.0 | 299.0 | 0.2140 | 64.0 | 0.2140 | 64.0 | 64.0 | 64.0 | 1.0 | 1.0 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.9831 | 3.0 | 6 | 1.9878 | 0.0058 | 857.4570 | 594.3439 | 62.0 | 299.0 | 0.2074 | 31.0 | 0.1037 | 31.0 | 58.0 | 64.0 | 0.9062 | 0.4844 | 0.0 | 2.0 | 73.0 | 0.0274 | 0.0 | 0.0 | 1.0 | 78.0 | 0.0128 | 0.0 | 0.0 | 1.0 | 83.0 | 0.0120 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.1227 | 4.0 | 8 | 1.5926 | 0.0058 | 687.0006 | 476.1926 | 66.0 | 299.0 | 0.2207 | 35.0 | 0.1171 | 33.0 | 59.0 | 64.0 | 0.9219 | 0.5156 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 2.0 | 7.0 | 78.0 | 0.0897 | 0.0256 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.5365 | 5.0 | 10 | 1.4332 | 0.0058 | 618.2170 | 428.5154 | 80.0 | 299.0 | 0.2676 | 78.0 | 0.2609 | 58.0 | 59.0 | 64.0 | 0.9219 | 0.9062 | 14.0 | 14.0 | 73.0 | 0.1918 | 0.1918 | 4.0 | 5.0 | 78.0 | 0.0641 | 0.0513 | 2.0 | 2.0 | 83.0 | 0.0241 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.3437 | 6.0 | 12 | 1.4601 | 0.0058 | 629.8535 | 436.5812 | 81.0 | 299.0 | 0.2709 | 27.0 | 0.0903 | 11.0 | 27.0 | 64.0 | 0.4219 | 0.1719 | 10.0 | 26.0 | 73.0 | 0.3562 | 0.1370 | 5.0 | 16.0 | 78.0 | 0.2051 | 0.0641 | 1.0 | 12.0 | 83.0 | 0.1446 | 0.0120 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.7483 | 7.0 | 14 | 1.7795 | 0.0058 | 767.6192 | 532.0731 | 76.0 | 299.0 | 0.2542 | 26.0 | 0.0870 | 20.0 | 53.0 | 64.0 | 0.8281 | 0.3125 | 2.0 | 11.0 | 73.0 | 0.1507 | 0.0274 | 3.0 | 7.0 | 78.0 | 0.0897 | 0.0385 | 1.0 | 5.0 | 83.0 | 0.0602 | 0.0120 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.4273 | 8.0 | 16 | 1.7187 | 0.0058 | 741.3764 | 513.8830 | 92.0 | 299.0 | 0.3077 | 12.0 | 0.0401 | 2.0 | 31.0 | 64.0 | 0.4844 | 0.0312 | 4.0 | 25.0 | 73.0 | 0.3425 | 0.0548 | 5.0 | 19.0 | 78.0 | 0.2436 | 0.0641 | 1.0 | 17.0 | 83.0 | 0.2048 | 0.0120 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0267 | 9.0 | 18 | 2.3781 | 0.0058 | 1025.8224 | 711.0459 | 94.0 | 299.0 | 0.3144 | 8.0 | 0.0268 | 0.0 | 22.0 | 64.0 | 0.3438 | 0.0 | 0.0 | 31.0 | 73.0 | 0.4247 | 0.0 | 5.0 | 22.0 | 78.0 | 0.2821 | 0.0641 | 3.0 | 19.0 | 83.0 | 0.2289 | 0.0361 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0121 | 10.0 | 20 | 2.6549 | 0.0058 | 1145.2173 | 793.8041 | 89.0 | 299.0 | 0.2977 | 11.0 | 0.0368 | 0.0 | 26.0 | 64.0 | 0.4062 | 0.0 | 3.0 | 31.0 | 73.0 | 0.4247 | 0.0411 | 4.0 | 17.0 | 78.0 | 0.2179 | 0.0513 | 4.0 | 15.0 | 83.0 | 0.1807 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0338 | 11.0 | 22 | 3.4477 | 0.0058 | 1487.2357 | 1030.8732 | 91.0 | 299.0 | 0.3043 | 10.0 | 0.0334 | 1.0 | 35.0 | 64.0 | 0.5469 | 0.0156 | 1.0 | 26.0 | 73.0 | 0.3562 | 0.0137 | 3.0 | 14.0 | 78.0 | 0.1795 | 0.0385 | 5.0 | 16.0 | 83.0 | 0.1928 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0002 | 12.0 | 24 | 3.9812 | 0.0058 | 1717.3427 | 1190.3712 | 87.0 | 299.0 | 0.2910 | 10.0 | 0.0334 | 1.0 | 30.0 | 64.0 | 0.4688 | 0.0156 | 1.0 | 26.0 | 73.0 | 0.3562 | 0.0137 | 2.0 | 14.0 | 78.0 | 0.1795 | 0.0256 | 6.0 | 17.0 | 83.0 | 0.2048 | 0.0723 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0001 | 13.0 | 26 | 4.3205 | 0.0058 | 1863.6984 | 1291.8173 | 86.0 | 299.0 | 0.2876 | 13.0 | 0.0435 | 2.0 | 30.0 | 64.0 | 0.4688 | 0.0312 | 3.0 | 26.0 | 73.0 | 0.3562 | 0.0411 | 3.0 | 16.0 | 78.0 | 0.2051 | 0.0385 | 5.0 | 14.0 | 83.0 | 0.1687 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 28 | 4.5366 | 0.0058 | 1956.9398 | 1356.4473 | 89.0 | 299.0 | 0.2977 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 3.0 | 26.0 | 73.0 | 0.3562 | 0.0411 | 2.0 | 16.0 | 78.0 | 0.2051 | 0.0256 | 5.0 | 17.0 | 83.0 | 0.2048 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 30 | 4.6811 | 0.0058 | 2019.2794 | 1399.6578 | 89.0 | 299.0 | 0.2977 | 14.0 | 0.0468 | 3.0 | 29.0 | 64.0 | 0.4531 | 0.0469 | 4.0 | 24.0 | 73.0 | 0.3288 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 32 | 4.7816 | 0.0058 | 2062.6183 | 1429.6980 | 88.0 | 299.0 | 0.2943 | 12.0 | 0.0401 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 3.0 | 22.0 | 73.0 | 0.3014 | 0.0411 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 34 | 4.8408 | 0.0058 | 2088.1473 | 1447.3934 | 89.0 | 299.0 | 0.2977 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 22.0 | 73.0 | 0.3014 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 36 | 4.8937 | 0.0058 | 2110.9718 | 1463.2141 | 87.0 | 299.0 | 0.2910 | 13.0 | 0.0435 | 3.0 | 29.0 | 64.0 | 0.4531 | 0.0469 | 4.0 | 22.0 | 73.0 | 0.3014 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 38 | 4.9153 | 0.0058 | 2120.2857 | 1469.6701 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 40 | 4.9474 | 0.0058 | 2134.1312 | 1479.2670 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 42 | 4.9635 | 0.0058 | 2141.0853 | 1484.0872 | 89.0 | 299.0 | 0.2977 | 13.0 | 0.0435 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 44 | 4.9713 | 0.0058 | 2144.4693 | 1486.4329 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 22.0 | 73.0 | 0.3014 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 46 | 4.9759 | 0.0058 | 2146.4259 | 1487.7890 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 48 | 4.9796 | 0.0058 | 2148.0201 | 1488.8940 | 87.0 | 299.0 | 0.2910 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 50 | 4.9919 | 0.0058 | 2153.3399 | 1492.5815 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 52 | 4.9893 | 0.0058 | 2152.2123 | 1491.7999 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 54 | 4.9926 | 0.0058 | 2153.6267 | 1492.7803 | 89.0 | 299.0 | 0.2977 | 14.0 | 0.0468 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 56 | 4.9908 | 0.0058 | 2152.8635 | 1492.2513 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 58 | 4.9906 | 0.0058 | 2152.7687 | 1492.1856 | 86.0 | 299.0 | 0.2876 | 13.0 | 0.0435 | 3.0 | 29.0 | 64.0 | 0.4531 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 60 | 4.9916 | 0.0058 | 2153.2147 | 1492.4947 | 89.0 | 299.0 | 0.2977 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 5.0 | 19.0 | 83.0 | 0.2289 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 62 | 4.9890 | 0.0058 | 2152.0878 | 1491.7136 | 89.0 | 299.0 | 0.2977 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 22.0 | 73.0 | 0.3014 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 64 | 4.9952 | 0.0058 | 2154.7667 | 1493.5705 | 87.0 | 299.0 | 0.2910 | 13.0 | 0.0435 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 66 | 4.9926 | 0.0058 | 2153.6256 | 1492.7795 | 89.0 | 299.0 | 0.2977 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 19.0 | 83.0 | 0.2289 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 68 | 4.9868 | 0.0058 | 2151.1314 | 1491.0507 | 87.0 | 299.0 | 0.2910 | 14.0 | 0.0468 | 3.0 | 29.0 | 64.0 | 0.4531 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 5.0 | 19.0 | 83.0 | 0.2289 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 70 | 4.9850 | 0.0058 | 2150.3622 | 1490.5175 | 87.0 | 299.0 | 0.2910 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 72 | 5.0066 | 0.0058 | 2159.6634 | 1496.9646 | 87.0 | 299.0 | 0.2910 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 37.0 | 74 | 5.0066 | 0.0058 | 2159.6814 | 1496.9771 | 87.0 | 299.0 | 0.2910 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 38.0 | 76 | 5.0029 | 0.0058 | 2158.1010 | 1495.8816 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 19.0 | 83.0 | 0.2289 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 39.0 | 78 | 4.9957 | 0.0058 | 2154.9806 | 1493.7187 | 89.0 | 299.0 | 0.2977 | 14.0 | 0.0468 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 5.0 | 19.0 | 83.0 | 0.2289 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 40.0 | 80 | 4.9985 | 0.0058 | 2156.1785 | 1494.5490 | 88.0 | 299.0 | 0.2943 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 22.0 | 73.0 | 0.3014 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 41.0 | 82 | 5.0046 | 0.0058 | 2158.8302 | 1496.3871 | 86.0 | 299.0 | 0.2876 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 42.0 | 84 | 5.0020 | 0.0058 | 2157.6753 | 1495.5866 | 89.0 | 299.0 | 0.2977 | 13.0 | 0.0435 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 43.0 | 86 | 5.0015 | 0.0058 | 2157.4668 | 1495.4420 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 19.0 | 83.0 | 0.2289 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 44.0 | 88 | 4.9949 | 0.0058 | 2154.6395 | 1493.4823 | 89.0 | 299.0 | 0.2977 | 14.0 | 0.0468 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 45.0 | 90 | 5.0054 | 0.0058 | 2159.1554 | 1496.6125 | 86.0 | 299.0 | 0.2876 | 14.0 | 0.0468 | 3.0 | 29.0 | 64.0 | 0.4531 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 46.0 | 92 | 5.0031 | 0.0058 | 2158.1526 | 1495.9174 | 86.0 | 299.0 | 0.2876 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 19.0 | 73.0 | 0.2603 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 47.0 | 94 | 5.0044 | 0.0058 | 2158.7442 | 1496.3275 | 88.0 | 299.0 | 0.2943 | 14.0 | 0.0468 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 48.0 | 96 | 5.0045 | 0.0058 | 2158.7689 | 1496.3445 | 86.0 | 299.0 | 0.2876 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 49.0 | 98 | 4.9983 | 0.0058 | 2156.0793 | 1494.4803 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 19.0 | 83.0 | 0.2289 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 50.0 | 100 | 5.0049 | 0.0058 | 2158.9576 | 1496.4754 | 86.0 | 299.0 | 0.2876 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 51.0 | 102 | 5.0059 | 0.0058 | 2159.3619 | 1496.7556 | 87.0 | 299.0 | 0.2910 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 52.0 | 104 | 5.0100 | 0.0058 | 2161.1238 | 1497.9769 | 85.0 | 299.0 | 0.2843 | 14.0 | 0.0468 | 3.0 | 29.0 | 64.0 | 0.4531 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 53.0 | 106 | 5.0030 | 0.0058 | 2158.1205 | 1495.8951 | 87.0 | 299.0 | 0.2910 | 14.0 | 0.0468 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 54.0 | 108 | 5.0083 | 0.0058 | 2160.4072 | 1497.4801 | 86.0 | 299.0 | 0.2876 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 19.0 | 73.0 | 0.2603 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 5.0 | 18.0 | 83.0 | 0.2169 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 55.0 | 110 | 4.9989 | 0.0058 | 2156.3668 | 1494.6796 | 88.0 | 299.0 | 0.2943 | 14.0 | 0.0468 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 5.0 | 19.0 | 83.0 | 0.2289 | 0.0602 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 56.0 | 112 | 4.9910 | 0.0058 | 2152.9416 | 1492.3054 | 88.0 | 299.0 | 0.2943 | 13.0 | 0.0435 | 3.0 | 31.0 | 64.0 | 0.4844 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 57.0 | 114 | 5.0090 | 0.0058 | 2160.7282 | 1497.7027 | 86.0 | 299.0 | 0.2876 | 13.0 | 0.0435 | 3.0 | 29.0 | 64.0 | 0.4531 | 0.0469 | 4.0 | 21.0 | 73.0 | 0.2877 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 58.0 | 116 | 4.9993 | 0.0058 | 2156.5316 | 1494.7938 | 87.0 | 299.0 | 0.2910 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 19.0 | 78.0 | 0.2436 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 59.0 | 118 | 5.0173 | 0.0058 | 2164.3103 | 1500.1856 | 86.0 | 299.0 | 0.2876 | 13.0 | 0.0435 | 3.0 | 30.0 | 64.0 | 0.4688 | 0.0469 | 4.0 | 20.0 | 73.0 | 0.2740 | 0.0548 | 2.0 | 18.0 | 78.0 | 0.2308 | 0.0256 | 4.0 | 18.0 | 83.0 | 0.2169 | 0.0482 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
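
Two observations about the table. First, logging stops at epoch 59 even though num_epochs is 100, so training appears to have ended early (the cause is not documented). Second, the loss columns look internally consistent: Accumulated Loss is approximately Validation Loss times Total Preds (total nats over the evaluation set), and Mdl is approximately Accumulated Loss divided by ln 2, i.e. the same total in bits. Checking against the epoch-9 row, whose values the summary at the top of this card reports:

```python
import math

# Epoch-9 checkpoint values, copied from the table above.
loss, total_preds = 2.3781, 299
mdl, accumulated_loss = 1025.8224, 711.0459

# Accumulated Loss ~= mean validation loss summed over all predictions (nats).
assert abs(loss * total_preds - accumulated_loss) < 0.05
# Mdl ~= the same total re-expressed in bits (nats / ln 2).
assert abs(accumulated_loss / math.log(2) - mdl) < 0.05
print("loss columns are mutually consistent")
```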

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1
Model weights

  • Downloads last month: 2
  • Format: Safetensors
  • Model size: 1B params
  • Tensor type: BF16

Model tree for donoway/ARC-Challenge_Llama-3.2-1B-kaj9v3a8

  • Finetuned from meta-llama/Llama-3.2-1B