ARC-Challenge_Llama-3.2-1B-r0yf05qb

This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset (the repository name suggests ARC-Challenge). It achieves the following results on the evaluation set, with a consistency check sketched after the list. Metrics suffixed 32–36 are per-label breakdowns; the suffixes appear to be the token IDs of the answer letters A–E, which would match ARC's four (occasionally five) answer choices:

  • Loss: 4.9607
  • Model Preparation Time: 0.0059
  • Mdl: 2139.8614
  • Accumulated Loss: 1483.2389
  • Correct Preds: 77.0
  • Total Preds: 299.0
  • Accuracy: 0.2575
  • Correct Gen Preds: 15.0
  • Gen Accuracy: 0.0502
  • Correct Gen Preds 32: 0.0
  • Correct Preds 32: 2.0
  • Total Labels 32: 64.0
  • Accuracy 32: 0.0312
  • Gen Accuracy 32: 0.0
  • Correct Gen Preds 33: 11.0
  • Correct Preds 33: 60.0
  • Total Labels 33: 73.0
  • Accuracy 33: 0.8219
  • Gen Accuracy 33: 0.1507
  • Correct Gen Preds 34: 2.0
  • Correct Preds 34: 10.0
  • Total Labels 34: 78.0
  • Accuracy 34: 0.1282
  • Gen Accuracy 34: 0.0256
  • Correct Gen Preds 35: 2.0
  • Correct Preds 35: 5.0
  • Total Labels 35: 83.0
  • Accuracy 35: 0.0602
  • Gen Accuracy 35: 0.0241
  • Correct Gen Preds 36: 0.0
  • Correct Preds 36: 0.0
  • Total Labels 36: 1.0
  • Accuracy 36: 0.0
  • Gen Accuracy 36: 0.0
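
The aggregate numbers above are internally consistent: Accuracy is Correct Preds over Total Preds, Accumulated Loss is approximately the mean loss times the number of predictions, and Mdl is the accumulated loss converted from nats to bits. A minimal sketch checking these relations; the formulas are inferred from the reported values, not from published evaluation code:

```python
import math

loss = 4.9607                  # reported mean eval loss (nats/example, rounded)
accumulated_loss = 1483.2389   # reported accumulated loss (nats)
correct_preds, correct_gen_preds, total_preds = 77.0, 15.0, 299.0

print(correct_preds / total_preds)      # 0.2575 -> Accuracy
print(correct_gen_preds / total_preds)  # 0.0502 -> Gen Accuracy
print(loss * total_preds)               # ~1483.25 -> Accumulated Loss
                                        #   (small gap from rounding of `loss`)
print(accumulated_loss / math.log(2))   # 2139.86 -> Mdl (nats converted to bits)
```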

Model description

More information needed

Intended uses & limitations

More information needed
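
Absent author-provided guidance, the checkpoint should load like any Llama-family causal LM via Transformers. A minimal sketch: the repo id is taken from this card, while the multiple-choice prompt format is a guess, since the training template is not documented:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "donoway/ARC-Challenge_Llama-3.2-1B-r0yf05qb"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Hypothetical ARC-style prompt; the actual evaluation template is unknown.
prompt = (
    "Question: Which property of a mineral can be determined just by looking at it?\n"
    "A. luster\nB. mass\nC. weight\nD. hardness\n"
    "Answer:"
)
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=1)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:]))
```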

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (restated as a TrainingArguments sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 112
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.01
  • num_epochs: 100 (the training log below ends at epoch 61, suggesting training was stopped early)
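
As referenced above, here is the configuration restated as Transformers code. A sketch only: the listed values are copied from this card, while output_dir and the evaluation cadence are assumptions.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ARC-Challenge_Llama-3.2-1B-r0yf05qb",  # assumed, not stated on the card
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
    eval_strategy="epoch",  # assumed: the results table logs one eval per epoch
    bf16=True,              # the uploaded weights are stored in BF16
)
```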

Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.6389 | 0.0059 | 706.9523 | 490.0220 | 66.0 | 299.0 | 0.2207 | 66.0 | 0.2207 | 62.0 | 62.0 | 64.0 | 0.9688 | 0.9688 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 4.0 | 4.0 | 78.0 | 0.0513 | 0.0513 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.7061 | 1.0 | 1 | 1.6389 | 0.0059 | 706.9523 | 490.0220 | 66.0 | 299.0 | 0.2207 | 66.0 | 0.2207 | 62.0 | 62.0 | 64.0 | 0.9688 | 0.9688 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 4.0 | 4.0 | 78.0 | 0.0513 | 0.0513 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.7128 | 2.0 | 2 | 2.9134 | 0.0059 | 1256.7296 | 871.0986 | 73.0 | 299.0 | 0.2441 | 73.0 | 0.2441 | 0.0 | 0.0 | 64.0 | 0.0 | 0.0 | 73.0 | 73.0 | 73.0 | 1.0 | 1.0 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.1696 | 3.0 | 3 | 2.2709 | 0.0059 | 979.5827 | 678.9950 | 76.0 | 299.0 | 0.2542 | 4.0 | 0.0134 | 1.0 | 22.0 | 64.0 | 0.3438 | 0.0156 | 2.0 | 50.0 | 73.0 | 0.6849 | 0.0274 | 1.0 | 1.0 | 78.0 | 0.0128 | 0.0128 | 0.0 | 3.0 | 83.0 | 0.0361 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.8314 | 4.0 | 4 | 1.8837 | 0.0059 | 812.5640 | 563.2265 | 75.0 | 299.0 | 0.2508 | 70.0 | 0.2341 | 0.0 | 0.0 | 64.0 | 0.0 | 0.0 | 68.0 | 73.0 | 73.0 | 1.0 | 0.9315 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 2.0 | 2.0 | 83.0 | 0.0241 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.4234 | 5.0 | 5 | 2.7987 | 0.0059 | 1207.2848 | 836.8261 | 73.0 | 299.0 | 0.2441 | 71.0 | 0.2375 | 0.0 | 0.0 | 64.0 | 0.0 | 0.0 | 71.0 | 73.0 | 73.0 | 1.0 | 0.9726 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.1618 | 6.0 | 6 | 3.2839 | 0.0059 | 1416.5756 | 981.8954 | 75.0 | 299.0 | 0.2508 | 48.0 | 0.1605 | 0.0 | 0.0 | 64.0 | 0.0 | 0.0 | 47.0 | 73.0 | 73.0 | 1.0 | 0.6438 | 1.0 | 2.0 | 78.0 | 0.0256 | 0.0128 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0192 | 7.0 | 7 | 3.7217 | 0.0059 | 1605.4263 | 1112.7967 | 74.0 | 299.0 | 0.2475 | 23.0 | 0.0769 | 0.0 | 0.0 | 64.0 | 0.0 | 0.0 | 20.0 | 69.0 | 73.0 | 0.9452 | 0.2740 | 1.0 | 3.0 | 78.0 | 0.0385 | 0.0128 | 2.0 | 2.0 | 83.0 | 0.0241 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0008 | 8.0 | 8 | 3.9983 | 0.0059 | 1724.7310 | 1195.4925 | 74.0 | 299.0 | 0.2475 | 17.0 | 0.0569 | 0.0 | 1.0 | 64.0 | 0.0156 | 0.0 | 14.0 | 66.0 | 73.0 | 0.9041 | 0.1918 | 1.0 | 5.0 | 78.0 | 0.0641 | 0.0128 | 2.0 | 2.0 | 83.0 | 0.0241 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0002 | 9.0 | 9 | 4.2129 | 0.0059 | 1817.3188 | 1259.6694 | 72.0 | 299.0 | 0.2408 | 16.0 | 0.0535 | 0.0 | 1.0 | 64.0 | 0.0156 | 0.0 | 13.0 | 64.0 | 73.0 | 0.8767 | 0.1781 | 1.0 | 5.0 | 78.0 | 0.0641 | 0.0128 | 2.0 | 2.0 | 83.0 | 0.0241 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0001 | 10.0 | 10 | 4.3845 | 0.0059 | 1891.3416 | 1310.9781 | 72.0 | 299.0 | 0.2408 | 16.0 | 0.0535 | 0.0 | 1.0 | 64.0 | 0.0156 | 0.0 | 13.0 | 63.0 | 73.0 | 0.8630 | 0.1781 | 1.0 | 5.0 | 78.0 | 0.0641 | 0.0128 | 2.0 | 3.0 | 83.0 | 0.0361 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0001 | 11.0 | 11 | 4.4976 | 0.0059 | 1940.0996 | 1344.7746 | 74.0 | 299.0 | 0.2475 | 15.0 | 0.0502 | 0.0 | 1.0 | 64.0 | 0.0156 | 0.0 | 12.0 | 63.0 | 73.0 | 0.8630 | 0.1644 | 1.0 | 6.0 | 78.0 | 0.0769 | 0.0128 | 2.0 | 4.0 | 83.0 | 0.0482 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 12 | 4.5953 | 0.0059 | 1982.2350 | 1373.9806 | 73.0 | 299.0 | 0.2441 | 15.0 | 0.0502 | 0.0 | 1.0 | 64.0 | 0.0156 | 0.0 | 12.0 | 61.0 | 73.0 | 0.8356 | 0.1644 | 2.0 | 7.0 | 78.0 | 0.0897 | 0.0256 | 1.0 | 4.0 | 83.0 | 0.0482 | 0.0120 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 13 | 4.6757 | 0.0059 | 2016.9523 | 1398.0448 | 74.0 | 299.0 | 0.2475 | 15.0 | 0.0502 | 0.0 | 1.0 | 64.0 | 0.0156 | 0.0 | 11.0 | 61.0 | 73.0 | 0.8356 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 4.0 | 83.0 | 0.0482 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 14 | 4.7315 | 0.0059 | 2041.0281 | 1414.7328 | 74.0 | 299.0 | 0.2475 | 14.0 | 0.0468 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 1.0 | 4.0 | 83.0 | 0.0482 | 0.0120 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 15 | 4.7760 | 0.0059 | 2060.2072 | 1428.0268 | 74.0 | 299.0 | 0.2475 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 4.0 | 83.0 | 0.0482 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 16 | 4.8067 | 0.0059 | 2073.4479 | 1437.2046 | 74.0 | 299.0 | 0.2475 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 4.0 | 83.0 | 0.0482 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 17 | 4.8319 | 0.0059 | 2084.3267 | 1444.7451 | 73.0 | 299.0 | 0.2441 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 59.0 | 73.0 | 0.8082 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 4.0 | 83.0 | 0.0482 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 18 | 4.8644 | 0.0059 | 2098.3559 | 1454.4695 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 19 | 4.8834 | 0.0059 | 2106.5338 | 1460.1380 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 20 | 4.8959 | 0.0059 | 2111.9169 | 1463.8692 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 21 | 4.9055 | 0.0059 | 2116.0587 | 1466.7401 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 22 | 4.9121 | 0.0059 | 2118.9142 | 1468.7194 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 23 | 4.9255 | 0.0059 | 2124.6990 | 1472.7291 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 24 | 4.9346 | 0.0059 | 2128.6175 | 1475.4452 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 25 | 4.9462 | 0.0059 | 2133.6370 | 1478.9245 | 74.0 | 299.0 | 0.2475 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 59.0 | 73.0 | 0.8082 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 26 | 4.9465 | 0.0059 | 2133.7463 | 1479.0002 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 27 | 4.9520 | 0.0059 | 2136.1279 | 1480.6510 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 59.0 | 73.0 | 0.8082 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 28 | 4.9488 | 0.0059 | 2134.7464 | 1479.6934 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 29 | 4.9554 | 0.0059 | 2137.6001 | 1481.6715 | 74.0 | 299.0 | 0.2475 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 59.0 | 73.0 | 0.8082 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 30 | 4.9554 | 0.0059 | 2137.6041 | 1481.6743 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 31 | 4.9607 | 0.0059 | 2139.8614 | 1483.2389 | 77.0 | 299.0 | 0.2575 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 10.0 | 78.0 | 0.1282 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 32 | 4.9608 | 0.0059 | 2139.9391 | 1483.2928 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 33 | 4.9612 | 0.0059 | 2140.0986 | 1483.4033 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 34 | 4.9602 | 0.0059 | 2139.6793 | 1483.1127 | 77.0 | 299.0 | 0.2575 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 10.0 | 78.0 | 0.1282 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 35 | 4.9670 | 0.0059 | 2142.5922 | 1485.1317 | 75.0 | 299.0 | 0.2508 | 16.0 | 0.0535 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 3.0 | 8.0 | 78.0 | 0.1026 | 0.0385 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 36 | 4.9635 | 0.0059 | 2141.0976 | 1484.0958 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 37.0 | 37 | 4.9663 | 0.0059 | 2142.2723 | 1484.9100 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 38.0 | 38 | 4.9711 | 0.0059 | 2144.3738 | 1486.3666 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 39.0 | 39 | 4.9587 | 0.0059 | 2139.0114 | 1482.6497 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 40.0 | 40 | 4.9709 | 0.0059 | 2144.2620 | 1486.2892 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 41.0 | 41 | 4.9670 | 0.0059 | 2142.5850 | 1485.1268 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 42.0 | 42 | 4.9677 | 0.0059 | 2142.8901 | 1485.3383 | 75.0 | 299.0 | 0.2508 | 16.0 | 0.0535 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 3.0 | 8.0 | 78.0 | 0.1026 | 0.0385 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 43.0 | 43 | 4.9700 | 0.0059 | 2143.8805 | 1486.0247 | 77.0 | 299.0 | 0.2575 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 10.0 | 78.0 | 0.1282 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 44.0 | 44 | 4.9743 | 0.0059 | 2145.7331 | 1487.3088 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 45.0 | 45 | 4.9644 | 0.0059 | 2141.4820 | 1484.3622 | 76.0 | 299.0 | 0.2542 | 16.0 | 0.0535 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 3.0 | 9.0 | 78.0 | 0.1154 | 0.0385 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 46.0 | 46 | 4.9724 | 0.0059 | 2144.9162 | 1486.7426 | 77.0 | 299.0 | 0.2575 | 16.0 | 0.0535 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 3.0 | 10.0 | 78.0 | 0.1282 | 0.0385 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 47.0 | 47 | 4.9662 | 0.0059 | 2142.2475 | 1484.8928 | 77.0 | 299.0 | 0.2575 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 10.0 | 78.0 | 0.1282 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 48.0 | 48 | 4.9728 | 0.0059 | 2145.0799 | 1486.8561 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 49.0 | 49 | 4.9655 | 0.0059 | 2141.9301 | 1484.6728 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 50.0 | 50 | 4.9758 | 0.0059 | 2146.4115 | 1487.7791 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 51.0 | 51 | 4.9633 | 0.0059 | 2141.0096 | 1484.0348 | 77.0 | 299.0 | 0.2575 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 10.0 | 78.0 | 0.1282 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 52.0 | 52 | 4.9658 | 0.0059 | 2142.0944 | 1484.7867 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 53.0 | 53 | 4.9699 | 0.0059 | 2143.8417 | 1485.9978 | 75.0 | 299.0 | 0.2508 | 16.0 | 0.0535 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 3.0 | 8.0 | 78.0 | 0.1026 | 0.0385 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 54.0 | 54 | 4.9681 | 0.0059 | 2143.0609 | 1485.4566 | 77.0 | 299.0 | 0.2575 | 16.0 | 0.0535 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 3.0 | 10.0 | 78.0 | 0.1282 | 0.0385 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 55.0 | 55 | 4.9687 | 0.0059 | 2143.3099 | 1485.6292 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 59.0 | 73.0 | 0.8082 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 56.0 | 56 | 4.9649 | 0.0059 | 2141.6995 | 1484.5129 | 76.0 | 299.0 | 0.2542 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 57.0 | 57 | 4.9705 | 0.0059 | 2144.0945 | 1486.1730 | 75.0 | 299.0 | 0.2508 | 16.0 | 0.0535 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 3.0 | 8.0 | 78.0 | 0.1026 | 0.0385 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 58.0 | 58 | 4.9699 | 0.0059 | 2143.8280 | 1485.9883 | 77.0 | 299.0 | 0.2575 | 16.0 | 0.0535 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 3.0 | 10.0 | 78.0 | 0.1282 | 0.0385 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 59.0 | 59 | 4.9683 | 0.0059 | 2143.1700 | 1485.5323 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 59.0 | 73.0 | 0.8082 | 0.1507 | 2.0 | 9.0 | 78.0 | 0.1154 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 60.0 | 60 | 4.9680 | 0.0059 | 2143.0255 | 1485.4321 | 77.0 | 299.0 | 0.2575 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 10.0 | 78.0 | 0.1282 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 61.0 | 61 | 4.9672 | 0.0059 | 2142.6706 | 1485.1861 | 75.0 | 299.0 | 0.2508 | 15.0 | 0.0502 | 0.0 | 2.0 | 64.0 | 0.0312 | 0.0 | 11.0 | 60.0 | 73.0 | 0.8219 | 0.1507 | 2.0 | 8.0 | 78.0 | 0.1026 | 0.0256 | 2.0 | 5.0 | 83.0 | 0.0602 | 0.0241 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
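
Reading the table: the training loss collapses to roughly zero by epoch 12 while the validation loss climbs from 1.64 to about 4.97 and then plateaus, and overall accuracy stays near the ~25% chance level of a four-choice task throughout. This pattern is consistent with memorization of a small training set rather than generalization. A quick sketch plotting a few rows transcribed from the table above:

```python
import matplotlib.pyplot as plt

# Training vs. validation loss, transcribed from selected rows of the table.
epochs = [1, 3, 5, 8, 12, 20, 31, 61]
train_loss = [1.7061, 1.1696, 0.4234, 0.0008, 0.0, 0.0, 0.0, 0.0]
val_loss = [1.6389, 2.2709, 2.7987, 3.9983, 4.5953, 4.8959, 4.9607, 4.9672]

plt.plot(epochs, train_loss, label="training loss")
plt.plot(epochs, val_loss, label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```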

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1