ARC-Easy_Llama-3.2-1B-eu8o2z86

This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset (the model name and the 570-example evaluation set suggest the ARC-Easy validation split). It achieves the following results on the evaluation set:

  • Loss: 3.1932
  • Model Preparation Time: 0.0057
  • Mdl: 2625.8782
  • Accumulated Loss: 1820.1201
  • Correct Preds: 400.0
  • Total Preds: 570.0
  • Accuracy: 0.7018
  • Correct Gen Preds: 395.0
  • Gen Accuracy: 0.6930
  • Correct Gen Preds 32: 107.0
  • Correct Preds 32: 110.0
  • Total Labels 32: 158.0
  • Accuracy 32: 0.6962
  • Gen Accuracy 32: 0.6772
  • Correct Gen Preds 33: 102.0
  • Correct Preds 33: 104.0
  • Total Labels 33: 152.0
  • Accuracy 33: 0.6842
  • Gen Accuracy 33: 0.6711
  • Correct Gen Preds 34: 107.0
  • Correct Preds 34: 107.0
  • Total Labels 34: 142.0
  • Accuracy 34: 0.7535
  • Gen Accuracy 34: 0.7535
  • Correct Gen Preds 35: 79.0
  • Correct Preds 35: 79.0
  • Total Labels 35: 118.0
  • Accuracy 35: 0.6695
  • Gen Accuracy 35: 0.6695
  • Correct Gen Preds 36: 0.0
  • Correct Preds 36: 0.0
  • Total Labels 36: 0.0
  • Accuracy 36: 0.0
  • Gen Accuracy 36: 0.0

Note: the numeric suffixes (32–36) break the metrics down by answer-choice token ID; in the Llama 3 tokenizer these IDs likely correspond to the option letters A–E (no evaluation example had label 36/E). "Gen" metrics count answers produced by generation, as opposed to answers scored by likelihood.
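The headline numbers are internally consistent under a simple interpretation (an assumption on my part, not stated in the card): the accumulated loss is the mean eval loss (in nats) summed over all 570 examples, and Mdl is that total converted to bits. A quick check:

```python
import math

total_preds = 570
correct_preds = 400
eval_loss = 3.1932           # mean cross-entropy per example, in nats
accumulated_loss = 1820.1201
mdl = 2625.8782

# Accuracy is correct predictions over total predictions.
accuracy = correct_preds / total_preds
print(round(accuracy, 4))    # 0.7018

# Accumulated loss ≈ mean loss × number of eval examples.
print(round(eval_loss * total_preds, 1))          # 1820.1

# MDL (minimum description length, in bits) ≈ accumulated nats / ln 2.
print(round(accumulated_loss / math.log(2), 1))   # 2625.9
```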

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 112
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: constant
  • lr_scheduler_warmup_ratio: 0.001
  • num_epochs: 100

Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0057 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8304 | 1.0 | 5 | 1.6613 | 0.0057 | 1366.1795 | 946.9635 | 158.0 | 570.0 | 0.2772 | 158.0 | 0.2772 | 158.0 | 158.0 | 158.0 | 1.0 | 1.0 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 0.0 | 0.0 | 142.0 | 0.0 | 0.0 | 0.0 | 0.0 | 118.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1331 | 2.0 | 10 | 1.6711 | 0.0057 | 1374.2163 | 952.5341 | 366.0 | 570.0 | 0.6421 | 327.0 | 0.5737 | 55.0 | 85.0 | 158.0 | 0.5380 | 0.3481 | 97.0 | 100.0 | 152.0 | 0.6579 | 0.6382 | 92.0 | 97.0 | 142.0 | 0.6831 | 0.6479 | 83.0 | 84.0 | 118.0 | 0.7119 | 0.7034 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8083 | 3.0 | 15 | 1.4665 | 0.0057 | 1205.9506 | 835.9012 | 374.0 | 570.0 | 0.6561 | 374.0 | 0.6561 | 84.0 | 84.0 | 158.0 | 0.5316 | 0.5316 | 115.0 | 115.0 | 152.0 | 0.7566 | 0.7566 | 86.0 | 86.0 | 142.0 | 0.6056 | 0.6056 | 89.0 | 89.0 | 118.0 | 0.7542 | 0.7542 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0009 | 4.0 | 20 | 1.8073 | 0.0057 | 1486.1779 | 1030.1400 | 392.0 | 570.0 | 0.6877 | 391.0 | 0.6860 | 120.0 | 121.0 | 158.0 | 0.7658 | 0.7595 | 92.0 | 92.0 | 152.0 | 0.6053 | 0.6053 | 97.0 | 97.0 | 142.0 | 0.6831 | 0.6831 | 82.0 | 82.0 | 118.0 | 0.6949 | 0.6949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0011 | 5.0 | 25 | 2.0931 | 0.0057 | 1721.2519 | 1193.0809 | 398.0 | 570.0 | 0.6982 | 384.0 | 0.6737 | 107.0 | 115.0 | 158.0 | 0.7278 | 0.6772 | 103.0 | 108.0 | 152.0 | 0.7105 | 0.6776 | 98.0 | 98.0 | 142.0 | 0.6901 | 0.6901 | 76.0 | 77.0 | 118.0 | 0.6525 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 6.0 | 30 | 3.1932 | 0.0057 | 2625.8782 | 1820.1201 | 400.0 | 570.0 | 0.7018 | 395.0 | 0.6930 | 107.0 | 110.0 | 158.0 | 0.6962 | 0.6772 | 102.0 | 104.0 | 152.0 | 0.6842 | 0.6711 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 79.0 | 79.0 | 118.0 | 0.6695 | 0.6695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 35 | 3.5309 | 0.0057 | 2903.6243 | 2012.6390 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 113.0 | 119.0 | 158.0 | 0.7532 | 0.7152 | 98.0 | 100.0 | 152.0 | 0.6579 | 0.6447 | 96.0 | 97.0 | 142.0 | 0.6831 | 0.6761 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 40 | 3.6311 | 0.0057 | 2986.0010 | 2069.7382 | 391.0 | 570.0 | 0.6860 | 379.0 | 0.6649 | 113.0 | 121.0 | 158.0 | 0.7658 | 0.7152 | 101.0 | 104.0 | 152.0 | 0.6842 | 0.6645 | 92.0 | 93.0 | 142.0 | 0.6549 | 0.6479 | 73.0 | 73.0 | 118.0 | 0.6186 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 45 | 3.6229 | 0.0057 | 2979.2623 | 2065.0673 | 387.0 | 570.0 | 0.6789 | 376.0 | 0.6596 | 110.0 | 116.0 | 158.0 | 0.7342 | 0.6962 | 103.0 | 106.0 | 152.0 | 0.6974 | 0.6776 | 92.0 | 94.0 | 142.0 | 0.6620 | 0.6479 | 71.0 | 71.0 | 118.0 | 0.6017 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 10.0 | 50 | 3.7087 | 0.0057 | 3049.7654 | 2113.9363 | 386.0 | 570.0 | 0.6772 | 374.0 | 0.6561 | 108.0 | 114.0 | 158.0 | 0.7215 | 0.6835 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 91.0 | 93.0 | 142.0 | 0.6549 | 0.6408 | 71.0 | 72.0 | 118.0 | 0.6102 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 55 | 3.7644 | 0.0057 | 3095.5786 | 2145.6916 | 390.0 | 570.0 | 0.6842 | 381.0 | 0.6684 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 60 | 3.8077 | 0.0057 | 3131.2368 | 2170.4080 | 388.0 | 570.0 | 0.6807 | 378.0 | 0.6632 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 71.0 | 72.0 | 118.0 | 0.6102 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 65 | 3.8259 | 0.0057 | 3146.2115 | 2180.7876 | 390.0 | 570.0 | 0.6842 | 377.0 | 0.6614 | 108.0 | 115.0 | 158.0 | 0.7278 | 0.6835 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 71.0 | 72.0 | 118.0 | 0.6102 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 70 | 3.8349 | 0.0057 | 3153.6059 | 2185.9130 | 390.0 | 570.0 | 0.6842 | 378.0 | 0.6632 | 108.0 | 114.0 | 158.0 | 0.7215 | 0.6835 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 75 | 3.8388 | 0.0057 | 3156.8245 | 2188.1440 | 390.0 | 570.0 | 0.6842 | 379.0 | 0.6649 | 108.0 | 114.0 | 158.0 | 0.7215 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 80 | 3.8554 | 0.0057 | 3170.4579 | 2197.5939 | 389.0 | 570.0 | 0.6825 | 378.0 | 0.6632 | 108.0 | 114.0 | 158.0 | 0.7215 | 0.6835 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 85 | 3.8755 | 0.0057 | 3186.9384 | 2209.0174 | 390.0 | 570.0 | 0.6842 | 378.0 | 0.6632 | 108.0 | 114.0 | 158.0 | 0.7215 | 0.6835 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 90 | 3.8996 | 0.0057 | 3206.8182 | 2222.7970 | 389.0 | 570.0 | 0.6825 | 379.0 | 0.6649 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 95 | 3.8870 | 0.0057 | 3196.4046 | 2215.5788 | 387.0 | 570.0 | 0.6789 | 379.0 | 0.6649 | 107.0 | 111.0 | 158.0 | 0.7025 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 100 | 3.9127 | 0.0057 | 3217.5481 | 2230.2344 | 390.0 | 570.0 | 0.6842 | 379.0 | 0.6649 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 105 | 3.8984 | 0.0057 | 3205.7990 | 2222.0906 | 389.0 | 570.0 | 0.6825 | 378.0 | 0.6632 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 110 | 3.8939 | 0.0057 | 3202.1060 | 2219.5307 | 387.0 | 570.0 | 0.6789 | 376.0 | 0.6596 | 107.0 | 112.0 | 158.0 | 0.7089 | 0.6772 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 93.0 | 95.0 | 142.0 | 0.6690 | 0.6549 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 115 | 3.9171 | 0.0057 | 3221.2101 | 2232.7727 | 387.0 | 570.0 | 0.6789 | 377.0 | 0.6614 | 106.0 | 111.0 | 158.0 | 0.7025 | 0.6709 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 120 | 3.9250 | 0.0057 | 3227.6495 | 2237.2361 | 388.0 | 570.0 | 0.6807 | 379.0 | 0.6649 | 106.0 | 112.0 | 158.0 | 0.7089 | 0.6709 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 125 | 3.9127 | 0.0057 | 3217.5873 | 2230.2616 | 388.0 | 570.0 | 0.6807 | 379.0 | 0.6649 | 107.0 | 112.0 | 158.0 | 0.7089 | 0.6772 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 130 | 3.9464 | 0.0057 | 3245.3046 | 2249.4738 | 387.0 | 570.0 | 0.6789 | 379.0 | 0.6649 | 106.0 | 111.0 | 158.0 | 0.7025 | 0.6709 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 135 | 3.9408 | 0.0057 | 3240.6512 | 2246.2482 | 392.0 | 570.0 | 0.6877 | 381.0 | 0.6684 | 108.0 | 115.0 | 158.0 | 0.7278 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 140 | 3.9600 | 0.0057 | 3256.4285 | 2257.1842 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 72.0 | 74.0 | 118.0 | 0.6271 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 145 | 3.9368 | 0.0057 | 3237.3594 | 2243.9665 | 388.0 | 570.0 | 0.6807 | 379.0 | 0.6649 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 150 | 3.9711 | 0.0057 | 3265.5961 | 2263.5387 | 387.0 | 570.0 | 0.6789 | 377.0 | 0.6614 | 106.0 | 111.0 | 158.0 | 0.7025 | 0.6709 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 155 | 3.9461 | 0.0057 | 3245.0572 | 2249.3022 | 387.0 | 570.0 | 0.6789 | 377.0 | 0.6614 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 104.0 | 106.0 | 152.0 | 0.6974 | 0.6842 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 160 | 3.9415 | 0.0057 | 3241.2685 | 2246.6761 | 388.0 | 570.0 | 0.6807 | 379.0 | 0.6649 | 107.0 | 112.0 | 158.0 | 0.7089 | 0.6772 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 165 | 3.9653 | 0.0057 | 3260.8288 | 2260.2343 | 387.0 | 570.0 | 0.6789 | 377.0 | 0.6614 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 71.0 | 72.0 | 118.0 | 0.6102 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 170 | 3.9574 | 0.0057 | 3254.3170 | 2255.7207 | 389.0 | 570.0 | 0.6825 | 379.0 | 0.6649 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 175 | 3.9492 | 0.0057 | 3247.5332 | 2251.0185 | 389.0 | 570.0 | 0.6825 | 380.0 | 0.6667 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 180 | 3.9505 | 0.0057 | 3248.6218 | 2251.7731 | 389.0 | 570.0 | 0.6825 | 379.0 | 0.6649 | 106.0 | 112.0 | 158.0 | 0.7089 | 0.6709 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 37.0 | 185 | 3.9734 | 0.0057 | 3267.4358 | 2264.8139 | 389.0 | 570.0 | 0.6825 | 378.0 | 0.6632 | 106.0 | 112.0 | 158.0 | 0.7089 | 0.6709 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 38.0 | 190 | 3.9491 | 0.0057 | 3247.4605 | 2250.9681 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 39.0 | 195 | 3.9277 | 0.0057 | 3229.9181 | 2238.8086 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 109.0 | 115.0 | 158.0 | 0.7278 | 0.6899 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 71.0 | 72.0 | 118.0 | 0.6102 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 40.0 | 200 | 3.9626 | 0.0057 | 3258.5530 | 2258.6568 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 108.0 | 115.0 | 158.0 | 0.7278 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 41.0 | 205 | 4.0134 | 0.0057 | 3300.3636 | 2287.6377 | 389.0 | 570.0 | 0.6825 | 378.0 | 0.6632 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 104.0 | 106.0 | 152.0 | 0.6974 | 0.6842 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 42.0 | 210 | 3.9807 | 0.0057 | 3273.4522 | 2268.9842 | 391.0 | 570.0 | 0.6860 | 381.0 | 0.6684 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 43.0 | 215 | 3.9913 | 0.0057 | 3282.1577 | 2275.0184 | 390.0 | 570.0 | 0.6842 | 378.0 | 0.6632 | 107.0 | 114.0 | 158.0 | 0.7215 | 0.6772 | 104.0 | 107.0 | 152.0 | 0.7039 | 0.6842 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 44.0 | 220 | 3.9678 | 0.0057 | 3262.8282 | 2261.6202 | 391.0 | 570.0 | 0.6860 | 382.0 | 0.6702 | 109.0 | 115.0 | 158.0 | 0.7278 | 0.6899 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 45.0 | 225 | 4.0035 | 0.0057 | 3292.2250 | 2281.9965 | 391.0 | 570.0 | 0.6860 | 382.0 | 0.6702 | 108.0 | 114.0 | 158.0 | 0.7215 | 0.6835 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 96.0 | 96.0 | 142.0 | 0.6761 | 0.6761 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 46.0 | 230 | 4.0396 | 0.0057 | 3321.8804 | 2302.5520 | 388.0 | 570.0 | 0.6807 | 379.0 | 0.6649 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 47.0 | 235 | 4.0283 | 0.0057 | 3312.5759 | 2296.1026 | 388.0 | 570.0 | 0.6807 | 378.0 | 0.6632 | 106.0 | 112.0 | 158.0 | 0.7089 | 0.6709 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 94.0 | 95.0 | 142.0 | 0.6690 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 48.0 | 240 | 4.0130 | 0.0057 | 3300.0691 | 2287.4336 | 391.0 | 570.0 | 0.6860 | 380.0 | 0.6667 | 108.0 | 114.0 | 158.0 | 0.7215 | 0.6835 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 94.0 | 96.0 | 142.0 | 0.6761 | 0.6620 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 49.0 | 245 | 3.9892 | 0.0057 | 3280.4436 | 2273.8302 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 106.0 | 112.0 | 158.0 | 0.7089 | 0.6709 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 50.0 | 250 | 4.0152 | 0.0057 | 3301.8079 | 2288.6388 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 51.0 | 255 | 4.0175 | 0.0057 | 3303.6953 | 2289.9471 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 95.0 | 96.0 | 142.0 | 0.6761 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 52.0 | 260 | 4.0003 | 0.0057 | 3289.5831 | 2280.1653 | 387.0 | 570.0 | 0.6789 | 379.0 | 0.6649 | 107.0 | 112.0 | 158.0 | 0.7089 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 53.0 | 265 | 4.0292 | 0.0057 | 3313.3186 | 2296.6174 | 387.0 | 570.0 | 0.6789 | 379.0 | 0.6649 | 107.0 | 112.0 | 158.0 | 0.7089 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 54.0 | 270 | 4.0353 | 0.0057 | 3318.4116 | 2300.1476 | 388.0 | 570.0 | 0.6807 | 380.0 | 0.6667 | 108.0 | 113.0 | 158.0 | 0.7152 | 0.6835 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 55.0 | 275 | 4.0392 | 0.0057 | 3321.6008 | 2302.3582 | 389.0 | 570.0 | 0.6825 | 380.0 | 0.6667 | 107.0 | 113.0 | 158.0 | 0.7152 | 0.6772 | 105.0 | 107.0 | 152.0 | 0.7039 | 0.6908 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 73.0 | 74.0 | 118.0 | 0.6271 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 56.0 | 280 | 4.0247 | 0.0057 | 3309.6287 | 2294.0598 | 390.0 | 570.0 | 0.6842 | 380.0 | 0.6667 | 107.0 | 114.0 | 158.0 | 0.7215 | 0.6772 | 106.0 | 108.0 | 152.0 | 0.7105 | 0.6974 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 72.0 | 73.0 | 118.0 | 0.6186 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
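As a sanity check on the per-choice breakdown, the counts for the reported checkpoint (the epoch-6 row, whose metrics match the summary at the top of the card) sum to the overall totals. The label keys below are the token-ID suffixes from the table:

```python
# Correct predictions and label totals per answer-choice token ID,
# taken from the epoch-6 row of the training-results table.
correct_by_choice = {"32": 110, "33": 104, "34": 107, "35": 79, "36": 0}
totals_by_choice  = {"32": 158, "33": 152, "34": 142, "35": 118, "36": 0}

total_correct = sum(correct_by_choice.values())
total_labels = sum(totals_by_choice.values())

print(total_correct)                            # 400
print(total_labels)                             # 570
print(round(total_correct / total_labels, 4))   # 0.7018
```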

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1
Safetensors

  • Model size: 1B params
  • Tensor type: BF16

Model tree for donoway/ARC-Easy_Llama-3.2-1B-eu8o2z86

  • Finetuned from meta-llama/Llama-3.2-1B