| Checkpoint | Step | ASSIN2 RTE | ASSIN2 STS | BLUEX | ENEM | FAQUAD NLI | HateBR | OAB | PT Hate Speech | TweetSentBR | ARC Challenge | ASSIN2 ENT | ASSIN2 PAR | BELEBELE | CALAME | Global PIQA | HellaSwag | LAMBADA | MMLU |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
stage1-step-40000 | 40,000 | 0.536992 | 0.135605 | 0.200278 | 0.198041 | 0.537398 | 0.587515 | 0.232346 | 0.555518 | 0.330392 | 0.288034 | 0.58075 | 0.6365 | 0.23 | 0.513006 | 0.65 | 0.365045 | 0.472346 | 0.22846 |
stage1-step-80000 | 80,000 | 0.710754 | 0.001046 | 0.191933 | 0.193842 | 0.439172 | 0.391485 | 0.236446 | 0.429173 | 0.491794 | 0.307692 | 0.64525 | 0.65525 | 0.278889 | 0.535645 | 0.65 | 0.384874 | 0.519891 | 0.231837 |
stage1-step-120000 | 120,000 | 0.570215 | 0.013354 | 0.190542 | 0.193842 | 0.439655 | 0.333333 | 0.230068 | 0.412293 | 0.409173 | 0.317094 | 0.651 | 0.67775 | 0.247778 | 0.540462 | 0.65 | 0.396901 | 0.544925 | 0.256304 |
stage1-step-160000 | 160,000 | 0.576529 | 0.204141 | 0.215577 | 0.188943 | 0.454499 | 0.432651 | 0.234169 | 0.433497 | 0.449791 | 0.330769 | 0.59775 | 0.67375 | 0.283333 | 0.557803 | 0.65 | 0.404161 | 0.521056 | 0.276418 |
stage1-step-200000 | 200,000 | 0.370803 | 0.231029 | 0.240612 | 0.205038 | 0.567982 | 0.599752 | 0.268793 | 0.453647 | 0.588209 | 0.354701 | 0.62725 | 0.66775 | 0.278889 | 0.568401 | 0.71 | 0.410987 | 0.527072 | 0.253753 |
stage1-step-240000 | 240,000 | 0.633791 | 0.151796 | 0.248957 | 0.215535 | 0.439655 | 0.333333 | 0.256492 | 0.411887 | 0.354398 | 0.364957 | 0.6485 | 0.707 | 0.267778 | 0.561657 | 0.7 | 0.418355 | 0.552688 | 0.265236 |
stage1-step-280000 | 280,000 | 0.542939 | 0.273415 | 0.264256 | 0.221134 | 0.486741 | 0.464166 | 0.275171 | 0.48194 | 0.411027 | 0.35641 | 0.6355 | 0.66375 | 0.3 | 0.551541 | 0.67 | 0.415646 | 0.520668 | 0.267262 |
stage1-step-320000 | 320,000 | 0.518775 | 0.159461 | 0.276773 | 0.250525 | 0.439655 | 0.356362 | 0.279727 | 0.412293 | 0.593764 | 0.374359 | 0.6495 | 0.673 | 0.348889 | 0.551541 | 0.74 | 0.422039 | 0.49641 | 0.325878 |
stage1-step-360000 | 360,000 | 0.626967 | 0.10102 | 0.25452 | 0.196641 | 0.439655 | 0.723466 | 0.256036 | 0.591185 | 0.601459 | 0.37265 | 0.59925 | 0.67675 | 0.364444 | 0.561175 | 0.76 | 0.42529 | 0.558704 | 0.332032 |
stage1-step-400000 | 400,000 | 0.629929 | 0.239424 | 0.328234 | 0.293912 | 0.481513 | 0.649884 | 0.289749 | 0.423943 | 0.601704 | 0.392308 | 0.66675 | 0.6805 | 0.401111 | 0.55973 | 0.73 | 0.426698 | 0.513293 | 0.367757 |
stage1-step-440000 | 440,000 | 0.388877 | 0.297344 | 0.332406 | 0.29881 | 0.439655 | 0.498656 | 0.306606 | 0.450795 | 0.573286 | 0.408547 | 0.68525 | 0.6715 | 0.408889 | 0.55973 | 0.72 | 0.426915 | 0.525131 | 0.37834 |
stage1-step-480000 | 480,000 | 0.717266 | 0.368759 | 0.343533 | 0.214136 | 0.473175 | 0.631218 | 0.324829 | 0.594541 | 0.440072 | 0.384615 | 0.6165 | 0.70925 | 0.446667 | 0.552023 | 0.71 | 0.427024 | 0.546672 | 0.384869 |
stage1-step-520000 | 520,000 | 0.641497 | 0.356893 | 0.350487 | 0.354094 | 0.439655 | 0.662592 | 0.312984 | 0.441088 | 0.604259 | 0.38547 | 0.65975 | 0.6845 | 0.415556 | 0.545279 | 0.76 | 0.433958 | 0.569959 | 0.386446 |
stage1-step-560000 | 560,000 | 0.64879 | 0.242047 | 0.342142 | 0.296711 | 0.439655 | 0.718582 | 0.316173 | 0.543387 | 0.641726 | 0.396581 | 0.64925 | 0.6755 | 0.414444 | 0.542871 | 0.7 | 0.433741 | 0.573064 | 0.396578 |
stage1-step-600000 | 600,000 | 0.613073 | 0.305794 | 0.337969 | 0.322603 | 0.439655 | 0.733815 | 0.303872 | 0.561042 | 0.605863 | 0.382051 | 0.70225 | 0.68 | 0.423333 | 0.552987 | 0.71 | 0.433416 | 0.559674 | 0.380516 |
stage1-step-640000 | 640,000 | 0.704408 | 0.08288 | 0.364395 | 0.337999 | 0.439655 | 0.5809 | 0.319362 | 0.443664 | 0.662351 | 0.373504 | 0.62875 | 0.682 | 0.437778 | 0.55106 | 0.74 | 0.435909 | 0.556569 | 0.387196 |
stage1-step-680000 | 680,000 | 0.771642 | 0.048225 | 0.379694 | 0.326802 | 0.439655 | 0.650577 | 0.337585 | 0.632979 | 0.651809 | 0.378632 | 0.65525 | 0.694 | 0.421111 | 0.546243 | 0.74 | 0.43515 | 0.533476 | 0.396953 |
stage1-step-720000 | 720,000 | 0.446375 | 0.380067 | 0.311544 | 0.356893 | 0.446593 | 0.784644 | 0.311162 | 0.55992 | 0.634502 | 0.399145 | 0.67525 | 0.682 | 0.428889 | 0.560212 | 0.7 | 0.438509 | 0.538715 | 0.404158 |
stage1-step-760000 | 760,000 | 0.74748 | 0.264636 | 0.349096 | 0.43247 | 0.550061 | 0.768703 | 0.307517 | 0.595714 | 0.626087 | 0.411111 | 0.62925 | 0.67325 | 0.472222 | 0.555395 | 0.74 | 0.432441 | 0.572094 | 0.413539 |
stage1-step-800000 | 800,000 | 0.668896 | 0.340173 | 0.368567 | 0.431071 | 0.439655 | 0.787854 | 0.312073 | 0.493537 | 0.514028 | 0.375214 | 0.647 | 0.68225 | 0.454444 | 0.566956 | 0.71 | 0.435475 | 0.587231 | 0.42855 |
stage1-step-840000 | 840,000 | 0.774281 | 0.361727 | 0.38943 | 0.416375 | 0.439655 | 0.767892 | 0.342597 | 0.600714 | 0.646696 | 0.394017 | 0.65625 | 0.701 | 0.465556 | 0.553468 | 0.74 | 0.439809 | 0.541044 | 0.424197 |
stage1-step-880000 | 880,000 | 0.371012 | 0.483009 | 0.392211 | 0.448565 | 0.490614 | 0.663828 | 0.330752 | 0.44397 | 0.684357 | 0.388034 | 0.576 | 0.6435 | 0.487778 | 0.555877 | 0.72 | 0.438942 | 0.572288 | 0.426974 |
stage1-step-920000 | 920,000 | 0.633963 | 0.162747 | 0.399166 | 0.40098 | 0.439655 | 0.792806 | 0.344875 | 0.369791 | 0.641099 | 0.382051 | 0.638 | 0.66775 | 0.514444 | 0.563102 | 0.74 | 0.441001 | 0.520474 | 0.42127 |
stage1-step-960000 | 960,000 | 0.405934 | 0.192953 | 0.386648 | 0.36599 | 0.439172 | 0.723119 | 0.368109 | 0.572075 | 0.674992 | 0.398291 | 0.61025 | 0.71225 | 0.52 | 0.542871 | 0.69 | 0.441868 | 0.536387 | 0.435005 |
stage1-step-1000000 | 1,000,000 | 0.349468 | 0.177411 | 0.369958 | 0.370889 | 0.47215 | 0.658071 | 0.312984 | 0.265757 | 0.545916 | 0.382051 | 0.6665 | 0.648 | 0.482222 | 0.552023 | 0.71 | 0.444577 | 0.548224 | 0.411363 |
stage1-step-1040000 | 1,040,000 | 0.75217 | 0.354877 | 0.376912 | 0.409377 | 0.439655 | 0.729902 | 0.344419 | 0.480007 | 0.580603 | 0.393162 | 0.612 | 0.71875 | 0.514444 | 0.561657 | 0.72 | 0.441218 | 0.564914 | 0.444311 |
stage1-step-1080000 | 1,080,000 | 0.44635 | 0.239831 | 0.374131 | 0.386284 | 0.439655 | 0.825989 | 0.336219 | 0.327819 | 0.631917 | 0.408547 | 0.669 | 0.701 | 0.476667 | 0.565029 | 0.69 | 0.446094 | 0.548418 | 0.373612 |
stage1-step-1120000 | 1,120,000 | 0.641409 | 0.206447 | 0.318498 | 0.324003 | 0.439655 | 0.76579 | 0.302506 | 0.34413 | 0.650984 | 0.384615 | 0.6775 | 0.69325 | 0.442222 | 0.546724 | 0.77 | 0.44176 | 0.549195 | 0.395977 |
stage1-step-1160000 | 1,160,000 | 0.503985 | 0.279669 | 0.381085 | 0.364591 | 0.439655 | 0.760588 | 0.36082 | 0.42655 | 0.663106 | 0.399145 | 0.6175 | 0.64025 | 0.503333 | 0.542871 | 0.71 | 0.442627 | 0.552688 | 0.425248 |
stage1-step-1200000 | 1,200,000 | 0.764705 | 0.074547 | 0.394993 | 0.303009 | 0.439655 | 0.822186 | 0.321185 | 0.426195 | 0.643325 | 0.433333 | 0.62925 | 0.681 | 0.502222 | 0.560694 | 0.69 | 0.439159 | 0.547836 | 0.438307 |
stage1-step-1240000 | 1,240,000 | 0.33424 | 0.138498 | 0.365786 | 0.377187 | 0.462935 | 0.75599 | 0.343964 | 0.276361 | 0.662321 | 0.395726 | 0.61075 | 0.624 | 0.472222 | 0.55106 | 0.73 | 0.446636 | 0.588007 | 0.415641 |
stage1-step-1280000 | 1,280,000 | 0.737883 | 0.16205 | 0.400556 | 0.430371 | 0.439655 | 0.783218 | 0.349886 | 0.595693 | 0.680865 | 0.403419 | 0.6785 | 0.64 | 0.518889 | 0.561175 | 0.71 | 0.443927 | 0.528624 | 0.444086 |
stage1-step-1320000 | 1,320,000 | 0.719702 | 0.218825 | 0.382476 | 0.43247 | 0.439655 | 0.758422 | 0.330752 | 0.607118 | 0.665894 | 0.408547 | 0.66425 | 0.713 | 0.483333 | 0.562139 | 0.74 | 0.448586 | 0.567048 | 0.42885 |
stage1-step-1360000 | 1,360,000 | 0.813828 | 0.202461 | 0.401947 | 0.426172 | 0.439655 | 0.709825 | 0.369932 | 0.623485 | 0.574544 | 0.406838 | 0.6775 | 0.687 | 0.523333 | 0.554913 | 0.72 | 0.443168 | 0.535804 | 0.438307 |
stage1-step-1400000 | 1,400,000 | 0.804183 | 0.258435 | 0.388039 | 0.373688 | 0.439655 | 0.683504 | 0.326196 | 0.546128 | 0.513116 | 0.366667 | 0.6465 | 0.69525 | 0.495556 | 0.554432 | 0.74 | 0.42724 | 0.543373 | 0.41444 |
stage1-step-1440000 | 1,440,000 | 0.813097 | 0.379111 | 0.418637 | 0.410777 | 0.439655 | 0.840828 | 0.340774 | 0.494041 | 0.646765 | 0.408547 | 0.621 | 0.68525 | 0.54 | 0.545279 | 0.74 | 0.446094 | 0.529012 | 0.420669 |
stage1-step-1480000 | 1,480,000 | 0.767955 | 0.249124 | 0.421419 | 0.460462 | 0.439655 | 0.741377 | 0.356264 | 0.593723 | 0.656245 | 0.416239 | 0.62125 | 0.66475 | 0.522222 | 0.550096 | 0.77 | 0.447286 | 0.525325 | 0.423597 |
stage1-step-1520000 | 1,520,000 | 0.527339 | 0.176503 | 0.417246 | 0.406578 | 0.439655 | 0.604555 | 0.372665 | 0.556135 | 0.598662 | 0.416239 | 0.6805 | 0.71 | 0.551111 | 0.558285 | 0.75 | 0.447828 | 0.547836 | 0.418868 |
stage1-step-1560000 | 1,560,000 | 0.620003 | 0.413816 | 0.396384 | 0.451365 | 0.512525 | 0.790571 | 0.369021 | 0.283489 | 0.575772 | 0.401709 | 0.62625 | 0.6865 | 0.534444 | 0.572254 | 0.73 | 0.444143 | 0.594411 | 0.4284 |
stage1-step-1600000 | 1,600,000 | 0.622543 | 0.369884 | 0.399166 | 0.459762 | 0.478948 | 0.828559 | 0.381777 | 0.403448 | 0.637619 | 0.412821 | 0.6265 | 0.677 | 0.536667 | 0.558285 | 0.77 | 0.443168 | 0.562391 | 0.431927 |
stage1-step-1640000 | 1,640,000 | 0.422941 | 0.218492 | 0.40612 | 0.396081 | 0.488254 | 0.759904 | 0.347608 | 0.356526 | 0.544935 | 0.405983 | 0.60275 | 0.61775 | 0.508889 | 0.563584 | 0.72 | 0.445877 | 0.558898 | 0.438232 |
stage1-step-1680000 | 1,680,000 | 0.663484 | 0.256198 | 0.400556 | 0.373688 | 0.439655 | 0.733749 | 0.370387 | 0.589519 | 0.652934 | 0.405983 | 0.584 | 0.6705 | 0.562222 | 0.560694 | 0.76 | 0.44501 | 0.538133 | 0.431777 |
stage1-step-1720000 | 1,720,000 | 0.592906 | 0.119232 | 0.417246 | 0.43247 | 0.439655 | 0.811034 | 0.363554 | 0.448122 | 0.593502 | 0.403419 | 0.5945 | 0.689 | 0.531111 | 0.557803 | 0.72 | 0.447936 | 0.579468 | 0.43628 |
stage1-step-1760000 | 1,760,000 | 0.711667 | 0.136977 | 0.443672 | 0.428971 | 0.439655 | 0.726473 | 0.382232 | 0.595432 | 0.655417 | 0.412821 | 0.6435 | 0.6945 | 0.542222 | 0.575145 | 0.75 | 0.446202 | 0.591694 | 0.447838 |
stage1-step-1800000 | 1,800,000 | 0.757219 | 0.35148 | 0.418637 | 0.422673 | 0.439655 | 0.773955 | 0.358087 | 0.574946 | 0.553738 | 0.417094 | 0.66475 | 0.691 | 0.541111 | 0.575626 | 0.75 | 0.447069 | 0.570153 | 0.443636 |
stage1-step-1840000 | 1,840,000 | 0.77229 | 0.351594 | 0.433936 | 0.321204 | 0.439655 | 0.654267 | 0.36082 | 0.588053 | 0.616521 | 0.422222 | 0.65075 | 0.6825 | 0.592222 | 0.573218 | 0.76 | 0.447611 | 0.565108 | 0.42885 |
stage1-step-1880000 | 1,880,000 | 0.765616 | 0.223381 | 0.410292 | 0.414276 | 0.439655 | 0.767007 | 0.357631 | 0.502762 | 0.663326 | 0.426496 | 0.66175 | 0.67575 | 0.554444 | 0.566956 | 0.79 | 0.447611 | 0.542014 | 0.421645 |
stage1-step-1920000 | 1,920,000 | 0.759277 | 0.37104 | 0.397775 | 0.421274 | 0.439655 | 0.669001 | 0.368565 | 0.634621 | 0.662505 | 0.419658 | 0.66725 | 0.6515 | 0.585556 | 0.561175 | 0.76 | 0.45032 | 0.549389 | 0.440633 |
stage1-step-1960000 | 1,960,000 | 0.757078 | 0.410956 | 0.40751 | 0.444367 | 0.439655 | 0.814905 | 0.363554 | 0.474167 | 0.641162 | 0.409402 | 0.67925 | 0.64975 | 0.555556 | 0.566956 | 0.73 | 0.446961 | 0.537163 | 0.444686 |
stage1-step-2000000 | 2,000,000 | 0.763774 | 0.262845 | 0.388039 | 0.413576 | 0.439655 | 0.725094 | 0.335763 | 0.588966 | 0.406028 | 0.42906 | 0.7155 | 0.709 | 0.528889 | 0.564547 | 0.73 | 0.45032 | 0.60489 | 0.439282 |
stage1-step-2040000 | 2,040,000 | 0.526257 | 0.315621 | 0.442281 | 0.405178 | 0.439655 | 0.607641 | 0.358998 | 0.581838 | 0.689445 | 0.423077 | 0.65625 | 0.701 | 0.55 | 0.55684 | 0.73 | 0.454762 | 0.583738 | 0.423296 |
stage1-step-2080000 | 2,080,000 | 0.757609 | 0.310739 | 0.344924 | 0.380686 | 0.439655 | 0.752212 | 0.319362 | 0.564102 | 0.60616 | 0.409402 | 0.6845 | 0.70875 | 0.524444 | 0.562139 | 0.73 | 0.447502 | 0.555211 | 0.423446 |
stage1-step-2120000 | 2,120,000 | 0.547098 | 0.3259 | 0.35605 | 0.36809 | 0.438202 | 0.82753 | 0.336674 | 0.379328 | 0.594332 | 0.407692 | 0.67525 | 0.68925 | 0.555556 | 0.560694 | 0.76 | 0.449561 | 0.555987 | 0.446788 |
stage1-step-2160000 | 2,160,000 | 0.757382 | 0.169808 | 0.435327 | 0.473058 | 0.439655 | 0.780662 | 0.365376 | 0.44397 | 0.626133 | 0.426496 | 0.573 | 0.66225 | 0.55 | 0.570328 | 0.74 | 0.446527 | 0.550747 | 0.457145 |
stage1-step-2200000 | 2,200,000 | 0.354757 | 0.243983 | 0.425591 | 0.389783 | 0.437229 | 0.76647 | 0.350342 | 0.345233 | 0.601742 | 0.425641 | 0.6045 | 0.649 | 0.543333 | 0.568882 | 0.71 | 0.45162 | 0.556375 | 0.453843 |
stage1-step-2240000 | 2,240,000 | 0.688481 | 0.297774 | 0.436718 | 0.438069 | 0.439655 | 0.492018 | 0.364465 | 0.525901 | 0.534078 | 0.401709 | 0.7135 | 0.71275 | 0.571111 | 0.55395 | 0.74 | 0.452595 | 0.573064 | 0.455194 |
stage1-step-2280000 | 2,280,000 | 0.655297 | 0.353886 | 0.410292 | 0.449965 | 0.439655 | 0.792179 | 0.358998 | 0.333619 | 0.493175 | 0.423077 | 0.65575 | 0.674 | 0.606667 | 0.572254 | 0.78 | 0.45422 | 0.563749 | 0.458346 |
stage1-step-2320000 | 2,320,000 | 0.716143 | 0.067279 | 0.397775 | 0.459762 | 0.439655 | 0.787588 | 0.379954 | 0.519259 | 0.534587 | 0.436752 | 0.68425 | 0.702 | 0.575556 | 0.561657 | 0.75 | 0.451728 | 0.58626 | 0.460222 |
stage1-step-2360000 | 2,360,000 | 0.819996 | 0.400314 | 0.425591 | 0.477957 | 0.439172 | 0.639812 | 0.392711 | 0.597826 | 0.678015 | 0.425641 | 0.63225 | 0.6945 | 0.574444 | 0.568401 | 0.77 | 0.449886 | 0.573452 | 0.448814 |
stage1-step-2400000 | 2,400,000 | 0.763702 | 0.434753 | 0.421419 | 0.495451 | 0.439655 | 0.660294 | 0.353986 | 0.575253 | 0.534534 | 0.437607 | 0.615 | 0.71475 | 0.584444 | 0.569846 | 0.73 | 0.453354 | 0.565302 | 0.467202 |
stage1-step-2440000 | 2,440,000 | 0.759664 | 0.261602 | 0.422809 | 0.447866 | 0.447127 | 0.644745 | 0.398633 | 0.535748 | 0.541116 | 0.428205 | 0.65925 | 0.70275 | 0.577778 | 0.555877 | 0.71 | 0.445444 | 0.572288 | 0.448664 |
stage1-step-2480000 | 2,480,000 | 0.817129 | 0.42701 | 0.4242 | 0.428272 | 0.439655 | 0.747658 | 0.38451 | 0.534613 | 0.66992 | 0.412821 | 0.66925 | 0.69825 | 0.583333 | 0.566956 | 0.75 | 0.448803 | 0.565108 | 0.445137 |
stage1-step-2520000 | 2,520,000 | 0.486899 | 0.270195 | 0.378303 | 0.370889 | 0.439655 | 0.716577 | 0.367198 | 0.464587 | 0.549884 | 0.428205 | 0.66725 | 0.6565 | 0.56 | 0.55973 | 0.74 | 0.450753 | 0.534834 | 0.418043 |
stage1-step-2560000 | 2,560,000 | 0.576771 | 0.417793 | 0.431154 | 0.462561 | 0.439655 | 0.78243 | 0.365376 | 0.287729 | 0.364914 | 0.412821 | 0.66475 | 0.6775 | 0.571111 | 0.540944 | 0.77 | 0.449236 | 0.545702 | 0.453392 |
stage1-step-2600000 | 2,600,000 | 0.599538 | 0.387852 | 0.364395 | 0.422673 | 0.439655 | 0.796391 | 0.35262 | 0.389892 | 0.596435 | 0.421368 | 0.64175 | 0.655 | 0.536667 | 0.566474 | 0.74 | 0.452595 | 0.594799 | 0.441684 |
stage1-step-2640000 | 2,640,000 | 0.782638 | 0.397861 | 0.414465 | 0.452764 | 0.439655 | 0.603524 | 0.380866 | 0.603855 | 0.651406 | 0.422222 | 0.6415 | 0.675 | 0.571111 | 0.569364 | 0.75 | 0.453679 | 0.563749 | 0.466527 |
stage1-step-2680000 | 2,680,000 | 0.556982 | 0.457701 | 0.410292 | 0.39888 | 0.476589 | 0.839992 | 0.335308 | 0.42553 | 0.52462 | 0.419658 | 0.5835 | 0.6345 | 0.558889 | 0.572254 | 0.74 | 0.45357 | 0.588395 | 0.440183 |
stage1-step-2720000 | 2,720,000 | 0.728107 | 0.372791 | 0.40751 | 0.458362 | 0.447127 | 0.746801 | 0.361731 | 0.55825 | 0.591587 | 0.434188 | 0.64325 | 0.6815 | 0.553333 | 0.569846 | 0.75 | 0.456387 | 0.588395 | 0.448964 |
stage1-step-2760000 | 2,760,000 | 0.832323 | 0.429115 | 0.452017 | 0.458362 | 0.439655 | 0.431672 | 0.407745 | 0.542378 | 0.598615 | 0.444444 | 0.64075 | 0.72325 | 0.56 | 0.561175 | 0.74 | 0.454329 | 0.602173 | 0.470429 |
stage1-step-2800000 | 2,800,000 | 0.785055 | 0.492661 | 0.429764 | 0.473758 | 0.439655 | 0.664052 | 0.368565 | 0.593217 | 0.552384 | 0.409402 | 0.6805 | 0.68375 | 0.588889 | 0.574663 | 0.77 | 0.451728 | 0.558898 | 0.464575 |
stage1-step-2840000 | 2,840,000 | 0.51668 | 0.513608 | 0.342142 | 0.412876 | 0.452754 | 0.828939 | 0.331663 | 0.287134 | 0.668428 | 0.423932 | 0.68175 | 0.649 | 0.482222 | 0.568882 | 0.78 | 0.451186 | 0.586454 | 0.384719 |
stage1-step-2880000 | 2,880,000 | 0.835831 | 0.486358 | 0.410292 | 0.494052 | 0.439655 | 0.609525 | 0.366743 | 0.613353 | 0.517379 | 0.426496 | 0.65475 | 0.72425 | 0.597778 | 0.55973 | 0.75 | 0.452703 | 0.482243 | 0.450916 |
stage1-step-2920000 | 2,920,000 | 0.778151 | 0.308156 | 0.453408 | 0.46606 | 0.439655 | 0.653728 | 0.329385 | 0.598088 | 0.572093 | 0.413675 | 0.63675 | 0.70775 | 0.542222 | 0.568401 | 0.78 | 0.451186 | 0.578692 | 0.46465 |
stage1-step-2960000 | 2,960,000 | 0.333333 | 0.321586 | 0.433936 | 0.482855 | 0.177215 | 0.349013 | 0.376765 | 0.229864 | 0.310431 | 0.420513 | 0.622 | 0.6435 | 0.546667 | 0.570809 | 0.77 | 0.451186 | 0.574423 | 0.440784 |
stage1-step-3000000 | 3,000,000 | 0.664962 | 0.423549 | 0.37274 | 0.363191 | 0.439655 | 0.708952 | 0.336219 | 0.603873 | 0.458784 | 0.416239 | 0.701 | 0.6995 | 0.52 | 0.554913 | 0.73 | 0.450645 | 0.533088 | 0.435605 |
stage1-step-3040000 | 3,040,000 | 0.807807 | 0.379142 | 0.456189 | 0.477957 | 0.444992 | 0.7364 | 0.369021 | 0.60501 | 0.614295 | 0.423932 | 0.65975 | 0.67325 | 0.564444 | 0.551541 | 0.77 | 0.451728 | 0.585872 | 0.460447 |
stage1-step-3080000 | 3,080,000 | 0.792152 | 0.36324 | 0.460362 | 0.470259 | 0.439655 | 0.722993 | 0.335308 | 0.589192 | 0.538793 | 0.433333 | 0.67925 | 0.699 | 0.562222 | 0.564547 | 0.81 | 0.454762 | 0.54803 | 0.45647 |
stage1-step-3120000 | 3,120,000 | 0.798282 | 0.463013 | 0.431154 | 0.415675 | 0.439655 | 0.687969 | 0.385877 | 0.567825 | 0.558273 | 0.433333 | 0.65225 | 0.6575 | 0.548889 | 0.579961 | 0.77 | 0.451837 | 0.577722 | 0.459997 |
stage1-step-3160000 | 3,160,000 | 0.67055 | 0.322891 | 0.442281 | 0.46746 | 0.439655 | 0.688856 | 0.377677 | 0.516772 | 0.473278 | 0.417949 | 0.636 | 0.71575 | 0.596667 | 0.567437 | 0.78 | 0.451837 | 0.604696 | 0.463825 |
stage1-step-3200000 | 3,200,000 | 0.735512 | 0.26032 | 0.400556 | 0.398181 | 0.439655 | 0.706103 | 0.348519 | 0.583086 | 0.560848 | 0.42735 | 0.6415 | 0.71225 | 0.58 | 0.573218 | 0.77 | 0.454112 | 0.554822 | 0.436205 |
stage1-step-3240000 | 3,240,000 | 0.639321 | 0.382544 | 0.438108 | 0.490553 | 0.439655 | 0.704399 | 0.387699 | 0.477485 | 0.655672 | 0.445299 | 0.65675 | 0.6875 | 0.557778 | 0.566474 | 0.78 | 0.460288 | 0.578498 | 0.451741 |
stage1-step-3280000 | 3,280,000 | 0.795936 | 0.295507 | 0.394993 | 0.407978 | 0.451593 | 0.744169 | 0.323007 | 0.522834 | 0.58879 | 0.437607 | 0.603 | 0.701 | 0.593333 | 0.570809 | 0.74 | 0.456713 | 0.557151 | 0.419544 |
stage1-step-3320000 | 3,320,000 | 0.7976 | 0.411679 | 0.471488 | 0.491253 | 0.439655 | 0.727134 | 0.382688 | 0.591406 | 0.663944 | 0.437607 | 0.60275 | 0.68975 | 0.553333 | 0.574663 | 0.74 | 0.455412 | 0.581603 | 0.453392 |
stage1-step-3360000 | 3,360,000 | 0.829146 | 0.477368 | 0.411683 | 0.483555 | 0.439655 | 0.586308 | 0.37221 | 0.619432 | 0.60536 | 0.442735 | 0.63525 | 0.62825 | 0.565556 | 0.564547 | 0.73 | 0.456171 | 0.586454 | 0.454368 |
stage1-step-3400000 | 3,400,000 | 0.790514 | 0.407029 | 0.44089 | 0.486354 | 0.489924 | 0.647213 | 0.378588 | 0.595285 | 0.542772 | 0.444444 | 0.5905 | 0.70975 | 0.617778 | 0.569846 | 0.75 | 0.457146 | 0.569571 | 0.463149 |
stage1-step-3440000 | 3,440,000 | 0.726279 | 0.327836 | 0.464534 | 0.494752 | 0.439655 | 0.77285 | 0.37631 | 0.275378 | 0.61444 | 0.415385 | 0.554 | 0.68375 | 0.603333 | 0.560212 | 0.76 | 0.455304 | 0.594605 | 0.4645 |
stage2-step-3480000 | 3,480,000 | 0.832353 | 0.435741 | 0.433936 | 0.449965 | 0.544649 | 0.637862 | 0.368109 | 0.626592 | 0.620942 | 0.448718 | 0.67525 | 0.69825 | 0.621111 | 0.570809 | 0.72 | 0.45357 | 0.572676 | 0.479586 |
stage2-step-3520000 | 3,520,000 | 0.857119 | 0.484473 | 0.464534 | 0.53464 | 0.439655 | 0.649823 | 0.397722 | 0.590834 | 0.487034 | 0.450427 | 0.656 | 0.66625 | 0.66 | 0.56262 | 0.74 | 0.454112 | 0.585678 | 0.485965 |
stage2-step-3560000 | 3,560,000 | 0.758367 | 0.433523 | 0.445063 | 0.492652 | 0.51011 | 0.782601 | 0.381321 | 0.532263 | 0.666681 | 0.442735 | 0.5315 | 0.65575 | 0.642222 | 0.561657 | 0.73 | 0.457904 | 0.559868 | 0.467502 |
stage2-step-3600000 | 3,600,000 | 0.66191 | 0.406371 | 0.460362 | 0.46606 | 0.514019 | 0.807912 | 0.358087 | 0.366006 | 0.602822 | 0.453846 | 0.59425 | 0.69225 | 0.661111 | 0.571773 | 0.76 | 0.457471 | 0.547448 | 0.481837 |
stage2-step-3640000 | 3,640,000 | 0.81278 | 0.407889 | 0.432545 | 0.495451 | 0.453335 | 0.862057 | 0.381777 | 0.413495 | 0.503475 | 0.437607 | 0.6785 | 0.69925 | 0.641111 | 0.571291 | 0.78 | 0.448586 | 0.598874 | 0.480636 |
stage2-step-3680000 | 3,680,000 | 0.769723 | 0.440855 | 0.446453 | 0.496151 | 0.439655 | 0.746456 | 0.354442 | 0.531129 | 0.600537 | 0.446154 | 0.68525 | 0.6955 | 0.652222 | 0.56262 | 0.73 | 0.456279 | 0.582185 | 0.494896 |
stage2-step-3720000 | 3,720,000 | 0.674795 | 0.42891 | 0.445063 | 0.498251 | 0.515351 | 0.817538 | 0.377221 | 0.536999 | 0.587123 | 0.432479 | 0.6225 | 0.66325 | 0.638889 | 0.569846 | 0.76 | 0.452053 | 0.529983 | 0.471405 |
stage2-step-3760000 | 3,760,000 | 0.859899 | 0.523712 | 0.44089 | 0.514346 | 0.439655 | 0.652357 | 0.365376 | 0.603967 | 0.515779 | 0.439316 | 0.60975 | 0.6745 | 0.627778 | 0.55973 | 0.8 | 0.456171 | 0.576751 | 0.487992 |
stage2-step-3800000 | 3,800,000 | 0.758184 | 0.375586 | 0.47427 | 0.510147 | 0.439655 | 0.811936 | 0.377677 | 0.392965 | 0.461459 | 0.423932 | 0.64075 | 0.707 | 0.666667 | 0.567437 | 0.73 | 0.452162 | 0.579662 | 0.468028 |
stage2-step-3840000 | 3,840,000 | 0.675416 | 0.453705 | 0.449235 | 0.524843 | 0.452754 | 0.747541 | 0.387244 | 0.550664 | 0.579602 | 0.45812 | 0.622 | 0.7025 | 0.673333 | 0.569846 | 0.8 | 0.456713 | 0.590918 | 0.491069 |
stage2-step-3880000 | 3,880,000 | 0.733882 | 0.458068 | 0.445063 | 0.46676 | 0.439655 | 0.81775 | 0.362187 | 0.508454 | 0.589692 | 0.447009 | 0.608 | 0.693 | 0.664444 | 0.572736 | 0.79 | 0.458121 | 0.582767 | 0.487016 |
stage2-step-3920000 | 3,920,000 | 0.720727 | 0.468473 | 0.453408 | 0.515045 | 0.460655 | 0.835628 | 0.367198 | 0.310558 | 0.593927 | 0.446154 | 0.6405 | 0.67225 | 0.67 | 0.566474 | 0.75 | 0.453137 | 0.53464 | 0.490919 |
stage2-step-3960000 | 3,960,000 | 0.784772 | 0.326961 | 0.475661 | 0.524843 | 0.439655 | 0.716114 | 0.384966 | 0.469384 | 0.599637 | 0.442735 | 0.63025 | 0.698 | 0.657778 | 0.567437 | 0.76 | 0.454437 | 0.558704 | 0.487016 |
stage2-step-4000000 | 4,000,000 | 0.806564 | 0.466738 | 0.468707 | 0.502449 | 0.439655 | 0.777833 | 0.379043 | 0.423805 | 0.583028 | 0.449573 | 0.58675 | 0.6825 | 0.631111 | 0.553468 | 0.73 | 0.455954 | 0.568213 | 0.502702 |
Evaluation Logs on Portuguese Benchmarks for OLMo-2 and SmolLM3
These logs contain benchmark results across a suite of Portuguese-language tasks. The data records the performance of three different models at checkpoints saved throughout their pretraining runs:
Splits
Each split (smollm3_3b, olmo2_1b, olmo2_7b) contains one row per model checkpoint and one column per benchmark score (e.g., ASSIN2 RTE, ENEM, BLUEX, OAB).
Data Format
The Checkpoint column indicates the branch from which the checkpoint was taken (e.g., stage1-step-40000), and Step indicates the training step at which the checkpoint was saved. The remaining columns report the model's scores on the individual benchmarks. A row looks like this:
{
  "Checkpoint": "stage1-step-40000",
  "Step": 40000,
  "ASSIN2 RTE": 0.5369919728357186,
  "ASSIN2 STS": 0.1356046511596823,
  "BLUEX": 0.2002781641168289,
  "ENEM": 0.1980405878236529,
  "FAQUAD NLI": 0.5373977569778228,
  "HateBR": 0.5875147596226018,
  "OAB": 0.2323462414578587,
  "PT Hate Speech": 0.5555178900597524,
  "TweetSentBR": 0.3303917299738176,
  "ARC Challenge": 0.288034188034188,
  "ASSIN2 ENT": 0.58075,
  "ASSIN2 PAR": 0.6365,
  "BELEBELE": 0.23,
  "CALAME": 0.5130057803468208,
  "Global PIQA": 0.65,
  "HellaSwag": 0.3650449669519991,
  "LAMBADA": 0.4723462060935377,
  "MMLU": 0.2284599219453617
}
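Checkpoint names follow a stage{N}-step-{N} pattern, so the training stage and step can be recovered from the name alone. A small helper sketch (hypothetical, not part of the dataset) illustrates this:

```python
import re

def parse_checkpoint(name: str) -> tuple[int, int]:
    """Split a checkpoint name like 'stage1-step-40000' into (stage, step)."""
    m = re.fullmatch(r"stage(\d+)-step-(\d+)", name)
    if m is None:
        raise ValueError(f"Unexpected checkpoint name: {name!r}")
    return int(m.group(1)), int(m.group(2))

stage, step = parse_checkpoint("stage1-step-40000")  # -> (1, 40000)
```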
Roughly 100 billion tokens separate consecutive checkpoints. However, due to differences in checkpoint-saving frequency and batch size, the actual token counts between checkpoints may vary slightly.
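Under that nominal spacing (about 100B tokens per 40,000-step interval in the split shown above, i.e. roughly 2.5M tokens per step), cumulative token counts can be estimated from the step alone. A sketch, under those assumptions:

```python
TOKENS_PER_CHECKPOINT = 100e9  # ~100B tokens between checkpoints (stated above)
STEPS_PER_CHECKPOINT = 40_000  # checkpoint spacing in the split shown above

def approx_tokens(step: int) -> float:
    """Rough cumulative token count at a given training step, assuming uniform spacing."""
    return step * TOKENS_PER_CHECKPOINT / STEPS_PER_CHECKPOINT

# ~2.5M tokens per step; ~1e13 (10T) tokens by step 4,000,000.
print(f"{approx_tokens(4_000_000):.1e}")
```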
How to Use
from datasets import load_dataset
# Loads the SmolLM3 split
ds = load_dataset("Polygl0t/portuguese-eval-logs-olmo2-smollm3", split="smollm3_3b")
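Once loaded, a split converts cleanly to a pandas DataFrame via ds.to_pandas() for analysis. A minimal sketch, with a few rows hard-coded from the table above so it runs offline:

```python
import pandas as pd

# In practice: df = load_dataset(...).to_pandas()
# A few rows copied from the table above, hard-coded for illustration:
df = pd.DataFrame(
    {
        "Checkpoint": ["stage1-step-40000", "stage1-step-80000", "stage2-step-4000000"],
        "Step": [40_000, 80_000, 4_000_000],
        "ENEM": [0.198041, 0.193842, 0.502449],
    }
)

# Best ENEM score and the checkpoint that achieved it.
best = df.loc[df["ENEM"].idxmax()]
print(best["Checkpoint"], best["ENEM"])
```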
Benchmarks
The benchmarks are sourced from lm-evaluation-harness (branch: polyglot_harness_portuguese) and lm-evaluation-harness-pt, which provide a standardized set of Portuguese-language tasks for LLM evaluation.
The following benchmarks are included in this log:
- ENEM: Brazilian high-school exam, Q&A format.
- BLUEX: University entrance exam questions (Unicamp/Fuvest), Q&A format.
- OAB Exams: Brazilian Bar Association exam questions, Q&A format.
- ASSIN2 RTE: Textual entailment / natural language inference.
- ASSIN2 STS: Semantic textual similarity.
- FAQUAD NLI: Entailment task based on Portuguese reading comprehension.
- HateBR: Abusive language detection in Brazilian Portuguese social media.
- PT Hate Speech: Hate speech detection in Portuguese tweets.
- TweetSentBR: Sentiment analysis on Brazilian Portuguese tweets.
- ARC Challenge: Multiple-choice grade-school science questions (Portuguese translation).
- ASSIN2 ENT: Textual entailment (natural language inference), evaluated non-generatively.
- ASSIN2 PAR: Paraphrase detection from the ASSIN2 dataset.
- BELEBELE: Multilingual reading comprehension (Portuguese subset).
- CALAME: Predict the last word of a passage (similar to LAMBADA), Portuguese-native version.
- Global PIQA: Physical commonsense reasoning (Brazilian Portuguese subset).
- HellaSwag: Commonsense inference (Portuguese translation).
- LAMBADA: Predict the last word of a passage (Portuguese translation).
- MMLU: Multitask language understanding (Portuguese translation).
Usage and Purpose
- Benchmark Analysis: Track how model performance evolves during pretraining.
- Evaluation Research: Assess the reliability and signal quality of different benchmarks.
- Model Comparison: Compare Portuguese language understanding across different LLMs and training regimes.
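As a sketch of the second use case, a plain-Python Pearson correlation between training step and score (computed here on a handful of MMLU values copied from the table above) gives a rough measure of how steadily a benchmark improves over training:

```python
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Steps and MMLU scores copied from a few rows of the table above.
steps = [40_000, 1_000_000, 2_000_000, 3_000_000, 4_000_000]
mmlu = [0.22846, 0.411363, 0.439282, 0.435605, 0.502702]
print(round(pearson(steps, mmlu), 3))  # strongly positive: MMLU trends up with training
```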
For example, SmolLM3-3B's performance on the ENEM benchmark can be plotted across checkpoints; this and other plots can be found in the .plots directory.
Citation Information
@misc{correa2026tucano2cool,
title={{Tucano 2 Cool: Better Open Source LLMs for Portuguese}},
author={Nicholas Kluge Corr{\^e}a and Aniket Sen and Shiza Fatimah and Sophia Falk and Lennard Landgraf and Julia Kastner and Lucie Flek},
year={2026},
eprint={2603.03543},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2603.03543},
}
Acknowledgments
Polyglot is a project funded by the Federal Ministry of Education and Research (BMBF) and the Ministry of Culture and Science of the State of North Rhine-Westphalia (MWK) as part of TRA Sustainable Futures (University of Bonn) and the Excellence Strategy of the federal and state governments.
We also gratefully acknowledge access to the Marvin cluster hosted by the University of Bonn, along with the support provided by its High Performance Computing & Analytics Lab.
License
All data in this dataset is licensed under the Apache License 2.0.