Evaluation results for `mistralai/Mistral-7B-v0.1` pruned with the `sculpt_structural` method at three keep fractions (`keep_frac`), alongside the unpruned baseline. All rows share the same timestamp (2026-04-06 19:33:28) and base model.

| model_id | tier | keep_frac | method | timestamp | base_model |
|---|---|---|---|---|---|
| mistralai/Mistral-7B-v0.1 | baseline | 1.00 | none | 2026-04-06T19:33:28 | null |
| dystrio/Mistral-7B-v0.1-Sculpt-Default | default | 0.95 | sculpt_structural | 2026-04-06T19:33:28 | mistralai/Mistral-7B-v0.1 |
| dystrio/Mistral-7B-v0.1-Sculpt-Production | production | 0.78 | sculpt_structural | 2026-04-06T19:33:28 | mistralai/Mistral-7B-v0.1 |
| dystrio/Mistral-7B-v0.1-Sculpt-Throughput | throughput | 0.48 | sculpt_structural | 2026-04-06T19:33:28 | mistralai/Mistral-7B-v0.1 |

Benchmark scores per tier:

| benchmark | baseline (1.00) | default (0.95) | production (0.78) | throughput (0.48) |
|---|---|---|---|---|
| arc_challenge | 54.18 | 45.22 | 35.41 | 27.39 |
| arc_easy | 79.50 | 73.06 | 59.64 | 46.04 |
| boolq | 84.22 | 65.50 | 59.48 | 68.93 |
| gsm8k | 37.91 | 23.73 | 3.49 | 1.36 |
| hellaswag | 81.23 | 77.78 | 63.10 | 46.88 |
| mmlu | 60.00 | 52.23 | 39.32 | 29.77 |
| openbookqa | 46.00 | 43.40 | 34.80 | 30.20 |
| truthfulqa_mc2 | 42.11 | 40.88 | 44.36 | 42.18 |
| winogrande | 74.98 | 72.06 | 65.75 | 57.77 |
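A quick way to compare tiers is to average the nine benchmark scores per row. The sketch below is not part of the dataset; it simply hardcodes the scores from the table above and reports each tier's mean score and the fraction of the baseline average it retains.

```python
# Summarize the evaluation table: mean benchmark score per tier and
# the percentage of the baseline's average score each pruned tier retains.
from statistics import mean

# Scores copied verbatim from the table above (nine benchmarks per tier).
scores = {
    "baseline": [54.18, 79.50, 84.22, 37.91, 81.23, 60.00, 46.00, 42.11, 74.98],
    "default": [45.22, 73.06, 65.50, 23.73, 77.78, 52.23, 43.40, 40.88, 72.06],
    "production": [35.41, 59.64, 59.48, 3.49, 63.10, 39.32, 34.80, 44.36, 65.75],
    "throughput": [27.39, 46.04, 68.93, 1.36, 46.88, 29.77, 30.20, 42.18, 57.77],
}

baseline_avg = mean(scores["baseline"])
for tier, vals in scores.items():
    avg = mean(vals)
    retained = 100 * avg / baseline_avg  # percent of baseline average retained
    print(f"{tier:<10} mean={avg:.2f}  retained={retained:.1f}%")
```

Note that the averages mask per-task behavior: gsm8k collapses almost entirely at lower keep fractions while truthfulqa_mc2 stays roughly flat.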