Dataset preview (one row per model evaluation run). Every row shares `schema_version` 1.0 and identical evaluation metadata: `btzsc_version` 0.1.1, commit `8a0d52cbe423`, timestamp 2026-03-01T19:47:17.282165+00:00, device and precision unknown, `batch_size` 32, `max_samples` null. Each model's `url` is `https://huggingface.co/<name>` and its `revision` is unknown. The preview truncates each `results` block after the sentiment task, so only overall metrics and sentiment macro-F1 are shown here.

| Model | Type | Params | Macro-F1 | Accuracy | Macro-P | Macro-R | Sentiment F1 |
|---|---|---|---|---|---|---|---|
| Qwen3-Embedding-0.6B | embedding | 600M | 0.579734 | 0.60678 | 0.630595 | 0.641596 | 0.810295 |
| Qwen3-Embedding-8B | embedding | 8B | 0.591328 | 0.635753 | 0.700355 | 0.669397 | 0.891563 |
| all-MiniLM-L6-v2 | embedding | 22M | 0.365998 | 0.442739 | 0.500873 | 0.450317 | 0.347246 |
| bge-base-en-v1.5 | embedding | 137M | 0.568329 | 0.592612 | 0.641254 | 0.640123 | 0.819812 |
| bge-large-en-v1.5 | embedding | 434M | 0.554848 | 0.589174 | 0.651518 | 0.637449 | 0.839938 |
| e5-base-v2 | embedding | 110M | 0.596597 | 0.621787 | 0.644211 | 0.653966 | 0.833574 |
| e5-large-v2 | embedding | 335M | 0.597409 | 0.616721 | 0.652853 | 0.65146 | 0.855361 |
| e5-mistral-7b-instruct | embedding | 7B | 0.575567 | 0.615418 | 0.689792 | 0.662082 | 0.872693 |
| gte-base-en-v1.5 | embedding | 137M | 0.584158 | 0.607613 | 0.649384 | 0.653511 | 0.825544 |
| gte-large-en-v1.5 | embedding | 434M | 0.617418 | 0.637108 | 0.668635 | 0.680452 | 0.849181 |
| gte-modernbert-base | embedding | 149M | 0.58639 | 0.609029 | 0.655794 | 0.6503 | 0.867424 |
| Llama-3.2-3B-Instruct | llm | 3.2B | 0.430162 | 0.463103 | 0.505333 | 0.477962 | 0.455847 |
| Mistral-Nemo-Instruct-2407 | llm | 12.2B | 0.669733 | 0.712022 | 0.726725 | 0.693765 | 0.841272 |
| Phi-4-mini-instruct | llm | 3.8B | 0.430866 | 0.46549 | 0.560353 | 0.496716 | 0.490137 |
| Qwen3-4B | llm | 4B | 0.648575 | 0.696985 | 0.710996 | 0.692165 | 0.88324 |
| Qwen3-8B | llm | 8.2B | 0.664912 | 0.709153 | 0.699978 | 0.703688 | 0.89867 |
| gemma-3-1b-it | llm | 1B | 0.359127 | 0.400456 | 0.464795 | 0.430821 | 0.519361 |
| gemma-3-270m-it | llm | 270M | 0.279441 | 0.314287 | 0.341366 | 0.340519 | 0.42075 |
| bart-large-mnli | nli | 407M | 0.507865 | 0.532448 | 0.656902 | 0.583081 | 0.839767 |
| bert-base-uncased-nli | nli | 110M | 0.48755 | 0.50482 | 0.560883 | 0.541221 | 0.760962 |
| bert-large-uncased-nli-triplet | nli | 335M | 0.524601 | 0.553299 | 0.580262 | 0.580901 | 0.776186 |
| bert-large-uncased-nli | nli | 335M | 0.533903 | 0.572794 | 0.599148 | 0.576039 | 0.790287 |
| deberta-v3-base-nli | nli | 184M | 0.550391 | 0.581439 | 0.620075 | 0.608686 | 0.858712 |
| deberta-v3-large-nli-triplet | nli | 434M | 0.595813 | 0.619343 | 0.666545 | 0.647509 | 0.898852 |
| deberta-v3-large-nli | nli | 434M | 0.591105 | 0.618846 | 0.674781 | 0.64852 | 0.897813 |
| modernbert-base-nli | nli | 149M | 0.53426 | 0.563121 | 0.607563 | 0.579765 | 0.835215 |
| modernbert-large-nli-triplet | nli | 395M | 0.548722 | 0.581244 | 0.637688 | 0.597926 | 0.876335 |
| modernbert-large-nli | nli | 395M | 0.551713 | 0.592935 | 0.640183 | 0.603994 | 0.857985 |
| nli-roberta-base | nli | 125M | 0.48849 | 0.504002 | 0.579458 | 0.549751 | 0.800026 |
| Qwen3-Reranker-0.6B | reranker | 600M | 0.605598 | 0.642526 | 0.65409 | 0.669456 | 0.80429 |
| Qwen3-Reranker-8B | reranker | 8B | 0.72244 | 0.764832 | 0.763814 | 0.763002 | 0.92305 |
| bge-reranker-base | reranker | 278M | 0.471124 | 0.491962 | 0.52378 | 0.529067 | 0.620593 |
| bge-reranker-large | reranker | 560M | 0.534818 | 0.556759 | 0.610169 | 0.602469 | 0.781011 |
| gte-reranker-modernbert-base | reranker | 149M | 0.57849 | 0.616544 | 0.625188 | 0.61616 | 0.82026 |
| ms-marco-MiniLM-L6-v2 | reranker | 22M | 0.421785 | 0.460398 | 0.486364 | 0.456056 | 0.58518 |
# BTZSC Results
This repository stores model submissions for the BTZSC leaderboard.
BTZSC: A Benchmark for Zero-Shot Text Classification across Cross-Encoders, Embedding Models, Rerankers and LLMs.
- Paper: https://openreview.net/forum?id=IxMryAz2p3
- Eval harness: https://github.com/IliasAarab/btzsc
- Leaderboard Space: https://huggingface.co/spaces/btzsc/btzsc-leaderboard
Benchmark summary:
- 22 English single-label datasets
- 4 task families: sentiment, topic, intent, emotion
- Strict zero-shot protocol (no BTZSC-label training/tuning)
- Primary metric: macro-F1
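The primary metric can be sketched from first principles: per-class F1 averaged with equal weight per class, regardless of class frequency. A minimal implementation for illustration (the harness itself presumably uses a standard library implementation such as scikit-learn's):

```python
def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores over all observed labels."""
    labels = sorted(set(y_true) | set(y_pred))
    f1s = []
    for lab in labels:
        tp = sum(t == lab and p == lab for t, p in zip(y_true, y_pred))
        fp = sum(t != lab and p == lab for t, p in zip(y_true, y_pred))
        fn = sum(t == lab and p != lab for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

# A model that always predicts "pos" gets F1 0.8 on "pos" and 0.0 on "neg",
# so macro-F1 is 0.4 even though accuracy is 2/3.
print(macro_f1(["pos", "neg", "pos"], ["pos", "pos", "pos"]))
```

This is why macro-F1, not accuracy, is the headline number: it penalizes models that ignore minority classes.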
## What this repo contains
- One JSON file per model evaluation run at `results/<model_type>/<model-name>.json`
- Reproducibility metadata (BTZSC version, commit, precision, batch size)
- Full per-dataset metrics for all 22 BTZSC datasets
## Schema

Each submission follows schema version 1.0 with:

- `model`: model id, type, parameter count, revision
- `evaluation`: harness versioning and runtime metadata
- `results.overall`: averaged macro-F1 / accuracy / macro-precision / macro-recall
- `results.by_task`: sentiment/topic/intent/emotion aggregates
- `results.by_dataset`: per-dataset metric blocks
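As an illustration of that shape, a quick local sanity check of a submission dict might look like the sketch below. The `check_submission` helper is hypothetical; the authoritative checks live in the repo's `validate.py`.

```python
# Required pieces of a schema-1.0 submission (illustrative subset).
REQUIRED_TOP = {"schema_version", "model", "evaluation", "results"}
REQUIRED_METRICS = {"macro_f1", "accuracy", "macro_precision", "macro_recall"}

def check_submission(sub):
    """Return a list of problems; an empty list means the shape looks right."""
    problems = []
    missing = REQUIRED_TOP - sub.keys()
    if missing:
        problems.append(f"missing top-level keys: {sorted(missing)}")
    overall = sub.get("results", {}).get("overall", {})
    missing_metrics = REQUIRED_METRICS - overall.keys()
    if missing_metrics:
        problems.append(f"missing overall metrics: {sorted(missing_metrics)}")
    return problems

example = {
    "schema_version": "1.0",
    "model": {"name": "e5-base-v2", "model_type": "embedding", "params": "110M"},
    "evaluation": {"btzsc_version": "0.1.1", "batch_size": 32},
    "results": {
        "overall": {"macro_f1": 0.596597, "accuracy": 0.621787,
                    "macro_precision": 0.644211, "macro_recall": 0.653966},
    },
}
print(check_submission(example))  # → []
```

This catches only missing keys, not value types or the per-task/per-dataset blocks; run `validate.py` for the real check.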
## Contributing results

Destination path format: `results/<model_type>/<model-name>.json`

Recommended flow:

- Export with the official harness (`btzsc evaluate ... --output-json ...`).
- Validate locally (`python validate.py results/<model_type>/<model-name>.json`).
- Add your file at the required path.
- Submit by one of these methods:
  - Web UI upload on Hugging Face (no clone required)
  - Git workflow (direct push if you have write access, otherwise fork + PR)
  - API workflow via `huggingface_hub` with `create_pr=True` (PR-based)

In short: "add" means placing the JSON at the correct path; "submit" means publishing that change to this remote repo.
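The API workflow can be sketched with `huggingface_hub` as follows. The `repo_id` argument is a placeholder for this dataset repo's actual id; `upload_file(..., create_pr=True)` opens a pull request instead of pushing to the main branch.

```python
def destination_path(model_type, model_name):
    """Build the required results/<model_type>/<model-name>.json path."""
    return f"results/{model_type}/{model_name}.json"

def submit(local_json, model_type, model_name, repo_id):
    """Open a PR adding a result file (sketch; requires a logged-in HF token)."""
    from huggingface_hub import HfApi  # imported lazily; not needed for the path helper
    api = HfApi()
    return api.upload_file(
        path_or_fileobj=local_json,
        path_in_repo=destination_path(model_type, model_name),
        repo_id=repo_id,
        repo_type="dataset",
        create_pr=True,  # PR-based submission rather than a direct push
        commit_message=f"Add results for {model_name}",
    )

print(destination_path("embedding", "e5-base-v2"))
# → results/embedding/e5-base-v2.json
```

The Web UI and git workflows end up with the file at the same path; only the transport differs.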
See `SUBMISSION.md` for full requirements and review checks.

PRs adding result files are validated in CI with `validate.py`.