This model is a cross-encoder based on jhu-clsp/ettin-encoder-150m. It was trained on MS MARCO with the MarginMSE loss as part of a reproducibility paper on training cross-encoders, "Reproducing and Comparing Distillation Techniques for Cross-Encoders"; see the paper for more details.
This model is intended for re-ranking the top results returned by a first-stage retrieval system (such as BM25, a bi-encoder, or SPLADE).
Training can be easily reproduced using the associated repository. The exact training configuration used for this model is also detailed in config.yaml.
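For reference, MarginMSE distills the margin between a teacher's scores for a positive and a negative passage into the student cross-encoder. Below is a minimal sketch of this objective in PyTorch; the function name and tensor shapes are illustrative, and the repository contains the exact implementation used for this model.

```python
import torch
import torch.nn.functional as F

def margin_mse_loss(student_pos: torch.Tensor, student_neg: torch.Tensor,
                    teacher_pos: torch.Tensor, teacher_neg: torch.Tensor) -> torch.Tensor:
    # The student is trained so that its score margin (positive minus negative)
    # matches the teacher's margin for the same (query, passage) pairs.
    return F.mse_loss(student_pos - student_neg, teacher_pos - teacher_neg)
```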
Quick Start:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("xpmir/cross-encoder-ettin-150m-MarginMSE")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/cross-encoder-ettin-150m-MarginMSE")
model.eval()

# Score a (query, passage) pair: higher means more relevant
features = tokenizer(
    "What is experimaestro ?",
    "Experimaestro is a powerful framework for ML experiments management...",
    padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    scores = model(**features).logits
print(scores)
```
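In a re-ranking setup, the cross-encoder scores each (query, passage) candidate returned by the first-stage retriever, and candidates are sorted by score. A minimal sketch reusing the model and tokenizer above (the query and passages are illustrative):

```python
query = "What is experimaestro ?"
passages = [
    "Experimaestro is a powerful framework for ML experiments management...",
    "MS MARCO is a large-scale passage ranking dataset.",
]

# Score all candidates in one batch
features = tokenizer(
    [query] * len(passages), passages,
    padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    scores = model(**features).logits.squeeze(-1)

# Sort passages from most to least relevant according to the cross-encoder
ranking = sorted(zip(passages, scores.tolist()), key=lambda x: x[1], reverse=True)
for passage, score in ranking:
    print(f"{score:.2f}\t{passage}")
```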
We provide evaluations of this cross-encoder when re-ranking the top 1000 documents retrieved by naver/splade-v3-distilbert.
| dataset | RR@10 | nDCG@10 |
|---|---|---|
| msmarco_dev | 40.64 | 47.28 |
| trec2019 | 95.23 | 73.77 |
| trec2020 | 94.75 | 73.89 |
| fever | 83.48 | 82.97 |
| arguana | 24.04 | 35.30 |
| climate_fever | 36.86 | 27.26 |
| dbpedia | 78.60 | 48.26 |
| fiqa | 48.22 | 40.32 |
| hotpotqa | 90.50 | 74.92 |
| nfcorpus | 56.81 | 35.59 |
| nq | 55.57 | 60.48 |
| quora | 80.97 | 82.83 |
| scidocs | 29.38 | 16.70 |
| scifact | 70.00 | 73.05 |
| touche | 59.89 | 34.80 |
| trec_covid | 93.83 | 75.79 |
| robust04 | 70.15 | 48.62 |
| lotte_writing | 74.20 | 64.82 |
| lotte_recreation | 62.97 | 57.69 |
| lotte_science | 49.32 | 41.38 |
| lotte_technology | 57.70 | 49.28 |
| lotte_lifestyle | 75.63 | 65.67 |
| Mean In Domain | 76.87 | 64.98 |
| BEIR 13 | 62.17 | 52.94 |
| LoTTE (OOD) | 65.00 | 54.58 |