BGE Reranker v2-m3 - Multi-domain RAG Fine-tuned
A fine-tuned version of BAAI/bge-reranker-v2-m3 for multi-domain RAG (retrieval-augmented generation) reranking.
Training Details
- Base model: BAAI/bge-reranker-v2-m3
- Training strategy: pairwise learning with a 1:2 positive-to-negative ratio (see the data-construction sketch after this list)
- Hard negatives: mined with BM25
- Epochs: 3
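The sketch below illustrates how pairwise training examples of this shape could be constructed: each query is paired with one relevant passage and two BM25-mined hard negatives. The corpus, the example query, and the mine_pairs helper are illustrative assumptions, not the exact pipeline used to train this model.

# Sketch: build (query, document, label) pairs with BM25 hard negatives at a 1:2 ratio.
# The corpus and the mine_pairs helper are hypothetical; only the pairwise 1:2 setup
# and BM25 mining come from the training details above.
from rank_bm25 import BM25Okapi

corpus = [
    "Cloud computing is the on-demand delivery of IT resources over the internet.",
    "Photosynthesis converts light energy into chemical energy in plants.",
    "A virtual machine emulates a physical computer in software.",
    "Edge computing processes data close to where it is generated.",
]
bm25 = BM25Okapi([doc.lower().split() for doc in corpus])

def mine_pairs(query, positive_doc, negatives_per_positive=2):
    # Rank the corpus by BM25 score; the top-scoring non-positive passages
    # become hard negatives (label 0), the relevant passage keeps label 1.
    scores = bm25.get_scores(query.lower().split())
    ranked = sorted(range(len(corpus)), key=lambda i: scores[i], reverse=True)
    hard_negatives = [corpus[i] for i in ranked if corpus[i] != positive_doc]
    pairs = [(query, positive_doc, 1)]
    pairs += [(query, neg, 0) for neg in hard_negatives[:negatives_per_positive]]
    return pairs

for q, d, label in mine_pairs("What is cloud computing?", corpus[0]):
    print(label, d)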
Usage
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("pedrovo9/bge-reranker-v2-m3-multirag-finetuned")
model = AutoModelForSequenceClassification.from_pretrained("pedrovo9/bge-reranker-v2-m3-multirag-finetuned")
model.eval()

query = "What is cloud computing?"
document = "Cloud computing is..."

# Encode the query-document pair and map the relevance logit to a [0, 1] score.
inputs = tokenizer(query, document, return_tensors='pt', truncation=True, max_length=512)
with torch.no_grad():
    score = torch.sigmoid(model(**inputs).logits[0, 0]).item()
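In a RAG pipeline the reranker typically scores one query against several retrieved candidates in a single batch and sorts them by score. The snippet below is a minimal sketch of that pattern; it reuses the tokenizer and model loaded above, and the candidate passages are made-up examples.

# Sketch: rerank a list of candidate passages for the query defined above.
candidates = [
    "Cloud computing delivers computing services over the internet on demand.",
    "The Great Barrier Reef is the world's largest coral reef system.",
    "Virtualization is a key enabling technology for cloud platforms.",
]
batch = tokenizer([query] * len(candidates), candidates,
                  padding=True, truncation=True, max_length=512, return_tensors='pt')
with torch.no_grad():
    scores = torch.sigmoid(model(**batch).logits.squeeze(-1)).tolist()
for score, doc in sorted(zip(scores, candidates), reverse=True):
    print(f"{score:.3f}  {doc}")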