Model Card for coref-llm-CRAC26-multilingual

This model is a fine-tuned version of google/gemma-3-27b-it for multilingual coreference resolution, trained as a PEFT adapter using TRL.
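A minimal loading sketch, assuming the standard `transformers` + `peft` pattern for attaching an adapter to its base model (generation settings are omitted, and the exact prompt format used for coreference is not documented on this card):

```python
BASE_MODEL = "google/gemma-3-27b-it"  # base model named in this card
ADAPTER_ID = "lattice-nlp/coref-llm-CRAC26-multilingual"  # this repository

def load_model():
    """Load the base model and attach the fine-tuned PEFT adapter.

    Imports are kept local so the sketch can be read without the
    libraries installed; loading the 27B base requires substantial
    GPU memory.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
    model = AutoModelForCausalLM.from_pretrained(BASE_MODEL, device_map="auto")
    model = PeftModel.from_pretrained(model, ADAPTER_ID)
    return tokenizer, model
```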

CRAC 2026 Shared Task: Test Set Results

| Dataset | F1 score |
|---|---|
| cs_pcedt | 72.79 |
| hu_korkor | 60.54 |
| en_gum | 73.88 |
| la_coreflat | 33.56 |
| cs_pdt | 76.31 |
| no_bokmaalnarc | 81.19 |
| cs_pdtsc | 70.89 |
| fr_democrat | 56.76 |
| ca_ancora | 78.00 |
| pl_pcc | 78.35 |
| cu_proiel | 61.32 |
| nl_openboek | 77.80 |
| hi_hdtb | 73.74 |
| hu_szegedkoref | 62.81 |
| ko_ecmt | 68.10 |
| hbo_ptnk | 75.65 |
| lt_lcc | 60.45 |
| tr_itcc | 68.39 |
| grc_proiel | 74.11 |
| fr_ancor | 77.33 |
| en_litbank | 81.56 |
| no_nynorsknarc | 79.35 |
| es_ancora | 78.30 |
| fr_litbankfr | 73.34 |
| de_potsdamcc | 70.87 |
| en_fantasycoref | 78.47 |
| ru_rucor | 81.99 |
| **Average** | **71.33** |
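The reported average is the unweighted (macro) mean over the 27 datasets; a quick sanity check:

```python
# Per-dataset F1 scores from the table above, in the same order.
scores = [72.79, 60.54, 73.88, 33.56, 76.31, 81.19, 70.89, 56.76, 78.00,
          78.35, 61.32, 77.80, 73.74, 62.81, 68.10, 75.65, 60.45, 68.39,
          74.11, 77.33, 81.56, 79.35, 78.30, 73.34, 70.87, 78.47, 81.99]

# Unweighted macro-average, rounded to two decimals as reported.
average = round(sum(scores) / len(scores), 2)
print(average)  # 71.33
```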

Evaluation on the official CRAC 2026 shared task test set.

Training procedure

This model was trained with supervised fine-tuning (SFT).
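A minimal sketch of an SFT run with TRL's `SFTTrainer`, under the assumption of a PEFT/LoRA setup (the dataset file and all hyperparameters below are illustrative placeholders, not the actual shared-task configuration):

```python
def train():
    # Illustrative only: the dataset path and hyperparameters are
    # assumptions, not the configuration used for this model.
    from datasets import load_dataset
    from peft import LoraConfig
    from trl import SFTConfig, SFTTrainer

    # A JSONL file of SFT examples (hypothetical name).
    dataset = load_dataset("json", data_files="corefud_sft.jsonl", split="train")

    trainer = SFTTrainer(
        model="google/gemma-3-27b-it",
        train_dataset=dataset,
        args=SFTConfig(
            output_dir="coref-llm-CRAC26-multilingual",
            per_device_train_batch_size=1,
            gradient_accumulation_steps=8,
        ),
        peft_config=LoraConfig(task_type="CAUSAL_LM"),
    )
    trainer.train()
```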

Framework versions

  • PEFT: 0.18.1
  • TRL: 0.29.0
  • Transformers: 5.2.0
  • Pytorch: 2.10.0+cu126
  • Datasets: 4.6.1
  • Tokenizers: 0.22.2