# coref-llm
Coreference Resolution with LLM
This model is a fine-tuned version of [google/gemma-3-27b-it](https://huggingface.co/google/gemma-3-27b-it) for multilingual coreference resolution. It was trained using [TRL](https://github.com/huggingface/trl).
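A minimal usage sketch follows. The repo id `your-org/coref-llm` is a placeholder, and the instruction wording is illustrative; the actual prompt format used during fine-tuning is not documented in this card. The sketch assumes the checkpoint keeps the Gemma chat template of the base model.

```python
# Hypothetical usage sketch -- the repo id and prompt wording below are
# placeholders, not the card's documented interface.

def build_coref_prompt(text: str) -> list[dict]:
    """Build a chat-style message list asking the model to resolve coreference.

    The exact instruction used during SFT is not published; this is an
    illustrative stand-in.
    """
    return [
        {
            "role": "user",
            "content": (
                "Resolve the coreference chains in the following text and "
                "mark each mention with its cluster id:\n\n" + text
            ),
        }
    ]

messages = build_coref_prompt("Mary saw John. She waved at him.")

# Loading and generation (requires `transformers` and enough GPU memory
# for a 27B model):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("your-org/coref-llm")
# model = AutoModelForCausalLM.from_pretrained("your-org/coref-llm", device_map="auto")
# inputs = tokenizer.apply_chat_template(
#     messages, add_generation_prompt=True, return_tensors="pt"
# ).to(model.device)
# out = model.generate(inputs, max_new_tokens=512)
# print(tokenizer.decode(out[0], skip_special_tokens=True))
```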
| Dataset | F1 Score |
|---|---|
| cs_pcedt | 72.79 |
| hu_korkor | 60.54 |
| en_gum | 73.88 |
| la_coreflat | 33.56 |
| cs_pdt | 76.31 |
| no_bokmaalnarc | 81.19 |
| cs_pdtsc | 70.89 |
| fr_democrat | 56.76 |
| ca_ancora | 78.00 |
| pl_pcc | 78.35 |
| cu_proiel | 61.32 |
| nl_openboek | 77.80 |
| hi_hdtb | 73.74 |
| hu_szegedkoref | 62.81 |
| ko_ecmt | 68.10 |
| hbo_ptnk | 75.65 |
| lt_lcc | 60.45 |
| tr_itcc | 68.39 |
| grc_proiel | 74.11 |
| fr_ancor | 77.33 |
| en_litbank | 81.56 |
| no_nynorsknarc | 79.35 |
| es_ancora | 78.30 |
| fr_litbankfr | 73.34 |
| de_potsdamcc | 70.87 |
| en_fantasycoref | 78.47 |
| ru_rucor | 81.99 |
| Average | 71.33 |
All scores come from evaluation on the official CRAC 2026 shared task test set.
This model was trained with supervised fine-tuning (SFT).