

CSLM-SFT

πŸ’‘ Model Description

CSLM-SFT is a supervised fine-tuned checkpoint of CSLM from the ACL 2026 Findings paper *Efficient Training for Cross-lingual Speech Language Models*.

It is built on top of the CSLM-base model and further tuned on instruction-style data for speech-to-speech conversation, covering both monolingual and cross-lingual scenarios.

Paper: https://arxiv.org/abs/2604.11096

πŸ“– Citation

If you use this model, please cite:

@misc{zhou2026efficienttrainingcrosslingualspeech,
      title={Efficient Training for Cross-lingual Speech Language Models}, 
      author={Yan Zhou and Qingkai Fang and Yun Hong and Yang Feng},
      year={2026},
      eprint={2604.11096},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2604.11096}, 
}

βœ‰οΈ Contact

For questions, contact: zhouyan23z@ict.ac.cn
