
CSLM-base

💡 Model Description

CSLM-base is the base checkpoint of the cross-lingual speech language model from the ACL 2026 Findings paper Efficient Training for Cross-lingual Speech Language Models.
It is designed for speech-to-speech and speech-conditioned generation workflows in which discrete speech units serve as model input.

Paper: https://arxiv.org/abs/2604.11096
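As a rough illustration of the unit-based input described above, the sketch below shows how discrete speech units (e.g. from a k-means quantizer over self-supervised speech features) might be deduplicated and rendered as unit tokens before being fed to the model. The `<unit_k>` token format, the deduplication step, and the use of `AutoModelForCausalLM` are assumptions for illustration only; consult the project repository for the actual tokenizer and unit vocabulary.

```python
def deduplicate(units):
    """Collapse consecutive repeated unit IDs, a common preprocessing
    step for unit-based speech language models (assumed here)."""
    out = []
    for u in units:
        if not out or out[-1] != u:
            out.append(u)
    return out


def units_to_tokens(units):
    """Map discrete speech-unit IDs to placeholder unit-token strings.
    The "<unit_k>" format is a hypothetical convention, not confirmed
    by the model card."""
    return [f"<unit_{u}>" for u in units]


if __name__ == "__main__":
    raw_units = [12, 12, 12, 87, 87, 5, 5, 5, 5, 301]
    units = deduplicate(raw_units)
    print(units_to_tokens(units))  # ['<unit_12>', '<unit_87>', '<unit_5>', '<unit_301>']

    # Loading the checkpoint itself would look roughly like this
    # (requires network access and acceptance of the repo conditions):
    # from transformers import AutoModelForCausalLM
    # model = AutoModelForCausalLM.from_pretrained("ICTNLP/CSLM-base")
```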

📖 Citation

If you use this model, please cite:

@misc{zhou2026efficienttrainingcrosslingualspeech,
      title={Efficient Training for Cross-lingual Speech Language Models}, 
      author={Yan Zhou and Qingkai Fang and Yun Hong and Yang Feng},
      year={2026},
      eprint={2604.11096},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2604.11096}, 
}

βœ‰οΈ Contact

For questions, contact: zhouyan23z@ict.ac.cn
