---
license: apache-2.0
base_model:
- meta-llama/Llama-3.1-8B-Instruct
language:
- zh
- en
---
# CSLM-base

## 💡 Model Description

`CSLM-base` is a base cross-lingual speech language model checkpoint from the ACL 2026 Findings project *Efficient Training for Cross-lingual Speech Language Models*.
It is designed for speech-to-speech and speech-conditioned generation workflows in which discrete speech units serve as model input.

Paper: https://arxiv.org/abs/2604.11096
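As a rough illustration of the "discrete speech units as input" workflow, the sketch below renders a sequence of unit IDs as unit tokens and shows (in comments) how such a prompt could be passed to the checkpoint via `transformers`. The `<unit_N>` token format and the repo id are assumptions for illustration only; consult the project's code for the actual tokenization.

```python
def units_to_prompt(unit_ids):
    """Render discrete speech-unit IDs as a token string.

    The <unit_N> format is a hypothetical placeholder, not the
    model's confirmed vocabulary.
    """
    return "".join(f"<unit_{u}>" for u in unit_ids)


prompt = units_to_prompt([12, 7, 303])  # "<unit_12><unit_7><unit_303>"

# Loading and generation (requires downloading the checkpoint; repo id assumed):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("CSLM-base")
# model = AutoModelForCausalLM.from_pretrained("CSLM-base")
# inputs = tokenizer(prompt, return_tensors="pt")
# out = model.generate(**inputs, max_new_tokens=64)
```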

## 📖 Citation

If you use this model, please cite:

```bibtex
@misc{zhou2026efficienttrainingcrosslingualspeech,
  title={Efficient Training for Cross-lingual Speech Language Models},
  author={Yan Zhou and Qingkai Fang and Yun Hong and Yang Feng},
  year={2026},
  eprint={2604.11096},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2604.11096},
}
```

## ✉️ Contact

For questions, contact: `zhouyan23z@ict.ac.cn`