Add library_name metadata #1
opened by nielsr (HF Staff)

README.md CHANGED
@@ -1,13 +1,14 @@
 ---
-license: apache-2.0
 language:
 - en
 - zh
+license: apache-2.0
 pipeline_tag: text-generation
+library_name: transformers
 ---
 
 # DECO-0.5B
-This is the 0.5B DECO checkpoint introduced by the paper
+This is the 0.5B DECO checkpoint introduced by the paper [DECO: Sparse Mixture-of-Experts with Dense-Comparable Performance on End-Side Devices](https://huggingface.co/papers/2605.10933). DECO is an improved version of our previous [BlockFFN](https://arxiv.org/pdf/2507.08771) architecture, with dense-comparable performance given the same budget of total parameters.
 
 Links: [[Paper](https://arxiv.org/pdf/2605.10933)] [[Code](https://github.com/thunlp/DECO)]
 
@@ -34,7 +35,7 @@ print(tokenizer.decode(output[0], skip_special_tokens=True))
 
 ### Citation
 If you find our work useful for your research, please kindly cite our paper as follows:
-```
+```bibtex
 @article{song2026deco,
   title={{DECO}: Sparse Mixture-of-Experts with Dense-Comparable Performance on End-Side Devices},
   author={Chenyang Song, Weilin Zhao, Xu Han, Chaojun Xiao, Yingfa Chen, Zhiyuan Liu},
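The substantive change here is the YAML front matter of the model card, in particular the added `library_name: transformers` key. As a quick sanity check, here is a minimal stdlib-only sketch (the helper name and the flat-key parsing are my own assumptions, not part of this PR or any Hub API) that reads a card's front matter and confirms the field is present after the change:

```python
# Hypothetical helper: parse the leading `---` YAML block of a model card
# and check that `library_name` (the field this PR adds) is declared.
# Only flat `key: value` pairs are handled; list items (`- en`) are skipped.

README = """---
language:
- en
- zh
license: apache-2.0
pipeline_tag: text-generation
library_name: transformers
---

# DECO-0.5B
"""

def parse_front_matter(text: str) -> dict:
    """Return the flat key: value pairs of the leading --- block."""
    lines = text.splitlines()
    if not lines or lines[0] != "---":
        return {}
    meta = {}
    for line in lines[1:]:
        if line == "---":          # closing delimiter of the front matter
            break
        if ":" in line and not line.startswith("- "):
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

meta = parse_front_matter(README)
print(meta["library_name"])  # -> transformers
```

With the field set, the Hub can associate the checkpoint with the `transformers` library (e.g. for the auto-generated "Use this model" snippet), which is the motivation for this metadata-only PR.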