Add organization card with logo, model + dataset index, citation
Files changed:
- .gitattributes +1 -0
- README.md +82 -5
- logo.gif +3 -0
.gitattributes CHANGED

```diff
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+logo.gif filter=lfs diff=lfs merge=lfs -text
```
README.md CHANGED

```diff
@@ -1,10 +1,87 @@
 ---
-title:
-emoji:
-colorFrom:
-colorTo:
+title: TIDE-dllm
+emoji: 🌊
+colorFrom: blue
+colorTo: indigo
 sdk: static
 pinned: false
 ---
 
-
```

The remaining additions (new lines 10–87) are the card body:
<center>
<img src="logo.gif" width="320" />
</center>

<h1 align="center">Turning the TIDE: Cross-Architecture Distillation for Diffusion Large Language Models</h1>

<p align="center">
🌊 The first cross-architecture distillation framework for diffusion LLMs – distilling 8B dense and 16B MoE teachers into a 0.6B student 🌊
</p>

<p align="center">
<a href="https://arxiv.org/abs/2604.26951">📄 arXiv 2604.26951</a> ·
<a href="https://github.com/PKU-YuanGroup/TIDE">💻 Code (PKU-YuanGroup/TIDE)</a> ·
<a href="https://pku-yuangroup.github.io/TIDE-Page/">🌐 Project page</a>
</p>

---

This organization hosts the **distilled student checkpoints** and **pre-tokenized SFT datasets** released with TIDE. The framework consists of three modular components – **TIDAL** (dual-axis interpolation), **CompDemo** (complementary mask-split teacher inference), and **Reverse CALM** (cross-tokenizer chunk-level matching) – and is evaluated across two heterogeneous distillation pipelines.

## ✨ Highlights

- **+1.53 average gain** over the non-distilled BD3LM baseline across 8 benchmarks (34.20 vs. 32.67).
- **+16.48 on HumanEval** over the equivalent-size AR baseline (48.78 vs. 32.30) – distilled dLLMs especially excel at code generation.
- **22× peak-memory reduction** vs. the 16B MoE LLaDA2 teacher (1.4 GB vs. 31.3 GB) and **5.2× faster inference** (6.25 s vs. 32.55 s for 256 tokens on H100), enabling commodity-hardware deployment (a quick measurement sketch follows below).
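These numbers are easy to sanity-check on your own hardware with standard PyTorch counters. A minimal probe, not the paper's benchmark harness; exact figures will vary with GPU, dtype, and sequence length:

```python
import time

import torch

# Wrap one 256-token generation call to read back wall-clock time and peak VRAM.
torch.cuda.reset_peak_memory_stats()
torch.cuda.synchronize()
start = time.perf_counter()
# ... run one 256-token generation with the loaded model here ...
torch.cuda.synchronize()
elapsed = time.perf_counter() - start
peak_gib = torch.cuda.max_memory_allocated() / 2**30
print(f"latency: {elapsed:.2f} s | peak memory: {peak_gib:.2f} GiB")
```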
## 🤗 Released models

Six 0.6B distilled student checkpoints are released, three per pipeline. Each is initialized from [`dllm-hub/Qwen3-0.6B-diffusion-bd3lm-v0.1`](https://huggingface.co/dllm-hub/Qwen3-0.6B-diffusion-bd3lm-v0.1) and distilled from a larger dLLM teacher.

| Pipeline | Variant | Repo |
|---|---|---|
| A – Cross-Tokenizer (LLaDA2 teacher) | **TIDE-Cross** (native, paper-best) | [distill-LLaDA2-TIDE_Cross](https://huggingface.co/TIDE-dllm/distill-LLaDA2-TIDE_Cross) |
| A – Cross-Tokenizer (LLaDA2 teacher) | TIDE-Shared variant | [distill-LLaDA2-TIDE_Shared](https://huggingface.co/TIDE-dllm/distill-LLaDA2-TIDE_Shared) |
| A – Cross-Tokenizer (LLaDA2 teacher) | CALM baseline | [distill-LLaDA2-CALM](https://huggingface.co/TIDE-dllm/distill-LLaDA2-CALM) |
| B – Shared-Tokenizer (WeDLM teacher) | **TIDE-Shared** (native, paper-best) | [distill-WeDLM-TIDE_Shared](https://huggingface.co/TIDE-dllm/distill-WeDLM-TIDE_Shared) |
| B – Shared-Tokenizer (WeDLM teacher) | TIDE-Cross variant | [distill-WeDLM-TIDE_Cross](https://huggingface.co/TIDE-dllm/distill-WeDLM-TIDE_Cross) |
| B – Shared-Tokenizer (WeDLM teacher) | KL baseline | [distill-WeDLM-KL](https://huggingface.co/TIDE-dllm/distill-WeDLM-KL) |
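For scripted evaluation sweeps it can help to address these repos programmatically. A minimal sketch; the `(pipeline, variant)` key scheme is our own, and only the repo ids come from the table above:

```python
# Hypothetical index of the six student checkpoints, keyed by (pipeline, variant).
CHECKPOINTS = {
    ("A", "TIDE_Cross"): "TIDE-dllm/distill-LLaDA2-TIDE_Cross",
    ("A", "TIDE_Shared"): "TIDE-dllm/distill-LLaDA2-TIDE_Shared",
    ("A", "CALM"): "TIDE-dllm/distill-LLaDA2-CALM",
    ("B", "TIDE_Shared"): "TIDE-dllm/distill-WeDLM-TIDE_Shared",
    ("B", "TIDE_Cross"): "TIDE-dllm/distill-WeDLM-TIDE_Cross",
    ("B", "KL"): "TIDE-dllm/distill-WeDLM-KL",
}

for (pipeline, variant), repo in CHECKPOINTS.items():
    print(f"pipeline {pipeline} / {variant}: https://huggingface.co/{repo}")
```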
## 📚 Released datasets

Pre-tokenized SFT mixtures (`tulu-3-sft-mixture` + `smoltalk` + `opc-sft-stage1` + `opc-sft-stage2`) are prepared for each teacher, so distillation jobs never have to re-tokenize at startup.

| Pipeline | Repo |
|---|---|
| A – for the LLaDA2 teacher | [distill_llada2_sft](https://huggingface.co/datasets/TIDE-dllm/distill_llada2_sft) |
| B – for the WeDLM teacher | [distill_wedlm_sft](https://huggingface.co/datasets/TIDE-dllm/distill_wedlm_sft) |
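Both datasets load with the standard `datasets` API. A minimal sketch, assuming a `train` split; the exact column names are not documented here, so inspect a row before wiring up a training job:

```python
from datasets import load_dataset

# Stream the Pipeline-A mixture so the first rows arrive without a full download.
ds = load_dataset("TIDE-dllm/distill_llada2_sft", split="train", streaming=True)
row = next(iter(ds))
print(sorted(row.keys()))  # inspect the pre-tokenized fields before training
```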
## 🚀 Quick start

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

repo = "TIDE-dllm/distill-LLaDA2-TIDE_Cross"  # paper-best Pipeline-A checkpoint
device = "cuda" if torch.cuda.is_available() else "cpu"

# trust_remote_code pulls in the diffusion-LM modeling code shipped with the repo.
model = AutoModelForMaskedLM.from_pretrained(
    repo, dtype=torch.bfloat16, trust_remote_code=True,
).to(device).eval()
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
```

The same `generate()` routine published with [`dllm-hub/Qwen3-0.6B-diffusion-bd3lm-v0.1`](https://huggingface.co/dllm-hub/Qwen3-0.6B-diffusion-bd3lm-v0.1) works on every TIDE checkpoint – just swap the model name.
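If you just want to see the decoding idea, it is easy to sketch: start from a fully masked block and iteratively commit the highest-confidence predictions. The loop below is **not** the official `generate()` – it assumes the tokenizer exposes a `mask_token_id` and ignores block-wise scheduling and caching – but it illustrates iterative demasking:

```python
import torch

@torch.no_grad()
def demask_generate(model, tokenizer, prompt, gen_len=64, steps=16):
    """Minimal iterative-demasking sampler (illustrative only)."""
    device = next(model.parameters()).device
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)
    mask_id = tokenizer.mask_token_id  # assumption: checkpoint defines a mask token
    ids = torch.cat(
        [prompt_ids, torch.full((1, gen_len), mask_id, device=device)], dim=1
    )
    for _ in range(steps):
        masked = ids == mask_id
        if not masked.any():
            break
        probs = model(input_ids=ids).logits.softmax(-1)
        conf, pred = probs.max(-1)
        conf[~masked] = -1.0  # never overwrite already-committed tokens
        k = max(1, int(masked.sum()) // steps)  # unmask ~1/steps of the block per pass
        top = conf[0].topk(k).indices
        ids[0, top] = pred[0, top]
    return tokenizer.decode(ids[0, prompt_ids.shape[1]:], skip_special_tokens=True)
```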

## 📖 Citation

```bibtex
@misc{zhang2026turningtidecrossarchitecturedistillation,
      title={Turning the TIDE: Cross-Architecture Distillation for Diffusion Large Language Models},
      author={Gongbo Zhang and Wen Wang and Ye Tian and Li Yuan},
      year={2026},
      eprint={2604.26951},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2604.26951},
}
```
logo.gif ADDED (tracked with Git LFS)