---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- molecules
- chemistry
- pretraining
- graph-llm
- smiles
pretty_name: EDT-Former Encoder Pretraining Data
size_categories:
- 10M<n<100M
---

# EDT-Former Encoder Pretraining Data

Pretraining corpus for the **EDT-Former** Stage 1 encoder, as described in the ICLR 2026 paper:

> **Entropy-Guided Dynamic Tokens for Graph-LLM Alignment in Molecular Understanding**
> Zihao Jing, Qiuhao Zeng, Ruiyi Fang, Yan Sun, Boyu Wang, Pingzhao Hu
> *ICLR 2026* · [Paper](https://www.arxiv.org/abs/2602.02742) · [Code](https://github.com/selmiss/DQ-Former)

## Dataset Description

This dataset is used to train the Stage 1 EDT-Former encoder, which learns to align molecular graph representations with text. It is derived from PubChem molecules annotated with BRICS fragment IDs and entropy-guided graph IDs.
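
For context, BRICS fragments can be computed directly from a SMILES string with RDKit. The sketch below is illustrative only: the released data already carries the fragment annotations, and the entropy-guided graph IDs come from the EDT-Former pipeline itself and are not reproduced here.

```python
from rdkit import Chem
from rdkit.Chem import BRICS

# Illustrative only: decompose a molecule into BRICS fragments.
# The dataset ships with precomputed fragment IDs; this just shows
# where such fragments come from.
mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin
fragments = BRICS.BRICSDecompose(mol)
print(sorted(fragments))  # fragment SMILES with dummy-atom link points
```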

## Data Format

Each split is stored as a JSONL file in which each line is a JSON object for one molecule-text pair, containing graph features, the SMILES string, and the associated text description; a quick way to inspect a record is shown after the table below.

| Split | File | Size |
|-------|------|------|
| Train | `train.jsonl` | ~12 GB |
| Validation | `val.jsonl` | ~37 MB |
| Test | `test.jsonl` | ~78 MB |
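
To inspect the schema locally, you can read a single line from a downloaded split. The exact field names are not fixed by this card, so the snippet below only lists the keys it finds rather than assuming them:

```python
import json

# Minimal sketch: peek at the first record of the validation split
# (assumes val.jsonl has been downloaded to the working directory).
with open("val.jsonl", "r", encoding="utf-8") as f:
    record = json.loads(next(f))

# Print the available fields rather than assuming their names.
print(sorted(record.keys()))
```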

## Usage

```python
from datasets import load_dataset

dataset = load_dataset("zihaojing/EDT-Former-pretrain-data")
```
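
Since the train split is roughly 12 GB, streaming mode (a standard `datasets` option, not specific to this dataset) avoids materializing the full download before iterating:

```python
from datasets import load_dataset

# Stream records instead of downloading the whole ~12 GB train split.
# Assumes the splits are registered under the usual "train" name.
stream = load_dataset(
    "zihaojing/EDT-Former-pretrain-data",
    split="train",
    streaming=True,
)
print(next(iter(stream)))  # first molecule-text record
```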

Or point to it in the EDT-Former config:

```yaml
# configs/stage1_dqw2d/data_config_preprocessed.yaml
use_preprocessed: true
preprocessed_data: zihaojing/EDT-Former-pretrain-data
```

## Related Resources

| Resource | Link |
|----------|------|
| SFT Data | [zihaojing/EDT-Former-sft-data](https://huggingface.co/datasets/zihaojing/EDT-Former-sft-data) |
| Encoder (Stage 1) | [zihaojing/EDT-Former-encoder](https://huggingface.co/zihaojing/EDT-Former-encoder) |
| Full Model (Stage 2) | [zihaojing/EDT-Former-model](https://huggingface.co/zihaojing/EDT-Former-model) |
| Code | [selmiss/DQ-Former](https://github.com/selmiss/DQ-Former) |

## Citation

```bibtex
@inproceedings{jing2026edtformer,
  title={Entropy-Guided Dynamic Tokens for Graph-LLM Alignment in Molecular Understanding},
  author={Jing, Zihao and Zeng, Qiuhao and Fang, Ruiyi and Sun, Yan and Wang, Boyu and Hu, Pingzhao},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2026}
}
```