Code for the [paper](https://arxiv.org/pdf/2403.03542) *DPOT: Auto-Regressive Denoising Operator Transformer for Large-Scale PDE Pre-Training* (ICML 2024). It pretrains neural operator transformers (from **7M** to **1B** parameters) on multiple PDE datasets. We will release the pre-trained weights soon.
Our pre-trained DPOT achieves state-of-the-art performance on multiple PDE datasets and can be used for fine-tuning on different types of downstream PDE problems.
We have five pre-trained checkpoints of different sizes.

| Huge | 2048 | 8092 | 27 | 8 | 1.03B |
#### Loading pre-trained model

Here is an example of loading a pre-trained model.

```python
import torch

# DPOTNet is defined in this repository; the hyperparameters below
# correspond to the Tiny (Ti) checkpoint.
model = DPOTNet(img_size=128, patch_size=8, mixing_type='afno', in_channels=4, in_timesteps=10, out_timesteps=1, out_channels=4, normalize=False, embed_dim=512, modes=32, depth=4, n_blocks=4, mlp_ratio=1, out_layer_dim=32, n_cls=12)
model.load_state_dict(torch.load('model_Ti.pth')['model'])
```
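The checkpoint above stores the weights under a `'model'` key. A minimal self-contained sketch of that save/load pattern, using a toy `nn.Linear` module as a stand-in for `DPOTNet`:

```python
import torch
import torch.nn as nn

# Toy stand-in for DPOTNet; the real model is defined in this repository.
toy = nn.Linear(4, 4)

# Save in the same layout as the released checkpoints: state dict under 'model'.
torch.save({'model': toy.state_dict()}, 'model_toy.pth')

# Restore into a freshly constructed module, exactly as in the example above.
restored = nn.Linear(4, 4)
restored.load_state_dict(torch.load('model_toy.pth')['model'])
print(torch.equal(toy.weight, restored.weight))  # True
```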
#### Datasets
All datasets are stored in HDF5 format and contain a `data` field. Some datasets are stored as individual HDF5 files; others are combined within a single HDF5 file.
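A minimal sketch of reading such a file with `h5py`; the array shape here is illustrative only, not the actual layout of the released datasets:

```python
import h5py
import numpy as np

# Create a toy HDF5 file with a `data` field. The shape
# (samples, x, y, timesteps, channels) is an assumption for illustration.
with h5py.File("toy_dataset.h5", "w") as f:
    f.create_dataset("data", data=np.zeros((2, 32, 32, 10, 4), dtype=np.float32))

# Read the `data` field back into memory.
with h5py.File("toy_dataset.h5", "r") as f:
    data = f["data"][:]

print(data.shape)  # (2, 32, 32, 10, 4)
```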