maxxxzdn committed
Commit 32b88b6 (verified)
Parent: 127cde1

README: reorder bottom sections; drop Field/Value header

Files changed (1)
  1. README.md +8 -8
README.md CHANGED
@@ -216,6 +216,14 @@ gs://weatherbench2/datasets/hres_t0/2016-2022-6h-240x121_equiangular_with_poles_
 
 Mosaic operates at 1.5° (~166 km), which cannot resolve mesoscale phenomena such as tropical-cyclone inner-core structure or individual severe thunderstorms. The block-sparse attention is designed to scale linearly with sequence length, so finer grids (e.g. 0.25°, ~700k tokens) are a natural next step but are not part of this release.
 
+## Model card metadata
+
+| | |
+|---------|---|
+| License | [`cc-by-nc-4.0`](https://creativecommons.org/licenses/by-nc/4.0/) |
+| Library | `pytorch` |
+| Tags | `weather` · `weather-forecasting` · `climate` · `atmospheric-science` · `sparse-attention` · `transformer` · `probabilistic-forecasting` |
+
 ## License
 
 Released under [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/). Free for non-commercial research and educational use with attribution; commercial use requires a separate license. Underlying training data (ERA5, HRES) is subject to its own licensing terms set by ECMWF.
@@ -224,14 +232,6 @@ Released under [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/).
 
 MZ acknowledges support from Microsoft Research AI4Science. JWvdM acknowledges support from the European Union Horizon Framework Programme (Grant agreement ID: 101120237). This work used the Dutch national e-infrastructure with the support of the SURF Cooperative using grant no. EINF-16923. Computations were partially performed using the UvA/FNWI HPC Facility.
 
-## Model card metadata
-
-| Field | Value |
-|---------------|-------|
-| License | [`cc-by-nc-4.0`](https://creativecommons.org/licenses/by-nc/4.0/) |
-| Library | `pytorch` |
-| Tags | `weather` · `weather-forecasting` · `climate` · `atmospheric-science` · `sparse-attention` · `transformer` · `probabilistic-forecasting` |
-
 ## Citation
 
 If you use Mosaic, please cite:
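
The resolution paragraph kept as diff context above pairs two numbers: a 240×121 grid at 1.5° and ~700k tokens at 0.25°. A minimal sketch of that arithmetic, and of why linear scaling matters at those sequence lengths, assuming the 240×121 shape from the dataset path and the standard 1440×721 shape for a 0.25° equiangular grid (the block size and sparsity budget below are illustrative, not Mosaic's actual configuration):

```python
# Back-of-the-envelope check of the sequence lengths mentioned above.
# Grid shapes are assumptions: 240x121 comes from the dataset path
# (1.5 deg, equiangular with poles); 1440x721 is the standard 0.25 deg grid.

def attention_cost(n_tokens: int, block_size: int = 128, blocks_kept: int = 8) -> dict:
    """Rough score-matrix size comparison of dense vs. block-sparse attention.

    Dense self-attention scales as O(n^2); block-sparse attention that keeps
    a fixed number of key blocks per query scales as O(n). block_size and
    blocks_kept are hypothetical values for illustration only.
    """
    dense = n_tokens ** 2
    sparse = n_tokens * block_size * blocks_kept
    return {"tokens": n_tokens, "dense": dense, "sparse": sparse,
            "dense/sparse": dense / sparse}

coarse = 240 * 121    # 1.5 deg grid -> 29,040 points
fine = 1440 * 721     # 0.25 deg grid -> ~1.04M points; the README's ~700k-token
                      # figure suggests tokens are not 1:1 with grid points.

for n in (coarse, fine):
    print(attention_cost(n))
```

Under these illustrative settings, the dense/sparse cost ratio is roughly 28× at the coarse grid but about 1,000× at the fine one, which is the practical content of the linear-scaling claim.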