maxxxzdn committed
Commit 28f6cf0 · verified · 1 Parent(s): 183b008

Add 'Getting the weights' note; align with GitHub mirror

Files changed (2)
  1. .gitignore +8 -2
  2. README.md +18 -0
.gitignore CHANGED

@@ -6,7 +6,13 @@ __pycache__/
 dist/
 build/
 .env
-# norm_stats and static_vars .npz are bundled release assets — track via LFS (see .gitattributes)
-# generated forecast outputs only:
+
+# Model checkpoints are not stored in git — fetch from Hugging Face
+# (huggingface.co/maxxxzdn/mosaic) instead. On HF they live as LFS objects.
+*.pt
+*.pth
+*.ckpt
+
+# Generated forecast outputs (kept out of git but bundled .npz assets are tracked)
 verify_*.npz
 forecast_*.npz
README.md CHANGED

@@ -104,6 +104,24 @@ For reading data from Google Cloud Storage (WeatherBench2 zarr stores):
 pip install gcsfs
 ```
 
+## Getting the weights
+
+If you installed via Hugging Face (`huggingface-cli download maxxxzdn/mosaic --local-dir .`), the checkpoints (`era5_best.pt`, `hres_best.pt`) and normalization stats are already bundled and `inference.py` finds them in the working directory.
+
+If you cloned this repo from GitHub instead, the weights are not in the git tree (they live on the [Hugging Face mirror](https://huggingface.co/maxxxzdn/mosaic) as LFS objects). Fetch them with:
+
+```bash
+pip install huggingface_hub
+huggingface-cli download maxxxzdn/mosaic --local-dir .  # pulls .pt + .npz assets
+```
+
+or programmatically:
+
+```python
+from huggingface_hub import snapshot_download
+snapshot_download(repo_id="maxxxzdn/mosaic", local_dir=".")
+```
+
 ## Quick Start
 
 ```bash
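The README text added in this commit says the download should leave `era5_best.pt` and `hres_best.pt` in the working directory. A short sanity check along those lines could look like the following; this script is a hypothetical convenience for users, not part of the repo or the commit (the file names are taken from the README text above):

```python
from pathlib import Path

# Checkpoint files the README says the download should produce.
# Hypothetical helper, not shipped with the repo.
EXPECTED = ["era5_best.pt", "hres_best.pt"]

missing = [name for name in EXPECTED if not Path(name).exists()]
if missing:
    print("Missing checkpoints:", ", ".join(missing))
    print("Run: huggingface-cli download maxxxzdn/mosaic --local-dir .")
else:
    print("All checkpoints present.")
```

Running it before `inference.py` tells you immediately whether the fetch step was skipped, instead of failing later with a file-not-found error.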