---
license: mit
license_link: LICENSE
library_name: kups
tags:
- chemistry
- materials-science
- molecular-dynamics
- interatomic-potential
- mlff
- jax
- mace
---

# MACE-MPA-0 medium — JAX build

This repository hosts a JAX export of [MACE-MPA-0 medium](https://github.com/ACEsuit/mace-foundations) for use with [kUPS](https://github.com/cusp-ai-oss/kups), a JAX-native molecular-simulation toolkit. The artefact is a self-contained `.zip` containing the serialized JAX computation graph, the original model parameters, and the minimal metadata needed to run inference.

**Important:** this is a **re-export, not a retraining.** The weights and architecture are the originals released by the MACE team. CuspAI's only contribution is converting the PyTorch reference implementation to JAX via [tojax](https://github.com/cusp-ai-oss/tojax). Every scientific claim, citation, and credit belongs to the original authors.

## Included model

| File | Upstream | License | Paper |
|------|----------|---------|-------|
| `mace-mpa-0-medium_32.zip` (40 MB) | [MACE-MPA-0 medium](https://github.com/ACEsuit/mace-foundations) | [MIT](LICENSE) | Batatia et al., *J. Chem. Phys.* **163**, 184110 (2024); [arXiv:2401.00096](https://arxiv.org/abs/2401.00096) |

Cutoff radius: 6.0 Å. Schema: `AtomGraphInput` (positions, atomic numbers, cell, pbc, edge index, cell offsets, batch index, charge, spin).

## Quick start

```sh
pip install kups[cuda]
```

```python
from huggingface_hub import hf_hub_download
from kups.potential.mliap.tojax import TojaxedMliap

path = hf_hub_download(
    repo_id="CuspAI/kUPS-mace-jax",
    filename="mace-mpa-0-medium_32.zip",
)
model = TojaxedMliap.from_zip_file(path)
# model.cutoff, model.params, model.call(atom_graph_input) are now available.
```

kUPS ships CLI wrappers that take a YAML config pointing at this zip:

```sh
kups_relax_mlff --config relax_mace.yaml
```

Example configs live in the kUPS [examples/](https://github.com/cusp-ai-oss/kups/tree/main/examples) directory.
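The `AtomGraphInput` schema follows the usual periodic-graph convention: an edge connects two atoms whenever one lies within the cutoff of the other, possibly through a periodic image recorded in the cell offsets. As an illustration only (this is *not* the kUPS API — kUPS constructs these arrays internally, and the function below is hypothetical), a brute-force neighbor search with NumPy produces the `edge_index` / `cell_offsets` pair:

```python
import itertools
import numpy as np

def neighbor_list(positions, cell, cutoff=6.0):
    """Illustrative brute-force periodic neighbor search (NOT the kUPS
    implementation). Returns edge_index [2, E] int64 and cell_offsets
    [E, 3] int64, scanning the 27 neighboring cell images; this is only
    valid when the cutoff fits inside one cell image."""
    n = len(positions)
    senders, receivers, offsets = [], [], []
    for shift in itertools.product((-1, 0, 1), repeat=3):
        disp = np.asarray(shift) @ cell  # translation to this periodic image
        for i in range(n):
            for j in range(n):
                if i == j and shift == (0, 0, 0):
                    continue  # no self-edges within the home cell
                d = np.linalg.norm(positions[j] + disp - positions[i])
                if d < cutoff:
                    senders.append(i)
                    receivers.append(j)
                    offsets.append(shift)
    edge_index = np.array([senders, receivers], dtype=np.int64)
    cell_offsets = np.array(offsets, dtype=np.int64)
    return edge_index, cell_offsets

# Two atoms, 1.73 Å apart, in a 7 Å cubic cell: with the 6.0 Å cutoff
# only the direct pair (in both directions) is within range.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
cell = 7.0 * np.eye(3)
edge_index, cell_offsets = neighbor_list(pos, cell)
print(edge_index.shape, cell_offsets.shape)  # -> (2, 2) (2, 3)
```

Production codes use cell lists or k-d trees instead of this O(27·N²) scan; the sketch is only meant to show what the edge index and offset arrays mean.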
## What's in the `.zip`

- `model.jax` — JAX computation graph, serialized via `jax.export`.
- `params.msgpack` — parameters as a msgpack-encoded list of arrays.
- `metadata.json` — cutoff radius and supported atomic numbers.
- `dtypes.json` — input dtypes for `AtomGraphInput`.

Exported with symbolic shapes (`--symbolic NSE`); the graph accepts variable atom, system, and edge counts without recompilation.

## Model details

**Upstream:** [ACEsuit/mace-foundations](https://github.com/ACEsuit/mace-foundations) · **Checkpoint:** `mace-mpa-0-medium.model`

MACE-MPA-0 is a foundation-model member of the MACE family (Multi-ACE, a higher-body-order message-passing equivariant neural network), trained on the MPtrj + sAlex materials dataset covering 89 chemical elements. The "medium" variant is the MACE team's default recommendation for general-purpose materials simulation.

**Original authors:** Ilyes Batatia, Philipp Benner, Yuan Chiang, Alin M. Elena, Dávid P. Kovács, Janosh Riebesell, and the rest of the MACE team (~100 collaborators). Corresponding author: Gábor Csányi (Cambridge).

**Supported elements:** atomic numbers 1–83, 89–94 (see `metadata.json` inside the zip).

**Intended use and limitations:** general-purpose materials modelling (energies, forces, stresses) at DFT/PBE+U accuracy. Not trained for isolated molecules; for molecular systems refer to MACE-OFF. See the [upstream README](https://github.com/ACEsuit/mace-foundations) for authoritative guidance.

**Citation:**

```bibtex
@article{batatia2024foundation,
  title         = {A foundation model for atomistic materials chemistry},
  author        = {Batatia, Ilyes and Benner, Philipp and Chiang, Yuan and
                   Elena, Alin M. and Kov{\'a}cs, D{\'a}vid P. and
                   Riebesell, Janosh and others},
  journal       = {The Journal of Chemical Physics},
  volume        = {163},
  number        = {18},
  pages         = {184110},
  year          = {2024},
  doi           = {10.1063/5.0230281},
  eprint        = {2401.00096},
  archivePrefix = {arXiv},
}
```

## Export pipeline and reproducibility

The archive was produced with the exporter in [tojax/examples/mlff/](https://github.com/cusp-ai-oss/tojax):

```sh
uv run python export_mace.py --output mace-mpa-0-medium_32.zip --symbolic NSE
```

tojax's export harness verifies numerical agreement with the PyTorch reference (default tolerances `rtol=1e-4`, `atol=1e-4`) before saving the archive.

## Changes from upstream

- **File format.** PyTorch `.model` → JAX-exported `.zip` (graph + msgpack params).
- **Weights.** Unchanged, bit-for-bit, from the upstream checkpoint.
- **Architecture.** Translated operation-for-operation; no approximations or substitutions.
- **Numerics.** Verified within `rtol=1e-4, atol=1e-4` against the PyTorch reference during export.
- **Cutoff and dtypes.** Preserved from upstream defaults (6.0 Å; float32 positions/cell/offsets, int64 indices, bool PBC).
- **Symbolic shapes.** Exports accept variable numbers of atoms, systems, and edges without recompilation.

## Attribution and license

This model exists because of the work of the MACE team. CuspAI's only contribution is the JAX export — we trained nothing, changed no weights, and designed none of the architecture. Please cite Batatia et al. (2024) when using this checkpoint in research.

The file `mace-mpa-0-medium_32.zip` is distributed under the MIT License; see [LICENSE](LICENSE) for the full text and [NOTICE](NOTICE) for attribution and modification details.
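Because the archive is an ordinary zip of JSON and binary members, its metadata can be inspected with the standard library alone. The sketch below mocks an archive matching the layout described above and reads it back; the JSON field names are hypothetical (check the actual files), and `TojaxedMliap.from_zip_file` remains the supported loader:

```python
import io
import json
import zipfile

# Sketch only: build a mock archive mirroring the documented layout
# (the real zip also contains model.jax and params.msgpack). The JSON
# key names below are illustrative assumptions, not the kUPS schema.
metadata = {
    "cutoff": 6.0,
    # Supported atomic numbers per the model card: 1-83 and 89-94.
    "atomic_numbers": list(range(1, 84)) + list(range(89, 95)),
}
dtypes = {"positions": "float32", "edge_index": "int64", "pbc": "bool"}

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("metadata.json", json.dumps(metadata))
    zf.writestr("dtypes.json", json.dumps(dtypes))

# Read the metadata back, as a loader would.
with zipfile.ZipFile(buf) as zf:
    meta = json.loads(zf.read("metadata.json"))

print(meta["cutoff"], len(meta["atomic_numbers"]))  # -> 6.0 89
```

Note that the two supported ranges together cover 89 elements, matching the MPtrj + sAlex coverage quoted above.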
The kUPS / tojax tooling citations:

```bibtex
@software{kups2026,
  author = {{CuspAI}},
  title  = {kUPS},
  year   = {2026},
  url    = {https://github.com/cusp-ai-oss/kups},
}

@software{tojax2026,
  author = {{CuspAI}},
  title  = {tojax},
  year   = {2026},
  url    = {https://github.com/cusp-ai-oss/tojax},
}
```

## Contact

- Issues with the JAX export or with kUPS: [github.com/cusp-ai-oss/kups/issues](https://github.com/cusp-ai-oss/kups/issues)
- Scientific questions about MACE: please direct them to the upstream authors via [ACEsuit/mace-foundations](https://github.com/ACEsuit/mace-foundations).