# The quest for the GRAph Level autoEncoder (GRALE)

Paper: [arXiv:2505.22109](https://arxiv.org/abs/2505.22109)
GRALE is a novel graph autoencoder that encodes and decodes graphs of varying sizes into a shared embedding space. Built on an Evoformer-based attention architecture (AlphaFold's core component), GRALE uses an Optimal Transport-inspired loss and differentiable node matching to enable general pre-training for diverse downstream tasks—from classification and regression to graph interpolation, editing, matching, and prediction.
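The Optimal Transport flavor of the node-matching idea can be sketched with a few lines of entropic OT (Sinkhorn iterations): instead of a hard node-to-node assignment, a soft transport plan matches nodes of the input graph to nodes of the reconstruction. This is a toy illustration only, not the paper's actual loss; the cost matrix and hyperparameters below are made up.

```python
import numpy as np

def sinkhorn_plan(C, eps=0.1, n_iters=200):
    """Entropic OT plan between uniform node marginals via Sinkhorn iterations.
    Toy illustration of OT-based soft node matching (not GRALE's actual loss)."""
    n, m = C.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)  # uniform node weights
    K = np.exp(-C / eps)                             # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):                         # alternating marginal projections
        u = a / (K @ v)
        v = b / (K.T @ u)
    return (u[:, None] * K) * v[None, :]             # transport plan P

# Toy cost: node i of graph A matches node i of graph B at zero cost
C = 1.0 - np.eye(3)
P = sinkhorn_plan(C)
matching_loss = np.sum(P * C)  # soft-matching reconstruction cost
```

Because the plan is produced by smooth operations, a loss of this form stays differentiable in the cost matrix, which is what makes soft matching usable for end-to-end training.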
```python
from huggingface_hub import hf_hub_download

from GRALE.main import GRALE_model

# Load the pretrained model
checkpoint = hf_hub_download(repo_id="PaulKrzakala/GRALE-128-32", filename="last.ckpt")
model = GRALE_model.load_from_checkpoint(checkpoint)

# Encode and decode graphs
embeddings = model.encode(graph_data)
reconstructed = model.decode(embeddings)
```

See `demo.ipynb` for full examples.
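Since all graphs land in a shared fixed-size embedding space, graph-level operations such as interpolation reduce to vector arithmetic on embeddings. A minimal sketch with random NumPy arrays standing in for two encoded graphs (the embedding shape here is an assumption; a real pipeline would obtain `z_a`, `z_b` from `model.encode` and decode the result with `model.decode`):

```python
import numpy as np

# Illustrative stand-ins for GRALE embeddings of two graphs
# (in practice these would come from model.encode(graph)).
rng = np.random.default_rng(0)
z_a = rng.normal(size=(32, 128))  # token-by-dim shape is an assumption
z_b = rng.normal(size=(32, 128))

def interpolate(z1, z2, t):
    """Linear interpolation in embedding space; decoding z_t back to a
    graph would use model.decode(z_t)."""
    return (1.0 - t) * z1 + t * z2

z_mid = interpolate(z_a, z_b, 0.5)
```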
```bash
# Python version: 3.11.14

# Clone the repo
git clone https://github.com/KrzakalaPaul/GRALE.git
cd GRALE

# Create your environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install torch==2.8.0  # add --index-url https://download.pytorch.org/whl/cu126 for the CUDA build
pip install -r requirements.txt
```
```bibtex
@article{krzakala2025quest,
  title={The quest for the GRAph Level autoEncoder (GRALE)},
  author={Krzakala, Paul and Melo, Gabriel and Laclau, Charlotte and d'Alch{\'e}-Buc, Florence and Flamary, R{\'e}mi},
  journal={arXiv preprint arXiv:2505.22109},
  year={2025}
}
```