TimesFM 2.5 (ONNX)

Author: Paul Dufour (https://www.linkedin.com/in/pauldufour/)

TimesFM (Time Series Foundation Model) is a pretrained decoder-only model for time-series forecasting from Google. This repository contains the ONNX export of the Transformers checkpoint linked below.

Original model: https://huggingface.co/google/timesfm-2.5-200m-transformers

Credit: All credit for the model architecture, training, and published weights belongs to the original authors and the upstream checkpoint linked above. This repo only provides the ONNX export.

The ONNX graph takes past_values as float32 [batch, context_length] (up to 16384 points per row) and returns last_hidden_state, mean_predictions, and full_predictions (in that order). Because the export uses external data (external_data=True), keep the whole onnx/ directory together when copying the weights.
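Since each row is capped at 16384 points, longer series need to be cut down to the most recent window before inference. A minimal numpy sketch (the helper name is made up; the cap comes from the graph limit stated above):

```python
import numpy as np

MAX_CONTEXT = 16384  # per-row limit of the ONNX graph

def prepare_past_values(series, max_context=MAX_CONTEXT):
    """Keep the most recent points and cast to the float32 dtype the graph expects."""
    arr = np.asarray(series, dtype=np.float32)
    if arr.shape[-1] > max_context:
        arr = arr[..., -max_context:]  # truncate to the latest window
    return arr

x = prepare_past_values(np.arange(20000.0).reshape(1, -1))
print(x.shape, x.dtype)  # (1, 16384) float32
```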

Usage

ONNX Runtime

Inference with ONNX Runtime.

```shell
python -m venv .venv
source .venv/bin/activate
pip install onnxruntime numpy
```

```python
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("onnx/model.onnx", providers=["CPUExecutionProvider"])
inp = session.get_inputs()[0]

batch_size, context_len = 2, 1024
past_values = np.random.randn(batch_size, context_len).astype(np.float32)

# Outputs come back in order: last_hidden_state, mean_predictions, full_predictions
_, mean_predictions, full_predictions = session.run(
    None, {inp.name: past_values}
)
print(mean_predictions.shape)   # (batch_size, horizon_length)
print(full_predictions.shape)   # (batch_size, horizon_length, num_quantiles)
```
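full_predictions carries one forecast track per quantile level. Assuming the last axis is ordered from lowest to highest quantile (an assumption; check config.json for the exact levels), a prediction interval can be sliced out like this:

```python
import numpy as np

# Stand-in for full_predictions with shape (batch, horizon, num_quantiles);
# 9 quantile levels are assumed here purely for illustration.
batch, horizon, num_quantiles = 2, 64, 9
full_predictions = np.random.randn(batch, horizon, num_quantiles).astype(np.float32)

lower = full_predictions[..., 0]                    # lowest quantile track
median = full_predictions[..., num_quantiles // 2]  # middle quantile track
upper = full_predictions[..., -1]                   # highest quantile track
print(lower.shape, median.shape, upper.shape)  # (2, 64) (2, 64) (2, 64)
```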

Compare to PyTorch version

Example comparing the Transformers version against the ONNX Runtime version to confirm the forecasts line up.

```shell
python example_compare_to_pytorch.py
```
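The script's exact contents aren't reproduced here, but the core check boils down to comparing the two forecast arrays within a tolerance. A self-contained numpy sketch of that comparison (the helper name and tolerances are illustrative):

```python
import numpy as np

def forecasts_match(torch_out, onnx_out, rtol=1e-4, atol=1e-5):
    """Report whether two forecast arrays agree within tolerance, plus the max abs difference."""
    torch_out = np.asarray(torch_out, dtype=np.float64)
    onnx_out = np.asarray(onnx_out, dtype=np.float64)
    max_abs_diff = float(np.max(np.abs(torch_out - onnx_out)))
    return bool(np.allclose(torch_out, onnx_out, rtol=rtol, atol=atol)), max_abs_diff

rng = np.random.default_rng(0)
a = rng.normal(size=(2, 64))
b = a + 1e-6  # simulate small numerical drift between backends
ok, diff = forecasts_match(a, b)
print(ok, diff)
```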

Re-exporting

You only need this section if you want to regenerate onnx/model.onnx from the Transformers checkpoint. Until PR #45233 is merged into transformers, install Transformers from the PR head (or from a fork branch that contains the same _preprocess change):

```shell
python -m venv .venv
source .venv/bin/activate

pip install torch onnx numpy onnxruntime onnxscript
pip install "git+https://github.com/huggingface/transformers.git@refs/pull/45233/head"

python export.py
```

The ONNX model is written under onnx/.

Uploading to the Hub

Upload the ONNX with hf upload, then commit and push everything else with git (README.md, config.json, scripts, etc.).

```shell
hf upload pdufour/timesfm-2.5-200m-transformers-onnx onnx/model.onnx onnx/model.onnx --repo-type model
# hf upload pdufour/timesfm-2.5-200m-transformers-onnx onnx/model.onnx.data onnx/model.onnx.data --repo-type model  # if external data

git add -A && git commit -m "update" && git push
```
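Whether the export actually produced an external-data file (and hence whether the second upload line is needed) can be checked with the standard library alone. A small sketch (the helper name is made up; the onnx/ layout matches this repo):

```python
from pathlib import Path

def onnx_artifacts(onnx_dir="onnx"):
    """List model files under onnx/; a model.onnx.data entry means external data is in use."""
    return sorted(p.name for p in Path(onnx_dir).glob("model.onnx*"))

print(onnx_artifacts())  # e.g. ['model.onnx', 'model.onnx.data'] when external data is used
```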


Citation

```bibtex
@inproceedings{das2024a,
  title={A decoder-only foundation model for time-series forecasting},
  author={Abhimanyu Das and Weihao Kong and Rajat Sen and Yichen Zhou},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024},
  url={https://openreview.net/forum?id=jn2iTJas6h}
}
```