TensorRT ONNX External Data Offset Crash PoC

Vulnerability

A crafted ONNX model with an external_data weight reference containing a negative offset value (-1) crashes TensorRT's engine builder with STATUS_ACCESS_VIOLATION (0xC0000005 on Windows / SIGSEGV on Linux).

  • Model size: 185 bytes (+ 64-byte weight file = 249 bytes total)
  • Crash rate: 100% (10/10 runs)
  • Affected phase: build_serialized_network() (parse succeeds with no error)
  • Tested on: TensorRT 10.15.1.29, Windows, CUDA 12.x

Root Cause

The ONNX external_data offset field is int64 in the protobuf spec. TensorRT's WeightsContext.cpp::parseExternalWeights() does not validate the offset before passing it to seekg(). Negative values cause undefined behavior in file I/O, producing garbage weight data that crashes the builder during optimization.

All negative offsets crash. All offsets >= ~2^32 also crash.
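The missing check is simple to state. Below is a hedged sketch, in Python rather than TensorRT's C++, of the bounds validation that parseExternalWeights (per this report) omits; load_external_weights is a hypothetical helper, not a TensorRT API:

```python
# Sketch of the offset/length validation TensorRT skips, per this report.
# load_external_weights is a hypothetical helper, not a real TensorRT API.
import os

def load_external_weights(path: str, offset: int, length: int) -> bytes:
    file_size = os.path.getsize(path)
    # Reject negative or overflowing values *before* any seek: a signed
    # int64 offset from an untrusted model must never reach seekg() raw.
    if offset < 0 or length < 0 or offset + length > file_size:
        raise ValueError(
            f"external_data out of range: offset={offset} "
            f"length={length} file_size={file_size}")
    with open(path, "rb") as f:
        f.seek(offset)
        data = f.read(length)
    if len(data) != length:
        raise ValueError("short read of external weight data")
    return data
```

With such a check, both offset=-1 and offsets past the end of the 64-byte weights.bin would be rejected at parse time instead of producing garbage weights for the builder.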

Files

File                    Description
crash_offset_neg1.onnx  Malicious ONNX model (offset=-1) - CAUSES CRASH
benign_offset_0.onnx    Benign ONNX model (offset=0) - builds normally
weights.bin             Weight file (64 bytes, required by both models)
reproduce.py            Reproduction script

Reproduction

pip install tensorrt onnx numpy torch
python reproduce.py

Expected output:

[1] Benign model (offset=0):
  benign: rc=0 BUILD_OK size=...

[2] Malicious model (offset=-1):
  malicious: CRASH (STATUS_ACCESS_VIOLATION 0xC0000005)

[3] Reproducibility (5 runs):
  run 1: CRASH (STATUS_ACCESS_VIOLATION 0xC0000005)
  run 2: CRASH (STATUS_ACCESS_VIOLATION 0xC0000005)
  ...
  Crash rate: 5/5
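Because the crash takes down the whole process, a script like reproduce.py has to run each build in a child process and inspect the exit status to keep counting runs. A stdlib sketch of that pattern (function and script names are hypothetical, not necessarily what reproduce.py uses):

```python
# Sketch: run each engine build in a subprocess so the parent survives
# the crash, then classify the child's exit status. build_one.py is a
# hypothetical child script that parses the model and calls
# build_serialized_network(); names are illustrative.
import subprocess
import sys

STATUS_ACCESS_VIOLATION = 0xC0000005  # Windows NTSTATUS: access violation

def classify(returncode: int) -> str:
    """Map a child process exit status to BUILD_OK / CRASH / ERROR."""
    if returncode == 0:
        return "BUILD_OK"
    # POSIX: a child killed by signal N exits with returncode == -N
    # (e.g. -11 for SIGSEGV). Windows: the NTSTATUS code is surfaced
    # as a large unsigned return code.
    if returncode < 0 or (returncode & 0xFFFFFFFF) == STATUS_ACCESS_VIOLATION:
        return "CRASH"
    return f"ERROR rc={returncode}"

def build_in_subprocess(model_path: str) -> str:
    proc = subprocess.run([sys.executable, "build_one.py", model_path])
    return classify(proc.returncode)
```

Running the malicious model through this harness five times is what produces the "Crash rate: 5/5" line above.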

Impact

Any TensorRT pipeline that accepts untrusted ONNX models and compiles them is exposed to a remotely triggerable crash (denial of service):

  • NVIDIA Triton Inference Server
  • TensorRT-LLM ONNX compilation
  • MLOps platforms accepting user-submitted models
  • CI/CD pipelines compiling ONNX models

Severity

High (CVSS 3.1: 7.5 -- AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H)

Potential for memory corruption escalation beyond DoS.
