WaveGuard: Physics-Based Time-Series Anomaly Detection
Zero-training anomaly detection using Klein-Gordon wave equation dynamics on a 3D lattice.
No neural networks. No gradient descent. No hyperparameter tuning. Deterministic, interpretable, and instantly deployable.
How It Works
WaveGuard encodes your data onto a 3D lattice and evolves it under two coupled wave equations:
- Plastic Phase: Normal data sculpts the lattice's chi field (mass landscape).
- Frozen Phase: Test data propagates through the learned landscape. Normal data resonates; anomalies scatter.
The physics fingerprint (52-dimensional) captures the wave response, and Mahalanobis distance gives the anomaly score.
The Equations
GOV-01: E(n+1) = 2E(n) - E(n-1) + dt^2 * [Laplacian(E) - chi^2 * E]
GOV-02: chi(n+1) = 2chi(n) - chi(n-1) + dt^2 * [Laplacian(chi) - kappa * E^2]
Where chi_0 = 19 (vacuum stiffness) and kappa = 1/63 (coupling constant).
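The two update rules above are standard leapfrog (second-order central-difference) time steps. A minimal NumPy sketch of one coupled step is below; the periodic boundary handling, the `dt` value, and the function names are assumptions for illustration, not WaveGuard's actual implementation:

```python
import numpy as np

def laplacian(f):
    # 6-point discrete Laplacian on a 3D lattice (periodic boundaries via np.roll)
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0)
          + np.roll(f, 1, 1) + np.roll(f, -1, 1)
          + np.roll(f, 1, 2) + np.roll(f, -1, 2) - 6.0 * f)

def step(E_prev, E, chi_prev, chi, dt=0.05, kappa=1/63):
    # GOV-01: the E field propagates through the chi (mass) landscape
    E_next = 2*E - E_prev + dt**2 * (laplacian(E) - chi**2 * E)
    # GOV-02: chi is sculpted by the local energy density E^2
    chi_next = 2*chi - chi_prev + dt**2 * (laplacian(chi) - kappa * E**2)
    # return the new (previous, current) pairs for the next leapfrog step
    return E, E_next, chi, chi_next
```

Note that with chi_0 = 19, the mass term alone requires dt < 2/chi_0 ≈ 0.105 for stability, which is why the sketch uses a small step.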
Quick Start
Install
pip install WaveGuardClient
Python SDK
from waveguard_client import WaveGuardClient
client = WaveGuardClient()
# Create baseline from normal data
client.create_baseline("my_service", [
    {"cpu": 45, "memory": 62, "latency_ms": 120},
    {"cpu": 48, "memory": 65, "latency_ms": 115},
    # ... more normal samples
])
# Detect anomalies
result = client.detect({"cpu": 95, "memory": 92, "latency_ms": 850}, "my_service")
print(result)
# {'is_anomaly': True, 'score': 312.5, 'confidence': 0.999, ...}
One-Shot Detection (No Setup)
training_data = [normal_sample_1, normal_sample_2, ...]
test_data = [sample_to_check_1, sample_to_check_2, ...]
results = client.scan(training_data, test_data)
for r in results:
    print(f"Score: {r['score']:.2f} Anomaly: {r['is_anomaly']}")
Benchmarks
| Dataset | Precision | Recall | F1 | Latency/sample |
|---|---|---|---|---|
| Server Metrics (5 features, 50 train) | 0.789 | 1.000 | 0.882 | 8ms |
| Synthetic TS - Sinusoidal | 0.625 | 0.500 | 0.556 | 9ms |
| Synthetic TS - Seasonal | 0.400 | 0.600 | 0.480 | 8ms |
| Synthetic TS - Random Walk | 0.545 | 0.600 | 0.571 | 8ms |
| Synthetic TS - Trend | 0.400 | 0.200 | 0.267 | 8ms |
CPU-only, N=24 grid, 30-50 training samples. Larger grids and more training data improve results. GPU (CuPy) gives 10-50x speedup.
Supported Data Types
| Type | Encoder | Example |
|---|---|---|
| JSON/Dict | json | {"cpu": 45, "mem": 62} |
| Time Series | timeseries | [1.2, 3.4, 5.6, ...] |
| Numeric Array | numeric_array | [0.1, 0.2, 0.3] |
| Tabular (CSV) | tabular | Pandas DataFrame rows |
| Text | text | Any string (hashed) |
| Image | image | 2D/3D numpy arrays |
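The encoders themselves are internal to WaveGuard, but the idea of mapping structured input onto lattice sites can be sketched as follows. The function name, grid size, and hash-based site selection are illustrative assumptions, not the library's actual encoding scheme:

```python
import hashlib
import numpy as np

def encode_json(sample, n=16):
    # Illustrative sketch: place each feature's value at a fixed lattice site.
    grid = np.zeros((n, n, n))
    for key, value in sorted(sample.items()):
        # Deterministic site per feature name (hashlib, not built-in hash(),
        # so the mapping is stable across processes)
        site = int(hashlib.md5(key.encode()).hexdigest(), 16) % n**3
        grid[np.unravel_index(site, (n, n, n))] = value
    return grid
```

A scheme like this keeps the encoding deterministic, which matters because the downstream wave evolution has no randomness to average over.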
Key Advantages
| Feature | WaveGuard | IsolationForest | LOF | AutoEncoder |
|---|---|---|---|---|
| Training needed | None (physics) | Yes (trees) | Yes (neighbors) | Yes (epochs) |
| Hyperparameters | 1 (sensitivity) | 5+ | 3+ | 10+ |
| Deterministic | Yes | No (random) | Yes | No (init) |
| Streaming | Yes | No | No | No |
| GPU acceleration | Optional | No | No | Required |
| Interpretable | Yes (chi landscape) | Partial | No | No |
Architecture
Input Data --> Encoder --> 3D Lattice (N^3)
                  |
                  v
      GOV-01 + GOV-02 Evolution
                  |
                  v
      52-dim Physics Fingerprint
                  |
                  v
      Mahalanobis Distance Score
                  |
                  v
     Anomaly Decision + Confidence
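The final scoring stage is a standard Mahalanobis distance between a test fingerprint and the baseline distribution. A minimal sketch, assuming fingerprints arrive as NumPy vectors (the regularization constant and function name are assumptions; WaveGuard's covariance handling may differ):

```python
import numpy as np

def mahalanobis_score(fingerprint, baseline):
    # baseline: (m, d) array of fingerprints from normal samples
    mu = baseline.mean(axis=0)
    cov = np.cov(baseline, rowvar=False)
    # Regularize: small baselines make the covariance matrix singular
    cov += 1e-6 * np.eye(cov.shape[0])
    delta = fingerprint - mu
    # sqrt(delta^T * cov^-1 * delta), solved without forming the inverse
    return float(np.sqrt(delta @ np.linalg.solve(cov, delta)))
```

Unlike a plain Euclidean distance, this accounts for correlations between fingerprint dimensions, so a point that is unusual along a tightly constrained direction scores high even if its raw deviation is small.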
Model Details
- Type: Physics-based (no learned parameters beyond chi landscape)
- Grid size: Adaptive (16-256 based on input dimensionality)
- Fingerprint: 52 dimensions (chi statistics + E-field response + histogram)
- Scoring: Multi-resolution (global Mahalanobis + per-feature z-scores)
- Threshold: Adaptive (baseline mean + (2 / sensitivity) * baseline std)
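A literal reading of the adaptive threshold formula above, as a sketch (the function and parameter names are illustrative, not the SDK's API):

```python
import statistics

def adaptive_threshold(baseline_scores, sensitivity=1.0):
    # Threshold = baseline mean + (2 / sensitivity) * baseline std.
    # Higher sensitivity lowers the threshold, flagging more points as anomalous.
    mu = statistics.mean(baseline_scores)
    sigma = statistics.stdev(baseline_scores)
    return mu + (2.0 / sensitivity) * sigma
```

This is why sensitivity is the single exposed hyperparameter: it only rescales the decision boundary, leaving the physics evolution untouched.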
Limitations
- Best for structured/semi-structured data (JSON, time-series, tabular)
- CPU-only mode is slower for large grids (N>64)
- Requires at least 5 normal samples for stable baseline
- Not designed for image anomaly detection (use Anomalib instead)
Citation
@software{waveguard2025,
title={WaveGuard: Physics-Based Anomaly Detection},
author={Partin, Greg},
year={2025},
url={https://huggingface.co/gpartin/waveguard-timeseries-ad}
}