Add Dataset Card description (README.md)

README.md (CHANGED)
---
license: gpl-3.0
pretty_name: "Metis-OLMoE Latent Telemetry: Spikenaut SNN Routing"
language:
- en
- code
tags:
- snn
- spiking-neural-network
- olmoe
- mixture-of-experts
- quantization
- neuromorphic
- telemetry
- semantic-routing
- latent-space
- cuda
- rust
- gguf
- saaq
datasets:
- allenai/OLMoE-1B-7B-0125-Instruct-GGUF
task_categories:
---

# Metis

## 1. The Origins of Metis

Before this was a formal dataset, it was an attempt to solve a bare-metal problem. I had been experimenting with mining telemetry, HFT bots, and sync-node data to train a spiking neural network (SNN), but the data kept returning dead-zero values after training.

The breakthrough came entirely by accident. I was running heavy mods—DLSS 4.0 and path tracing—on Cyberpunk 2077 and the Resident Evil 4 Remake. My workstation PC was screaming, pushing harder and louder than it ever did during crypto mining. That sparked the realization: *What if I used raw gaming telemetry data for neuromorphic spike data conversion?* What if I could use this intense hardware stress to create an artificial heartbeat for AI?

When I pitched this idea, most people didn't believe the spike data conversion would work. But after refining the early thermal equations using that Resident Evil 4 telemetry, **Metis** was born—a MoE-based model and dataset architecture for exploring SNN quantization.

### Relationship to Spikenaut

**Spikenaut** is my pure SNN model, built from scratch as a native spiking neural network. **Metis** (this repository) serves as the architect and teacher—exploring SNN quantization techniques through the OLMoE Mixture-of-Experts model. The discoveries, equations, and architecture frameworks developed here feed directly into Spikenaut's training and evolution. Metis proves the math; Spikenaut implements it natively.

## 2. The Science: Semantic Attractor Clustering

This dataset contains the raw bare-metal telemetry logs and latent-space visualizations generated by the routing encoder. The objective is to map the physical routing of LLM embeddings (specifically from the allenai/OLMoE-1B-7B-0125-Instruct-GGUF Mixture-of-Experts model) as they are processed by biologically inspired neuronal fatigue mechanics. These insights directly inform the training of **Spikenaut**, my pure native SNN.

### The Discovery: Physical Neighborhood Mapping

The primary breakthrough documented in this dataset is the organic, physical separation of semantic concepts into distinct routing bands. By applying L2 normalization to the embeddings, the network bounds semantic pressure, forcing tokens to follow the biological **path of least resistance**.
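In vector terms, L2 normalization rescales every embedding to unit length, so no single token carries outsized magnitude into the router. A minimal sketch (illustrative only; the actual pipeline performs this inside the CUDA routing encoder):

```python
import numpy as np

def l2_normalize(embedding: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Rescale a vector to unit L2 norm, bounding its 'semantic pressure'."""
    norm = np.linalg.norm(embedding)
    return embedding / max(norm, eps)  # eps guards against zero vectors

raw = np.array([3.0, 4.0])        # unbounded magnitudes, as in raw F16-to-F32 extraction
unit = l2_normalize(raw)          # direction preserved, magnitude bounded to 1
print(unit, np.linalg.norm(unit))  # [0.6 0.8] 1.0
```

The direction of the embedding (its semantic content) is untouched; only the magnitude that overwhelmed single expert neurons is clamped.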

Telemetry visualizations show that the spike-based routing physically routes different cognitive tasks into isolated biological neighborhoods:

#### Abstract Language Routing (The 2000-Route)

When fed abstract English logic, the network distributes energy across multiple nodes, establishing a dominant attractor basin at the **2000-index walker route**, with secondary echoes in Walkers 700 and 1450.

![Philosophy Attractor](images/philosophy_attractor.png)

#### Structured Logic Routing (The 600-800 Band)

When fed rigid mathematical statements or raw Rust syntax, the network completely abandons the 2000-route. The tokens experience mathematical pushback in the abstract centers and organically collapse into the exact same **600-800 frequency band**. This demonstrates that the network physically maps highly structured logic tasks to adjacent biological neighborhoods to conserve energy.

![Code Logic Cluster](images/code_logic_cluster.png)
![Math Logic Cluster](images/math_logic_cluster.png)

## 3. Experiment Progression

The dataset documents the chronological progression from synthetic baselines to actual semantic routing:

* **Phase 1: Synthetic Baseline**
    * **Input:** Synthetic sine wave.
    * **Result:** Verified the GPU temporal loop (10,000 ticks) and basic biological fatigue without crashing the CUDA context.
* **Phase 2: The F16 Magnitude Collapse (Unbounded)**
    * **Input:** Real LLM embeddings (OLMoE).
    * **Result:** Unscaled F16-to-F32 extraction produced raw, unbounded electrical pressure. A single expert neuron (Walker ~620) was overwhelmed, causing a routing collapse in which one walker took the entire load for the full temporal loop.
* **Phase 3: L2 Normalization & Philosophy Attractors**
    * **Input:** `"Let's teach this MoE model..."` (Abstract English).
    * **Result:** L2 normalization successfully shattered the routing collapse. Energy dynamically settled into high-register attractor bands, predominantly isolating into the **2000-route**.
* **Phase 4: Semantic Clustering (Code & Math Logic)**
    * **Input A:** `fn main() { println!("Hello, World!"); }` (Rust syntax)
    * **Input B:** `"The derivative of a constant is mathematically zero."` (Math logic)
    * **Result:** The SNN abandoned the 2000-route completely. Both raw Rust syntax and mathematical logic organically fell into the exact same **600-800 frequency band**, demonstrating that the network physically maps highly structured logic tasks to adjacent biological neighborhoods.
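The synthetic sine-wave baseline above can be sketched as a leaky integrate-and-fire loop with a simple fatigue rule. This is a CPU-side illustration only; the leak, threshold, and fatigue constants are invented for the sketch and do not come from the actual CUDA kernel:

```python
import math

# Illustrative leaky integrate-and-fire (LIF) temporal loop over a sine drive.
TICKS = 10_000       # matches the temporal loop length described above
LEAK = 0.95          # membrane potential retained per tick
THRESHOLD = 1.0      # baseline firing threshold
FATIGUE_STEP = 0.05  # threshold bump after each spike (biological fatigue)
RECOVERY = 0.999     # per-tick decay of fatigue back toward baseline

potential, threshold, spikes = 0.0, THRESHOLD, 0
for t in range(TICKS):
    stimulus = 0.5 * (1.0 + math.sin(2 * math.pi * t / 100))  # sine drive in [0, 1]
    potential = potential * LEAK + 0.1 * stimulus              # leaky integration
    if potential >= threshold:
        spikes += 1
        potential = 0.0                # reset after the action potential
        threshold += FATIGUE_STEP      # a fatigued neuron is harder to fire
    threshold = THRESHOLD + (threshold - THRESHOLD) * RECOVERY  # slow recovery

print(f"{spikes} spikes over {TICKS} ticks")
```

The fatigue term is what makes routing dynamic: a recently active walker becomes temporarily more expensive, nudging subsequent tokens toward neighboring routes.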

These files are used to study spiking behavior, routing stability, and adaptive quantization (SAAQ) in SNN-converted MoE models. The data feeds SymbolicRegression.jl to discover new equations for improved SNN quantization and ultimately trains the pure native **Spikenaut** SNN.

- **first-day/** — Early experimental runs (optional)
- **SAAQ 3.0/** — Future runs with new algorithm versions
- **experiments/** — Additional test configurations and variants
- **results/**
  - **plots/** — Visualization of SNN routing paths and firing density
  - **raw_telemetry/** — Original tick-by-tick log files

This directory contains the foundational, bare-metal hardware telemetry that inspired the Spikenaut SAAQ thermal equations.

- [Surrogate_Viz.jl](https://github.com/Spikenaut/Surrogate_Viz.jl) — Symbolic regression and visualization

---
license: gpl-3.0
task_categories:
- time-series-forecasting
tags:
- neuromorphic
- snn
- liquid-state-machines
- gaming
- hardware-telemetry
- gpu
pretty_name: Metis SMoE Latent Telemetry (Gaming)
---

# Metis SMoE Latent Telemetry

## Neuromorphic Hardware Telemetry from Demanding Gaming Workloads

This dataset provides high-fidelity, high-frequency (5 ms interval) hardware telemetry captured from extreme PC gaming workloads. It is optimized to simulate the biological responses of a nervous system to intense stimulus (excitatory input, action potentials/firing rates, and inhibitory responses).

### Context

The telemetry data was recorded using a custom Rust-based data collector via the NVIDIA Management Library (NVML) on a Fedora 43 Linux system. Workloads represent highly transient rendering applications, including:

- **Resident Evil 4 (Remake)** (with rendering complexities)
- **Cyberpunk 2077** (Path Tracing, DLSS 4.0)

This system provides the rich, high-frequency time-series data required to train **Spiking Neural Networks (SNNs)** and **Liquid State Machines (LSMs)**.

### Neuromorphic Mapping (SNN Utility)

This data behaves as "sensorimotor" stimulus for neural networks:

- **Excitatory Inputs (Stimulus):** High surges in `pcie_rx_kbps` indicate asset floods (e.g., BVH structure updates for path tracing), mimicking sensory signals entering the system.
- **Action Potentials (Firing Rates):** `encoder_util_perc`, `decoder_util_perc`, and overall `power_usage_mw` transients represent internal activity and network firing rates.
- **Inhibitory Inputs (Refractory/Limits):** Non-zero `throttle_reasons_bitmask` values and thermal limits act as inhibitory governors, dynamically suppressing system activity.
- **State/Momentum:** Slow-moving environmental data such as temperatures (`cpu_tctl_c`, `temperature_c`) and memory capacity.
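As a toy illustration of this mapping, telemetry deltas can be thresholded into discrete excitatory/inhibitory events. This is a hedged sketch: `encode_row` and the surge threshold are inventions for illustration, though the field names follow the dataset schema:

```python
# Hypothetical encoder: map raw telemetry rows to excitatory/inhibitory events.
def encode_row(row: dict, prev_rx: int, surge_threshold_kbps: int = 500_000) -> list:
    events = []
    # Excitatory: a sharp PCIe ingest surge (e.g., an asset/BVH flood)
    if row["pcie_rx_kbps"] - prev_rx > surge_threshold_kbps:
        events.append("excitatory")
    # Inhibitory: any non-zero throttle reason suppresses activity
    if row["throttle_reasons_bitmask"] != 0:
        events.append("inhibitory")
    return events

rows = [
    {"pcie_rx_kbps": 10_000,  "throttle_reasons_bitmask": 0},
    {"pcie_rx_kbps": 900_000, "throttle_reasons_bitmask": 0},     # ingest surge
    {"pcie_rx_kbps": 905_000, "throttle_reasons_bitmask": 0x20},  # throttling active
]
prev = rows[0]["pcie_rx_kbps"]
for row in rows[1:]:
    print(encode_row(row, prev))  # ['excitatory'] then ['inhibitory']
    prev = row["pcie_rx_kbps"]
```

A real SNN frontend would of course use graded currents rather than binary events, but the excitatory/inhibitory split above is the core of the mapping.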

### Data Schema

The data is provided natively as Parquet files partitioned into train batches (`system_telemetry_v1_batch_*.parquet`).

| Feature | Type | Description |
| :--- | :--- | :--- |
| `timestamp_ms` | `Int64` | UNIX timestamp in milliseconds (5 ms interval captures). |
| `power_usage_mw` | `UInt32` | Total GPU power usage in milliwatts. |
| `temperature_c` | `Float32` | GPU core temperature in Celsius. |
| `pcie_rx_kbps` | `UInt32` | Incoming PCIe throughput in kilobytes per second (excitatory). |
| `pcie_tx_kbps` | `UInt32` | Outgoing PCIe throughput in kilobytes per second. |
| `encoder_util_perc` | `Float32` | NVIDIA Encoder (NVENC) utilization percentage. |
| `decoder_util_perc` | `Float32` | NVIDIA Decoder (NVDEC) utilization percentage. |
| `mangohud_active` | `Boolean` | Whether MangoHud overlay telemetry was active during the snapshot. |
| `cpu_tctl_c` | `Float32` | Primary CPU package temperature (Tctl). |
| `cpu_ccd1_c` | `Float32` | Temperature of CPU Core Complex Die 1. |
| `cpu_ccd2_c` | `Float32` | Temperature of CPU Core Complex Die 2. |
| `throttle_reasons_bitmask` | `UInt64` | Bitmask of hardware throttling events (power, thermal, sync); acts as an inhibitory signal. |
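The inhibitory bitmask can be unpacked into named reasons. The bit values below follow NVML's `nvmlClocksThrottleReasons` constants as I understand them; verify them against your `nvml.h` before relying on this sketch:

```python
# Decode the inhibitory bitmask into named throttle reasons (assumed NVML bits).
THROTTLE_REASONS = {
    0x01: "gpu_idle",
    0x02: "applications_clocks_setting",
    0x04: "sw_power_cap",
    0x08: "hw_slowdown",
    0x10: "sync_boost",
    0x20: "sw_thermal_slowdown",
    0x40: "hw_thermal_slowdown",
    0x80: "hw_power_brake_slowdown",
}

def decode_throttle(bitmask: int) -> list:
    """Return the throttle reasons whose bits are set in the bitmask."""
    return [name for bit, name in THROTTLE_REASONS.items() if bitmask & bit]

print(decode_throttle(0))     # []
print(decode_throttle(0x24))  # ['sw_power_cap', 'sw_thermal_slowdown']
```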

### Usage with Hugging Face `datasets`

You can integrate this telemetry into your neuromorphic modeling workflows using the Hugging Face `datasets` library.
```python
from datasets import load_dataset

# Load the entire telemetry dataset as a single train split
dataset = load_dataset("rmems/Metis-SMoE-Latent-Telemetry", split="train")

print(dataset.features)
print(dataset[0])
```

### Export to Canonical CSV (For Corinth Canal Replay)

If you are using the Spikenaut `corinth-canal` framework, you can export a canonical CSV by grabbing a single dataset file:

```bash
cargo run --bin export_csv data/train/system_telemetry_v1_batch_1.parquet canonical.csv
```

### License

This dataset is distributed under the GPL-3.0 License.
|