# First Day Testing Real Weights: Metis Semantic Routing
This directory serves as the chronological forensic log of the first day the network was injected with real Large Language Model embeddings on the Ship of Theseus workstation.
Moving away from synthetic baseline tests, this phase integrated the **`OLMoE-1B-7B-0125-Instruct-GGUF`** model. The objective was to observe how biologically inspired SNN fatigue mechanics handle the dense semantic pressure of a Mixture-of-Experts (MoE) architecture, using custom Rust and CUDA ML integrations.

To verify that the L2 Normalization was actually working, the network had to be shown different kinds of data. Had every test produced identical routing, it would have indicated that the normalization created a "lazy resting state." Instead, each test showed a distinct routing pattern, proving the network was dynamically adapting.
## How it mimics a biological system
- **L2 Normalization**: Prevents any single neuron from becoming too dominant, mimicking how biological brains distribute energy across networks.
- **Spike-based routing**: Walkers (spikes) physically explore the network, finding the path of least resistance, just like electrical impulses in a biological brain.
- **Fatigue mechanics**: Neurons that fire too much become less responsive, preventing energy overload and allowing the network to adapt.
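The first and third mechanics above can be sketched in a few lines of Rust. This is a minimal illustration only, not the project's actual `corinth-canal` kernels; the function names and the fatigue factor are invented for the example:

```rust
/// L2-normalize a voltage vector so no single neuron can dominate:
/// after normalization the vector's total "energy" (L2 norm) is 1.
fn l2_normalize(voltages: &mut [f32]) {
    let norm = voltages.iter().map(|v| v * v).sum::<f32>().sqrt();
    if norm > 0.0 {
        for v in voltages.iter_mut() {
            *v /= norm;
        }
    }
}

/// Fatigue: neurons that just fired become less responsive,
/// scaled down by a factor between 0 and 1.
fn apply_fatigue(voltages: &mut [f32], fired: &[bool], fatigue_factor: f32) {
    for (v, &f) in voltages.iter_mut().zip(fired) {
        if f {
            *v *= fatigue_factor;
        }
    }
}

fn main() {
    let mut v = vec![3.0, 4.0, 0.0];
    l2_normalize(&mut v);
    println!("{:?}", v); // [0.6, 0.8, 0.0]
    apply_fatigue(&mut v, &[true, false, false], 0.5);
    println!("{:?}", v); // [0.3, 0.8, 0.0]
}
```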
## SNN Physics: The "Walker"
In these datasets, the Y-axis represents the **Walker**. The walker acts as a pulse of electrical energy (a spike). Because the system mimics biology, energy cannot flow everywhere at once; the network must test different paths. The walker's goal is to physically explore the network and find the routing pathway with the *least electrical resistance*.
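The walker's behavior can be sketched as a greedy descent over a resistance landscape. This is a toy illustration under assumed mechanics, not the actual SNN physics; the resistance values, neighborhood (a 1-D chain), and step count are invented:

```rust
/// A walker (spike) repeatedly moves to whichever adjacent neuron,
/// including its current one, offers the least electrical resistance.
/// It naturally settles in a low-resistance "valley" (an attractor).
fn walk(resistance: &[f32], start: usize, steps: usize) -> usize {
    let mut pos = start;
    for _ in 0..steps {
        let candidates = [
            pos.saturating_sub(1),
            pos,
            (pos + 1).min(resistance.len() - 1),
        ];
        pos = *candidates
            .iter()
            .min_by(|a, b| resistance[**a].partial_cmp(&resistance[**b]).unwrap())
            .unwrap();
    }
    pos
}

fn main() {
    // A valley of low resistance around index 3 attracts the walker.
    let resistance = [0.9, 0.7, 0.4, 0.1, 0.5, 0.8];
    println!("walker settled at {}", walk(&resistance, 5, 10)); // settles at 3
}
```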
## The Progression of Discovery
### 1. `first-test-failed/` : The Routing Collapse (Blackhole)
* **The Issue:** A misconfiguration in the CUDA kernels, which interpreted the model's F16 embedding data as F32.
* **The Result:** The voltage energy was so high that a single walker (around ID 620) got permanently blasted with energy. It became a "blackhole," causing a routing collapse where the energy could not disperse and explore the network.
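To illustrate the bug class (a sketch of the general failure mode, not the actual kernel code): when F16 bytes are reinterpreted as F32, the decoded magnitudes are simply wrong — they can come out arbitrarily too large or too small, which destabilizes any voltage accumulation built on top of them. The byte values below are a hand-picked example:

```rust
fn main() {
    // Four raw bytes encoding two little-endian F16 values of 1.0
    // each (bit pattern 0x3C00: sign 0, exponent 15 with bias 15,
    // mantissa 0). Stable Rust has no f16 type, so decoded by hand.
    let raw: [u8; 4] = [0x00, 0x3C, 0x00, 0x3C];

    // A kernel expecting F32 reads the same four bytes as ONE float:
    let misread = f32::from_le_bytes(raw);
    println!("misread as F32: {misread}"); // ~0.00783, not 1.0
}
```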
### 2. `second-test/` : Attractor Discovery (L2 Normalization)
* **The Input:** `"Teaching OLMoE the language of SNN"`
* **The Fix:** L2 Normalization was applied to the voltage.
* **The Result:** The normalization allowed the walker to naturally explore the network and find the path of least resistance. The energy settled into a comfortable state, routing primarily through **Walker 2000**, with secondary echoes in Walkers 700 and 1450.
### 3. `third-test/` : Rust Syntax (The True Victory)
* **The Input:** `fn main () { println!(); }`
* **The Result:** The routing changed completely from the second test. This was the massive win: it proved that the L2 Normalization forced the hardware to dynamically adapt to the data. If the graph had looked the same as the English prompt, it would have indicated that the normalization created a "lazy resting state." Instead, the code syntax was physically routed to a completely different biological neighborhood.
### 4. `fourth-test/` : Math Logic Clustering
* **The Input:** `"The derivative of a constant is mathematically zero."`
* **The Result:** The network routed the mathematical logic into the exact same frequency band used by the Rust syntax. This proved **Semantic Attractor Clustering**—the SNN physically maps rigid, structured logic tasks (math and code) to the same adjacent physical pathways to conserve energy.
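One way to quantify the clustering claim above is cosine similarity between per-band routing histograms: structured inputs (code, math) should produce near-identical profiles, while prose should not. The spike counts below are invented for illustration, not data from these tests:

```rust
/// Cosine similarity between two routing histograms: 1.0 means
/// identical routing profiles, 0.0 means fully disjoint pathways.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb)
}

fn main() {
    // Hypothetical spike counts over four frequency bands.
    let rust_syntax = [0.0, 9.0, 1.0, 0.0];
    let math_logic = [0.0, 8.0, 2.0, 0.0];
    let english = [7.0, 0.0, 1.0, 3.0];
    println!("code vs math:    {:.2}", cosine_similarity(&rust_syntax, &math_logic));
    println!("code vs english: {:.2}", cosine_similarity(&rust_syntax, &english));
}
```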
## Biggest victory?
* A **new custom dataset** for future SymbolicRegression.jl experiments, aimed at discovering the underlying equations of SAAQ (Semantic Attractor Architecture Quantization).
## Environment & Architecture
* **Model:** `OLMoE-1B-7B-0125-Instruct-GGUF`
* **Compute:** ASUS ProArt GeForce RTX 5080 (16GB VRAM) | AMD Ryzen 9 9950X
* **Workstation:** Ship of Theseus (Fedora 43)
* **Biological Implementation:** Custom Rust/CUDA `corinth-canal` engine for SNN quantization research
* **Visualizer and Mathematical Analysis:** Surrogate_Viz.jl