Upload liqmamba/__init__.py
liqmamba/__init__.py +27 -0
ADDED
@@ -0,0 +1,27 @@
+"""
+LiqMamba: Liquid-Mamba Image Generator
+
+A novel lightweight architecture combining:
+- Liquid Time-Constant (CfC) networks for adaptive continuous-time gating
+- Mamba-2 State Space Duality (SSD) for linear-time sequence processing
+- Flow Matching for stable image generation
+- Multi-directional 2D scans for image understanding
+- ConFIG gradient stabilization (from PINN research)
+
+Key innovations:
+1. CfC-Gated Mamba blocks: Replace static nonlinearities with learnable
+   continuous-time dynamics that adapt computation depth per-token
+2. Liquid State Modulation: The SSM state transition is modulated by CfC
+   dynamics, giving the model ODE-inspired expressivity
+3. Physics-informed training: ConFIG gradient composition prevents
+   competing loss terms from destabilizing training
+4. Extremely lightweight: ~25M params, trainable on Colab free tier
+
+Paper References:
+- CfC: "Closed-form Continuous-time Neural Networks" (Hasani et al., 2021)
+- Mamba-2: "Transformers are SSMs" (Dao & Gu, 2024)
+- DiM: "Diffusion Mamba" (Teng et al., 2024)
+- ConFIG: "Towards Conflict-free Training of PINNs" (Liu et al., 2024)
+"""
+
+__version__ = "0.1.0"
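
To make the docstring's first two innovations concrete, here is a minimal PyTorch sketch of a CfC gate modulating a diagonal SSM recurrence. This is an illustration only, not code from this commit: `CfCGate` and `LiquidSSMStep` are hypothetical names, and the actual package presumably fuses the gating into Mamba-2's SSD scan rather than stepping token by token as below.

```python
import torch
import torch.nn as nn


class CfCGate(nn.Module):
    # Toy closed-form continuous-time (CfC) cell in the style of
    # Hasani et al., 2021 (NOT this repo's implementation): a sigmoid
    # gate carrying a learned time constant interpolates two learned
    # branches -- the closed-form surrogate for integrating a
    # liquid-time-constant ODE.
    def __init__(self, dim: int):
        super().__init__()
        self.f = nn.Linear(dim, dim)              # sets the decay rate
        self.g = nn.Linear(dim, dim)              # short-horizon branch
        self.h = nn.Linear(dim, dim)              # long-horizon branch
        self.tau = nn.Parameter(torch.ones(dim))  # per-channel timescale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = torch.sigmoid(-self.f(x) * self.tau)
        return gate * torch.tanh(self.g(x)) + (1.0 - gate) * torch.tanh(self.h(x))


class LiquidSSMStep(nn.Module):
    # One recurrence step of a diagonal SSM whose decay is rescaled by
    # the CfC output -- a toy version of "liquid state modulation":
    # how much history the state keeps becomes input-dependent.
    def __init__(self, dim: int, state_dim: int):
        super().__init__()
        self.log_a = nn.Parameter(torch.zeros(state_dim))  # base decay rate
        self.B = nn.Linear(dim, state_dim, bias=False)     # input projection
        self.C = nn.Linear(state_dim, dim, bias=False)     # output projection
        self.cfc = CfCGate(dim)

    def forward(self, x_t: torch.Tensor, state: torch.Tensor):
        # x_t: (batch, dim), state: (batch, state_dim)
        mod = torch.sigmoid(self.cfc(x_t)).mean(dim=-1, keepdim=True)  # (batch, 1)
        decay = torch.exp(-torch.exp(self.log_a) * mod)                # in (0, 1)
        state = decay * state + self.B(x_t)
        return self.C(state), state
```

The point the docstring is making: because `decay` depends on the input through the CfC gate, how quickly the state forgets is decided per token, which is the "adapt computation depth per-token" claim in innovation 1.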
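Innovation 3 refers to ConFIG-style gradient composition (Liu et al., 2024). A rough sketch of that operator, assuming one flattened gradient vector per loss term; `config_update` is a hypothetical helper, not this package's API:

```python
import torch


def config_update(grads: list[torch.Tensor]) -> torch.Tensor:
    # Conflict-free direction in the spirit of ConFIG: solve for a
    # vector with equal cosine similarity to every per-loss gradient
    # (least squares via the pseudo-inverse), then scale it by the
    # summed projections of the raw gradients onto that direction, so
    # no loss term's update is cancelled by a conflicting one.
    G = torch.stack(grads)                                # (losses, params)
    U = G / G.norm(dim=1, keepdim=True).clamp_min(1e-12)  # unit gradients
    ones = torch.ones(G.shape[0], device=G.device, dtype=G.dtype)
    d = torch.linalg.pinv(U) @ ones                       # equal-alignment direction
    d = d / d.norm().clamp_min(1e-12)
    return (G @ d).sum() * d                              # composed update
```

In use, each entry of `grads` would come from flattening one loss term's gradients (e.g. `torch.cat([p.grad.flatten() for p in model.parameters()])` after a per-loss backward pass), and the returned vector would be scattered back into the parameters in place of the usual summed gradient.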