| model | embedding_dim | chunk_size | chunk_overlap | total_chunks | batch_size | build_date | build_gpu | version |
|---|---|---|---|---|---|---|---|---|
| nvidia/llama-nemotron-embed-1b-v2 | 2,048 | 1,500 | 200 | 571,651 | 256 | 2026-01-03T12:02:26.689027 | NVIDIA H100 80GB HBM3 | v10-nemotron |
# 🧬⚡ STXBP1-ARIA RAG Database v10.1 - NVIDIA Nemotron Embeddings
The most advanced RAG database for STXBP1 therapeutic research.
A pre-built ChromaDB vector database containing 571,816 indexed text chunks from ~17,000 curated PubMed Central (PMC) biomedical papers, plus 165 base editing analysis entries (https://huggingface.co/datasets/SkyWhal3/stxbp1-base-editing-sweep), embedded with NVIDIA's state-of-the-art Llama-Nemotron-Embed-1B-v2 model featuring 2048-dimensional embeddings.
⚡ This is the premium GPU-accelerated version: Nemotron embeddings deliver maximum semantic precision for therapeutic queries, but require a GPU with 2-4 GB of VRAM. For a lightweight CPU-friendly alternative, see STXBP1-RAG-Database (BGE).
## 🧬 v10.1 Update: Base Editing Parameter Sweep Data

NEW! This version includes 165 curated entries from our exhaustive base editing compatibility analysis:
| Metric | Value |
|---|---|
| Total Analyses | 654,595,200 |
| Variants Analyzed | 171 pathogenic STXBP1 variants |
| Parameters Swept | PAM position, edit window, enzyme type |
| Mouse Compatibility | Wild-type mouse modeling feasibility |
### What's Included
Each variant entry contains:
- ✅ Compatibility Score (0-100) for adenine/cytosine base editing
- ✅ Optimal Parameters (SpCas9/SpCas9-NG, PAM position, edit window)
- ✅ Mouse Model Compatibility: can a WT mouse model this human variant?
- ✅ Clinical Context: ClinVar classification, mutation type
### Key Findings Embedded
| Finding | Details |
|---|---|
| Perfect Score Variants | 5 variants achieve 100/100 compatibility |
| K196X (c.586A>T) | Score: 95.9/100, WT mouse compatible ✅ |
| ABE-Curable | 68 variants amenable to adenine base editing |
| CBE-Curable | 47 variants amenable to cytosine base editing |
🔗 Full sweep data available at: SkyWhal3/stxbp1-base-editing-sweep
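To illustrate how these entries might be consumed downstream, here is a minimal sketch of a feasibility check over one variant record. The field names (`compatibility_score`, `editor`, etc.) are assumptions for illustration; inspect the actual sweep dataset for its real schema before relying on them.

```python
# Hypothetical shape of one base editing entry (field names are assumed,
# not taken from the actual dataset schema).
entry = {
    "variant": "K196X (c.586A>T)",
    "compatibility_score": 95.9,   # 0-100 scale described above
    "editor": "ABE",               # adenine base editor
    "enzyme": "SpCas9-NG",
    "wt_mouse_compatible": True,
    "clinvar": "Pathogenic",
}

def is_abe_curable(entry, threshold=80.0):
    """Illustrative feasibility check: ABE entries above a score cutoff."""
    return entry["editor"] == "ABE" and entry["compatibility_score"] >= threshold

print(is_abe_curable(entry))  # True for the K196X example above
```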
## 🚀 Why Nemotron?

NVIDIA's Nemotron embedding model ranks #2 on MTEB retrieval benchmarks; it was distilled from their 8B flagship into an efficient 1B-parameter model.
| Feature | BGE (v9) | Nemotron (v10) |
|---|---|---|
| Embedding Dims | 768 | 2048 ⬆️ 2.7x |
| Model Params | 110M | 1B ⬆️ 9x |
| MTEB Retrieval | ~63 | ~69 ⬆️ +6 pts |
| Semantic Precision | Good | Excellent |
| Hardware | CPU OK | GPU recommended |
| Optional Reranker | ❌ | ✅ Available |
### What 2048 Dimensions Means

Semantic space visualization:

```
 768 dims (BGE)            2048 dims (Nemotron)
       *                            *
      /|\                         / | \
     / | \                       /  |  \
    *  *  *                     *   *   *
 Good separation           Rich semantic space,
                           fine-grained distinctions
```
Real-world impact:

- "haploinsufficiency" vs "dominant negative" → better separated
- "4-PBA chaperone" vs "AAV gene therapy" → distinct clusters
- "K196X nonsense" vs "R406H missense" → clear differentiation
- "ABE compatible" vs "prime editing required" → therapeutic routing
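Under the hood, these distinctions come down to vector similarity: retrieval ranks chunks by how close their embeddings sit to the query embedding. A minimal stdlib sketch of cosine similarity, the usual metric for this (shown on toy 3-dim vectors; real Nemotron embeddings have 2048 components):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Identical directions score 1.0; orthogonal directions score 0.0.
print(cosine_similarity([1.0, 0.0, 0.0], [1.0, 0.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # 0.0
```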
## 📊 Dataset Statistics
| Metric | Value |
|---|---|
| Total Chunks | 571,816 |
| Source Papers | ~17,000 PMC articles |
| Curated Entries | 24 expert-written |
| Base Editing Entries | 165 (v10.1) |
| Database Size | ~11.5 GB |
| Embedding Model | nvidia/llama-nemotron-embed-1b-v2 |
| Embedding Dimensions | 2048 |
| Model Parameters | 1B |
| Chunk Size | ~1,500 chars with 200-char overlap |
| Index Type | ChromaDB with HNSW |
| Build Hardware | NVIDIA H100 80GB |
| Last Updated | January 6, 2026 |
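The chunking scheme in the table above (~1,500-char windows with a 200-char overlap) can be sketched as a simple sliding window. This is an illustrative reconstruction, not the actual build script:

```python
def chunk_text(text, chunk_size=1500, overlap=200):
    """Sliding-window chunking with overlap, mirroring the table above."""
    step = chunk_size - overlap
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
        start += step
    return chunks

chunks = chunk_text("x" * 3000)
print(len(chunks))                          # 3 chunks for a 3,000-char document
print(chunks[0][-200:] == chunks[1][:200])  # True: consecutive chunks share 200 chars
```

The overlap ensures a sentence split at a chunk boundary still appears whole in at least one chunk.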
## 🎯 Purpose
This database powers STXBP1-ARIA MAX, the premium therapeutic discovery system, enabling:
- Maximum retrieval precision for complex therapeutic queries
- Fine-grained semantic distinctions between mutation types and mechanisms
- Base editing feasibility lookups for specific variants
- Mouse model compatibility guidance for preclinical research
- Optional reranking with Nemotron cross-encoder for top-k refinement
- Literature-grounded responses with PMC citations
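The optional reranking step listed above follows the usual two-stage pattern: retrieve a candidate set by embedding similarity, then rescore each (query, document) pair and keep the best. A generic sketch with a stand-in scorer; the actual Nemotron cross-encoder model and its API are not shown here:

```python
def rerank(query, docs, score_fn, top_k=5):
    """Generic top-k refinement: rescore candidates and keep the best."""
    return sorted(docs, key=lambda d: score_fn(query, d), reverse=True)[:top_k]

# Stand-in scorer using token overlap. A real cross-encoder (e.g. the
# Nemotron reranker mentioned above) would replace this function.
def overlap_score(query, doc):
    return len(set(query.lower().split()) & set(doc.lower().split()))

docs = [
    "AAV gene therapy dosing",
    "ABE base editing of K196X",
    "K196X nonsense variant rescue",
]
print(rerank("base editing options for K196X", docs, overlap_score, top_k=2))
```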
## 🔗 Related Resources
| Resource | Link |
|---|---|
| ARIA MAX (Live Demo) | HuggingFace Space |
| Base Editing Sweep | Dataset |
| Evaluation Harness | STXBP1-Eval |
| CPU Version (BGE) | STXBP1-RAG-Database |
## 🚀 Quick Start

### Load in Python

```python
import chromadb
from chromadb.config import Settings

# Connect to the on-disk database
client = chromadb.PersistentClient(
    path="./STXBP1-RAG-Nemotron",
    settings=Settings(anonymized_telemetry=False)
)

collection = client.get_collection("stxbp1_papers")
print(f"Loaded {collection.count():,} chunks")
```
### Query with Nemotron Embeddings

```python
from sentence_transformers import SentenceTransformer

# Load the embedding model (GPU recommended for best performance)
embedder = SentenceTransformer(
    "nvidia/llama-nemotron-embed-1b-v2",
    trust_remote_code=True
)

# Embed the query
query = "What base editing options exist for K196X?"
query_embedding = embedder.encode(query).tolist()

# Search the collection
results = collection.query(
    query_embeddings=[query_embedding],
    n_results=10
)

for doc, meta in zip(results["documents"][0], results["metadatas"][0]):
    print(f"[{meta.get('pmcid', 'N/A')}] {doc[:200]}...")
```
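Each hit carries its metadata, so results can also be narrowed by field, either server-side via ChromaDB's `where=` argument to `collection.query()`, or client-side as sketched below on a mocked result. The metadata field name `"source"` and its values are assumptions for illustration; inspect your returned `metadatas` for the real keys.

```python
def filter_results(results, field, value):
    """Keep only (doc, metadata) pairs whose metadata matches field == value.
    Expects the dict shape returned by ChromaDB's collection.query()."""
    docs = results["documents"][0]
    metas = results["metadatas"][0]
    return [(d, m) for d, m in zip(docs, metas) if m.get(field) == value]

# Mocked query result standing in for a real collection.query() return value.
fake = {
    "documents": [["chunk A", "chunk B"]],
    "metadatas": [[{"source": "pmc"}, {"source": "base_editing"}]],
}
print(filter_results(fake, "source", "base_editing"))  # [('chunk B', {'source': 'base_editing'})]
```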
## 📁 Files

| File | Size | Description |
|---|---|---|
| `chroma.sqlite3` | ~2 GB | Metadata + document store |
| `{uuid}/data_level0.bin` | ~9 GB | HNSW vector index |
| `{uuid}/header.bin` | <1 KB | Index metadata |
| `{uuid}/length.bin` | <1 KB | Vector dimensions |
## 📄 License
Apache 2.0
## 🙏 Acknowledgments
Built for the STXBP1 research community. Special thanks to:
- STXBP1 Foundation
- NVIDIA for Nemotron embeddings
- The families and researchers working toward treatments
Built with ❤️ for rare disease research
Last Updated: January 6, 2026