Merge pull request #4 from Shouvik599/feature-multi-turn-converse

Browse files:
- README.md            +102 −18
- app.py               +127 −42
- frontend/index.html  +302 −437
- rag_chain.py         +178 −198

README.md CHANGED
|
@@ -10,7 +10,7 @@ pinned: false

 # 🕉️ Sacred Texts RAG – Multi-Religion Knowledge Base

-A Retrieval-Augmented Generation (RAG) application that answers spiritual queries using Bhagavad Gita, Quran, Bible and

 ---
@@ -22,10 +22,10 @@ sacred-texts-rag/
 ├── requirements.txt
 ├── .env.example
 ├── ingest.py        # Step 1: Load PDFs → chunk → embed → store
-├── rag_chain.py     # Core RAG chain logic
 ├── app.py           # FastAPI backend server
 ├── frontend/
-│   └── index.html   # Chat UI (
 ```

 ---
@@ -49,7 +49,7 @@ Place your PDF files in a `books/` folder:
 books/
 ├── bhagavad_gita.pdf
 ├── quran.pdf
-
 └── guru_granth_sahib.pdf
 ```
@@ -67,20 +67,24 @@ This will:
 ```bash
 python app.py
 ```
-Server runs at: `http://localhost:

 ### 6. Open the Frontend
-

 ---

 ## 🔑 Environment Variables

-| Variable | Description |
-|---|---|
-| `NVIDIA_API_KEY` | Your NVIDIA API key |
-| `CHROMA_DB_PATH` | Path to ChromaDB storage
-| `

 ---
@@ -90,30 +94,110 @@ Open `frontend/index.html` in your browser – no server needed for the UI.
 User Query
     │
     ▼
-[
     │
     ▼
-[
-    │ (retrieves top-K chunks from Gita, Quran, Bible, and the Guru Granth Sahib)
     │
     ▼
-[
     │
     ▼
 [Llama-3.3-70b-instruct] ◄── Answer grounded ONLY in retrieved texts
     │
     ▼
-
 ```

 ---

 ## 📝 Notes

 - The LLM is instructed **never** to answer from outside the provided texts
-- Each response includes **source citations** (
 - Responses synthesize wisdom **across all books** when relevant

 ## 🎬 Demo

-App Link

 # 🕉️ Sacred Texts RAG – Multi-Religion Knowledge Base

+A Retrieval-Augmented Generation (RAG) application that answers spiritual queries using the Bhagavad Gita, Quran, Bible, and Guru Granth Sahib as the sole knowledge sources. Now with **multi-turn conversation memory**: ask follow-up questions naturally, just like a real dialogue.

 ---

 ├── requirements.txt
 ├── .env.example
 ├── ingest.py        # Step 1: Load PDFs → chunk → embed → store
+├── rag_chain.py     # Core RAG chain logic (with session memory)
 ├── app.py           # FastAPI backend server
 ├── frontend/
+│   └── index.html   # Chat UI (served by FastAPI)
 ```

 ---

 books/
 ├── bhagavad_gita.pdf
 ├── quran.pdf
+├── bible.pdf
 └── guru_granth_sahib.pdf
 ```

 ```bash
 python app.py
 ```
+Server runs at: `http://localhost:7860`

 ### 6. Open the Frontend
+Navigate to `http://localhost:7860` in your browser; the FastAPI server serves the UI directly.

 ---

 ## 🔑 Environment Variables

+| Variable | Description | Default |
+|---|---|---|
+| `NVIDIA_API_KEY` | Your NVIDIA API key | – |
+| `CHROMA_DB_PATH` | Path to ChromaDB storage | `./chroma_db` |
+| `COLLECTION_NAME` | ChromaDB collection name | `sacred_texts` |
+| `CHUNKS_PER_BOOK` | Chunks retrieved per book per query | `3` |
+| `MAX_HISTORY_TURNS` | Max conversation turns kept in memory per session | `6` |
+| `HOST` | Server bind host | `0.0.0.0` |
+| `PORT` | Server port | `7860` |
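A `MAX_HISTORY_TURNS`-style cap is naturally expressed with a bounded deque; a minimal sketch, with hypothetical helper names rather than the repo's actual `rag_chain.py` internals:

```python
import os
from collections import defaultdict, deque

# Assumption: history is stored as (role, content) tuples per session.
MAX_HISTORY_TURNS = int(os.getenv("MAX_HISTORY_TURNS", "6"))

# Each human+AI turn contributes two messages, so cap the deque at 2 * turns;
# the deque silently drops the oldest entries once the cap is reached.
_sessions: dict[str, deque] = defaultdict(lambda: deque(maxlen=2 * MAX_HISTORY_TURNS))

def add_turn(session_id: str, question: str, answer: str) -> None:
    """Record one human+AI exchange; the oldest messages fall off automatically."""
    _sessions[session_id].append(("human", question))
    _sessions[session_id].append(("ai", answer))

def get_history(session_id: str) -> list[tuple[str, str]]:
    """Return the retained window of messages for this session."""
    return list(_sessions[session_id])
```

With the default of 6 turns, recording an 8-turn conversation leaves only the last 6 human+AI pairs in memory.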

 ---

 User Query
     │
     ▼
+[Session Memory] ◄── Injects prior conversation turns into LLM context
     │
     ▼
+[Query Augmentation] ◄── Short follow-ups are enriched with previous question
     │
     ▼
+[Hybrid Retrieval: BM25 + Vector Search] ◄── Per-book guaranteed slots
+    │
+    ▼
+[NVIDIA Reranker] ◄── llama-3.2-nv-rerankqa-1b-v2 re-scores pooled candidates
+    │
+    ▼
+[Semantic Cache Check] ◄── Skip LLM if a similar question was answered before
+    │
+    ▼
+[Prompt with Context + History]
     │
     ▼
 [Llama-3.3-70b-instruct] ◄── Answer grounded ONLY in retrieved texts
     │
     ▼
+Streamed response with source citations (book + chapter/verse)
+```
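The "per-book guaranteed slots" step in the pipeline above can be sketched as follows; `search` is a stand-in for the real hybrid BM25 + vector retriever, and all names here are illustrative only:

```python
# Hypothetical sketch: give each book a fixed number of retrieval slots,
# then pool everything for the reranker to re-score.
BOOKS = ["Bhagavad Gita", "Quran", "Bible", "Guru Granth Sahib"]
CHUNKS_PER_BOOK = 3

def retrieve_with_slots(query, search):
    pooled = []
    for book in BOOKS:
        # Each book gets CHUNKS_PER_BOOK guaranteed slots, so no scripture
        # is crowded out of the context by higher-scoring neighbours.
        pooled.extend(search(book, query, k=CHUNKS_PER_BOOK))
    return pooled  # the reranker re-scores this pooled candidate list

# Example with a stub retriever:
def stub_search(book, query, k):
    return [f"{book} chunk {i}" for i in range(k)]

candidates = retrieve_with_slots("What is compassion?", stub_search)
```

The design point is that the slots are allocated before pooling, so the reranker always sees candidates from every book.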
+
+---
+
+## 💬 Multi-Turn Conversation
+
+The app maintains per-session conversation history so you can ask natural follow-up questions:
+
+```
+You: "What do the scriptures say about forgiveness?"
+AI:  [Answer citing Gita, Quran, Bible, Guru Granth Sahib]
+
+You: "Elaborate on the second point"           ← follow-up, no context needed
+AI:  [Continues from previous answer]
+
+You: "What does the Bible say specifically?"   ← drill-down
+AI:  [Focuses on Bible passages from the thread]
+```
+
+**How sessions work:**
+- A session ID is created automatically on your first question and persisted in the browser's `localStorage`
+- The server keeps the last `MAX_HISTORY_TURNS` (default: 6) human+AI pairs in memory
+- Click **↺ New Conversation** in the header to clear history and start fresh
+- Sessions are scoped to the server process – they reset on server restart
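A rough Python analogue of the session bookkeeping above, with a plain dict standing in for the browser's `localStorage` (helper names are hypothetical, not the frontend's actual code):

```python
import uuid

# Stand-in for the browser's localStorage key/value store.
local_storage: dict[str, str] = {}

def get_session_id() -> str:
    """Reuse the stored session ID, or mint one on the first question."""
    if "rag_session_id" not in local_storage:
        local_storage["rag_session_id"] = str(uuid.uuid4())
    return local_storage["rag_session_id"]

def new_conversation() -> str:
    """'New Conversation': drop the old ID so the server starts fresh history."""
    local_storage.pop("rag_session_id", None)
    return get_session_id()
```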
+
+---
+
+## 🔌 API Endpoints
+
+| Method | Endpoint | Description |
+|---|---|---|
+| `POST` | `/ask` | Ask a question; streams NDJSON response |
+| `POST` | `/clear` | Clear conversation history for a session |
+| `GET` | `/history` | Inspect conversation history for a session |
+| `GET` | `/books` | List all books indexed in the knowledge base |
+| `GET` | `/health` | Health check |
+| `GET` | `/` | Serves the frontend UI |
+| `GET` | `/docs` | Swagger UI |
+
+### `/ask` Request Body
+```json
+{
+  "question": "What do the scriptures say about compassion?",
+  "session_id": "optional-uuid-string"
+}
+```
+
+### `/ask` Response (streamed NDJSON)
+```json
+{"type": "token", "data": "The Bhagavad Gita teaches..."}
+{"type": "token", "data": " compassion as..."}
+{"type": "sources", "data": [{"book": "Bhagavad Gita 2:47", "page": "2:47", "snippet": "..."}]}
 ```
+Cache hits return a single `{"type": "cache", "data": {"answer": "...", "sources": [...]}}` line.
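A minimal sketch of a client consuming this NDJSON stream; the parsing logic is fed from a list here in place of a live HTTP response, and the event shapes follow the examples above:

```python
import json

def consume_ndjson(lines):
    """Assemble streamed tokens into the final answer and collect sources."""
    answer_parts, sources = [], []
    for raw in lines:
        if not raw.strip():
            continue
        event = json.loads(raw)
        if event["type"] == "token":
            answer_parts.append(event["data"])
        elif event["type"] == "sources":
            sources = event["data"]
        elif event["type"] == "cache":
            # Cache hits arrive as one complete payload instead of tokens.
            return event["data"]["answer"], event["data"]["sources"]
    return "".join(answer_parts), sources

# Against a real server this loop would read `response.iter_lines()` instead.
stream = [
    '{"type": "token", "data": "The Bhagavad Gita teaches..."}',
    '{"type": "token", "data": " compassion as..."}',
    '{"type": "sources", "data": [{"book": "Bhagavad Gita 2:47"}]}',
]
answer, sources = consume_ndjson(stream)
```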

 ---

 ## 📝 Notes

 - The LLM is instructed **never** to answer from outside the provided texts
+- Each response includes **source citations** (book + chapter/verse where available)
 - Responses synthesize wisdom **across all books** when relevant
+- The semantic cache skips the LLM for repeated or near-identical questions (cosine distance < 0.35)
+- Follow-up retrieval automatically augments vague short queries with the previous question for better semantic matching
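The cosine-distance check behind the semantic cache can be sketched like this (illustrative names; question embeddings are passed in as plain vectors rather than computed here):

```python
import math

# Hypothetical sketch of the cache check: compare the incoming question's
# embedding against cached question embeddings and reuse the stored answer
# when cosine distance falls under the 0.35 threshold noted above.
CACHE_THRESHOLD = 0.35

def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

def cache_lookup(query_vec, cache):
    """cache: list of (embedding, answer) pairs. Returns the answer on a near hit."""
    best = min(cache, key=lambda item: cosine_distance(query_vec, item[0]), default=None)
    if best and cosine_distance(query_vec, best[0]) < CACHE_THRESHOLD:
        return best[1]
    return None
```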
+
+---
+
+## 🗺️ Planned Features
+
+- Contextual chunk expansion (fetch ±1 surrounding chunks)
+- HyDE – Hypothetical Document Embedding for abstract queries
+- Answer faithfulness scoring (LLM-as-judge)
+- Query rewriting for vague inputs
+- Snippet preview on source hover
+- Query suggestions after each answer
+- Compare mode – side-by-side view across books
+- Hallucination guardrail
+- Out-of-scope detection
+- Rate limiting & API key hardening
+
+---

 ## 🎬 Demo

+App Link: https://shouvik99-lifeguide.hf.space/
app.py CHANGED

@@ -2,7 +2,9 @@
 app.py – FastAPI backend server for the Sacred Texts RAG application.

 Endpoints:
-    POST /ask    → Ask a question, get
     GET /health  → Health check
     GET /books   → List books currently in the knowledge base

@@ -11,13 +13,20 @@ Run with:
 """

 import os
-
 from fastapi.middleware.cors import CORSMiddleware
 from pydantic import BaseModel, Field
 from dotenv import load_dotenv
-from fastapi.responses import StreamingResponse, FileResponse
-from rag_chain import
-

 load_dotenv()

@@ -26,34 +35,54 @@ load_dotenv()
 app = FastAPI(
     title="Sacred Texts RAG API",
     description="Ask questions answered exclusively from Bhagavad Gita, Quran, Bible, and Guru Granth Sahib",
-    version="
 )

-# Allow requests from the local frontend (index.html opened as file://)
 app.add_middleware(
     CORSMiddleware,
-    allow_origins=["*"],
     allow_credentials=True,
     allow_methods=["*"],
     allow_headers=["*"],
 )

 # ─── Request / Response Models ────────────────────────────────────────────────

 class AskRequest(BaseModel):
     question: str = Field(..., min_length=3, max_length=1000,
                           example="What do the scriptures say about compassion?")
-
-class AskResponse(BaseModel):
-    question: str
-    answer: str
-    sources: list[Source]

 class HealthResponse(BaseModel):
     status: str

@@ -63,49 +92,67 @@ class BooksResponse(BaseModel):
     books: list[str]
     total_chunks: int

 # ─── Routes ───────────────────────────────────────────────────────────────────

 @app.get("/health", response_model=HealthResponse, tags=["System"])
 def health_check():
-    """Check that the API is running."""
     return {"status": "ok", "message": "Sacred Texts RAG is running 🕉️"}


 @app.get("/books", response_model=BooksResponse, tags=["Knowledge Base"])
 def list_books():
-    """List all books currently indexed in the knowledge base."""
     try:
-        embeddings
-        vector_store = get_vector_store(embeddings)
-        collection
-        results
-        metadatas
-
-        books = sorted(set(
-            m.get("book", "Unknown")
-            for m in metadatas
-            if m  # guard against None
-        ))
         return {"books": books, "total_chunks": len(metadatas)}
     except Exception as e:
         raise HTTPException(status_code=500, detail=f"Could not read knowledge base: {e}")


 @app.post("/ask", tags=["Query"])
-async def ask(
     """
     Ask a spiritual or philosophical question.
-
     """
-    if not
         raise HTTPException(status_code=400, detail="Question cannot be empty.")

     try:
-
         return StreamingResponse(
-
-
         )
     except FileNotFoundError:
         raise HTTPException(

@@ -115,26 +162,64 @@ async def ask(request: AskRequest):
     except Exception as e:
         raise HTTPException(status_code=500, detail=str(e))

 @app.get("/", include_in_schema=False)
 async def serve_frontend():
-    """Serves the static frontend HTML file."""
     frontend_path = "frontend/index.html"
     if os.path.exists(frontend_path):
         return FileResponse(frontend_path)
     return {"message": "Sacred Texts RAG API is live. Visit /docs for Swagger UI."}

 # ─── Entry Point ──────────────────────────────────────────────────────────────

 if __name__ == "__main__":
     import uvicorn

-    # HF Spaces uses 7860 by default
     host = os.getenv("HOST", "0.0.0.0")
-    port = int(os.getenv("PORT", "7860"))

-    print(f"\n🕉️ Sacred Texts RAG – API Server")
     print(f"{'═' * 40}")
     print(f"🌐 Running at : http://{host}:{port}")
     print(f"{'═' * 40}\n")

-    uvicorn.run("app:app", host=host, port=port, reload=False)
 app.py – FastAPI backend server for the Sacred Texts RAG application.

 Endpoints:
+    POST /ask     → Ask a question, get a streamed answer with sources
+    POST /clear   → Clear conversation history for a session
+    GET /history  → Retrieve conversation history for a session
     GET /health   → Health check
     GET /books    → List books currently in the knowledge base

 """

 import os
+import uuid
+from fastapi import FastAPI, HTTPException, Request, Response
 from fastapi.middleware.cors import CORSMiddleware
 from pydantic import BaseModel, Field
 from dotenv import load_dotenv
+from fastapi.responses import StreamingResponse, FileResponse, JSONResponse
+from rag_chain import (
+    query_sacred_texts,
+    get_embeddings,
+    get_vector_store,
+    clear_session,
+    get_history,
+)
+from langchain_core.messages import HumanMessage, AIMessage

 load_dotenv()

 app = FastAPI(
     title="Sacred Texts RAG API",
     description="Ask questions answered exclusively from Bhagavad Gita, Quran, Bible, and Guru Granth Sahib",
+    version="2.0.0",
 )

 app.add_middleware(
     CORSMiddleware,
+    allow_origins=["*"],
     allow_credentials=True,
     allow_methods=["*"],
     allow_headers=["*"],
+    expose_headers=["X-Session-Id"],
 )

+SESSION_COOKIE = "rag_session_id"
+
+
+# ─── Helpers ──────────────────────────────────────────────────────────────────
+
+def get_or_create_session(request: Request, response: Response) -> str:
+    """
+    Read the session ID from the cookie (or X-Session-Id header).
+    If absent, generate a new one and set it on the response cookie.
+    """
+    session_id = (
+        request.cookies.get(SESSION_COOKIE)
+        or request.headers.get("X-Session-Id")
+    )
+    if not session_id:
+        session_id = str(uuid.uuid4())
+        response.set_cookie(
+            key=SESSION_COOKIE,
+            value=session_id,
+            httponly=True,
+            samesite="lax",
+            max_age=60 * 60 * 24,  # 24 hours
+        )
+    return session_id
+

 # ─── Request / Response Models ────────────────────────────────────────────────

 class AskRequest(BaseModel):
     question: str = Field(..., min_length=3, max_length=1000,
                           example="What do the scriptures say about compassion?")
+    session_id: str | None = Field(
+        default=None,
+        description="Optional session ID for multi-turn conversations. "
+                    "If omitted, the server reads/creates one via cookie.",
+    )

 class HealthResponse(BaseModel):
     status: str

     books: list[str]
     total_chunks: int

+class ClearRequest(BaseModel):
+    session_id: str | None = None
+
+class HistoryItem(BaseModel):
+    role: str  # "human" | "ai"
+    content: str
+
+class HistoryResponse(BaseModel):
+    session_id: str
+    turns: int
+    messages: list[HistoryItem]
+

 # ─── Routes ───────────────────────────────────────────────────────────────────

 @app.get("/health", response_model=HealthResponse, tags=["System"])
 def health_check():
     return {"status": "ok", "message": "Sacred Texts RAG is running 🕉️"}


 @app.get("/books", response_model=BooksResponse, tags=["Knowledge Base"])
 def list_books():
     try:
+        embeddings = get_embeddings()
+        vector_store = get_vector_store(embeddings)
+        collection = vector_store._collection
+        results = collection.get(include=["metadatas"])
+        metadatas = results.get("metadatas", [])
+        books = sorted(set(m.get("book", "Unknown") for m in metadatas if m))
         return {"books": books, "total_chunks": len(metadatas)}
     except Exception as e:
         raise HTTPException(status_code=500, detail=f"Could not read knowledge base: {e}")


 @app.post("/ask", tags=["Query"])
+async def ask(request_body: AskRequest, request: Request, response: Response):
     """
     Ask a spiritual or philosophical question.
+    Streams the answer as NDJSON (one JSON object per line).
+    Maintains per-session conversation history automatically via cookie or
+    the `session_id` field in the request body.
     """
+    if not request_body.question.strip():
         raise HTTPException(status_code=400, detail="Question cannot be empty.")

+    # Resolve session: body field > cookie/header > new
+    if request_body.session_id:
+        session_id = request_body.session_id
+    else:
+        session_id = get_or_create_session(request, response)
+
     try:
+        stream = query_sacred_texts(request_body.question, session_id=session_id)
+
+        # We need to forward the session_id so the frontend can persist it
+        headers = {"X-Session-Id": session_id}
+
         return StreamingResponse(
+            stream,
+            media_type="application/x-ndjson",
+            headers=headers,
         )
     except FileNotFoundError:
         raise HTTPException(

     except Exception as e:
         raise HTTPException(status_code=500, detail=str(e))

+
+@app.post("/clear", tags=["Session"])
+async def clear_conversation(body: ClearRequest, request: Request, response: Response):
+    """
+    Clear the conversation history for the given session.
+    If session_id is omitted, clears the session identified by cookie.
+    """
+    session_id = body.session_id or request.cookies.get(SESSION_COOKIE)
+    if not session_id:
+        raise HTTPException(status_code=400, detail="No session to clear.")
+    clear_session(session_id)
+    return {"status": "cleared", "session_id": session_id}
+
+
+@app.get("/history", response_model=HistoryResponse, tags=["Session"])
+async def conversation_history(session_id: str | None = None, request: Request = None):
+    """
+    Return the conversation history for a session (for debugging / display).
+    """
+    sid = session_id or (request.cookies.get(SESSION_COOKIE) if request else None)
+    if not sid:
+        raise HTTPException(status_code=400, detail="Provide session_id query param or cookie.")
+
+    messages = get_history(sid)
+    items = []
+    for msg in messages:
+        if isinstance(msg, HumanMessage):
+            items.append(HistoryItem(role="human", content=msg.content))
+        elif isinstance(msg, AIMessage):
+            items.append(HistoryItem(role="ai", content=msg.content))
+
+    return HistoryResponse(
+        session_id=sid,
+        turns=len(items) // 2,
+        messages=items,
+    )
+
+
 @app.get("/", include_in_schema=False)
 async def serve_frontend():
     frontend_path = "frontend/index.html"
     if os.path.exists(frontend_path):
         return FileResponse(frontend_path)
     return {"message": "Sacred Texts RAG API is live. Visit /docs for Swagger UI."}

+
 # ─── Entry Point ──────────────────────────────────────────────────────────────

 if __name__ == "__main__":
     import uvicorn

     host = os.getenv("HOST", "0.0.0.0")
+    port = int(os.getenv("PORT", "7860"))

+    print(f"\n🕉️ Sacred Texts RAG – API Server v2.0")
     print(f"{'═' * 40}")
     print(f"🌐 Running at : http://{host}:{port}")
+    print(f"🧠 Multi-turn conversation: ENABLED")
     print(f"{'═' * 40}\n")

+    uvicorn.run("app:app", host=host, port=port, reload=False)
frontend/index.html
CHANGED
|
@@ -13,13 +13,7 @@
|
|
| 13 |
|
| 14 |
<style>
|
| 15 |
/* ββ Reset & Base βββββββββββββββββββββββββββββββββββββββββββ */
|
| 16 |
-
*,
|
| 17 |
-
*::before,
|
| 18 |
-
*::after {
|
| 19 |
-
box-sizing: border-box;
|
| 20 |
-
margin: 0;
|
| 21 |
-
padding: 0;
|
| 22 |
-
}
|
| 23 |
|
| 24 |
:root {
|
| 25 |
--bg: #0d0b07;
|
|
@@ -32,68 +26,13 @@
|
|
| 32 |
--cream: #f0e6cc;
|
| 33 |
--muted: #7a6a4a;
|
| 34 |
--gita: #e07b3b;
|
| 35 |
-
/* saffron */
|
| 36 |
--quran: #3bba85;
|
| 37 |
-
/* green */
|
| 38 |
--bible: #5b8ce0;
|
| 39 |
-
/* blue */
|
| 40 |
--granth: #b07ce0;
|
| 41 |
-
|
| 42 |
-
}
|
| 43 |
-
|
| 44 |
-
/* Animated Thinking state for streaming */
|
| 45 |
-
.thinking-dots {
|
| 46 |
-
display: inline-flex;
|
| 47 |
-
gap: 4px;
|
| 48 |
-
margin-left: 4px;
|
| 49 |
-
}
|
| 50 |
-
|
| 51 |
-
.thinking-dots span {
|
| 52 |
-
width: 4px;
|
| 53 |
-
height: 4px;
|
| 54 |
-
background: var(--gold);
|
| 55 |
-
border-radius: 50%;
|
| 56 |
-
animation: bounce 1.4s infinite ease-in-out;
|
| 57 |
-
}
|
| 58 |
-
|
| 59 |
-
@keyframes bounce {
|
| 60 |
-
|
| 61 |
-
0%,
|
| 62 |
-
80%,
|
| 63 |
-
100% {
|
| 64 |
-
transform: scale(0);
|
| 65 |
-
}
|
| 66 |
-
|
| 67 |
-
40% {
|
| 68 |
-
transform: scale(1);
|
| 69 |
-
}
|
| 70 |
}
|
| 71 |
|
| 72 |
-
|
| 73 |
-
#currentStreamingMsg p {
|
| 74 |
-
animation: fadeIn 0.3s ease-in;
|
| 75 |
-
}
|
| 76 |
-
|
| 77 |
-
@keyframes fadeIn {
|
| 78 |
-
from {
|
| 79 |
-
opacity: 0.7;
|
| 80 |
-
}
|
| 81 |
-
|
| 82 |
-
to {
|
| 83 |
-
opacity: 1;
|
| 84 |
-
}
|
| 85 |
-
}
|
| 86 |
-
|
| 87 |
-
/* Ensure the bubble has a minimum height so it doesn't look like a "small block" */
|
| 88 |
-
.msg-bubble:empty::before {
|
| 89 |
-
content: "Writing wisdom...";
|
| 90 |
-
color: var(--muted);
|
| 91 |
-
font-style: italic;
|
| 92 |
-
font-size: 0.9rem;
|
| 93 |
-
}
|
| 94 |
-
|
| 95 |
-
html,
|
| 96 |
-
body {
|
| 97 |
height: 100%;
|
| 98 |
background: var(--bg);
|
| 99 |
color: var(--cream);
|
|
@@ -103,15 +42,14 @@
|
|
| 103 |
overflow: hidden;
|
| 104 |
}
|
| 105 |
|
| 106 |
-
/* ββ Background texture βββββββββββββββββββββββββββββββββββββ */
|
| 107 |
body::before {
|
| 108 |
content: '';
|
| 109 |
position: fixed;
|
| 110 |
inset: 0;
|
| 111 |
background:
|
| 112 |
-
radial-gradient(ellipse 80% 60% at 20% 10%, rgba(201,
|
| 113 |
-
radial-gradient(ellipse 60% 80% at 80% 90%, rgba(91,
|
| 114 |
-
radial-gradient(ellipse 50% 50% at 50% 50%, rgba(176,
|
| 115 |
url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='400' height='400'%3E%3Cfilter id='n'%3E%3CfeTurbulence type='fractalNoise' baseFrequency='0.75' numOctaves='4' stitchTiles='stitch'/%3E%3CfeColorMatrix type='saturate' values='0'/%3E%3C/filter%3E%3Crect width='400' height='400' filter='url(%23n)' opacity='0.04'/%3E%3C/svg%3E");
|
| 116 |
pointer-events: none;
|
| 117 |
z-index: 0;
|
|
@@ -131,106 +69,121 @@
|
|
| 131 |
|
| 132 |
/* ββ Header βββββββββββββββββββββββββββββββββββββββββββββββββ */
|
| 133 |
header {
|
| 134 |
-
padding:
|
| 135 |
text-align: center;
|
| 136 |
border-bottom: 1px solid var(--border);
|
|
|
|
| 137 |
}
|
| 138 |
|
| 139 |
.mandala {
|
| 140 |
-
font-size:
|
| 141 |
letter-spacing: .5rem;
|
| 142 |
color: var(--gold);
|
| 143 |
opacity: .6;
|
| 144 |
-
margin-bottom:
|
| 145 |
animation: spin 60s linear infinite;
|
| 146 |
display: inline-block;
|
| 147 |
}
|
| 148 |
-
|
| 149 |
-
@keyframes spin {
|
| 150 |
-
to {
|
| 151 |
-
transform: rotate(360deg);
|
| 152 |
-
}
|
| 153 |
-
}
|
| 154 |
|
| 155 |
h1 {
|
| 156 |
font-family: 'Cinzel Decorative', serif;
|
| 157 |
-
font-size: clamp(1.
|
| 158 |
font-weight: 400;
|
| 159 |
color: var(--gold-pale);
|
| 160 |
letter-spacing: .12em;
|
| 161 |
-
text-shadow: 0 0 40px rgba(201,
|
| 162 |
}
|
| 163 |
|
| 164 |
.subtitle {
|
| 165 |
font-family: 'IM Fell English', serif;
|
| 166 |
font-style: italic;
|
| 167 |
-
font-size: .
|
| 168 |
color: var(--muted);
|
| 169 |
-
margin-top:
|
| 170 |
}
|
| 171 |
|
| 172 |
.badges {
|
| 173 |
display: flex;
|
| 174 |
justify-content: center;
|
| 175 |
-
gap:
|
| 176 |
-
margin-top:
|
| 177 |
flex-wrap: wrap;
|
| 178 |
}
|
| 179 |
|
| 180 |
.badge {
|
| 181 |
-
font-size: .
|
| 182 |
letter-spacing: .1em;
|
| 183 |
text-transform: uppercase;
|
| 184 |
-
padding:
|
| 185 |
border-radius: 20px;
|
| 186 |
border: 1px solid;
|
| 187 |
font-family: 'Cormorant Garamond', serif;
|
| 188 |
font-weight: 600;
|
| 189 |
}
|
|
|
|
|
|
|
|
|
|
|
|
|
| 190 |
|
| 191 |
-
|
| 192 |
-
|
| 193 |
-
|
| 194 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 195 |
}
|
| 196 |
|
| 197 |
-
.
|
| 198 |
-
|
| 199 |
-
|
| 200 |
-
|
|
|
|
| 201 |
}
|
| 202 |
|
| 203 |
-
.
|
| 204 |
-
color: var(--
|
| 205 |
-
|
| 206 |
-
background: rgba(91, 140, 224, .1);
|
| 207 |
}
|
| 208 |
|
| 209 |
-
.
|
| 210 |
-
|
| 211 |
-
|
| 212 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 213 |
}
|
| 214 |
|
| 215 |
/* ββ Chat Window ββββββββββββββββββββββββββββββββββββββββββββ */
|
| 216 |
.chat-window {
|
| 217 |
overflow-y: auto;
|
| 218 |
-
padding:
|
| 219 |
display: flex;
|
| 220 |
flex-direction: column;
|
| 221 |
gap: 24px;
|
| 222 |
scrollbar-width: thin;
|
| 223 |
scrollbar-color: var(--border) transparent;
|
| 224 |
}
|
| 225 |
-
|
| 226 |
-
.chat-window::-webkit-scrollbar {
|
| 227 |
-
width: 4px;
|
| 228 |
-
}
|
| 229 |
-
|
| 230 |
-
.chat-window::-webkit-scrollbar-thumb {
|
| 231 |
-
background: var(--border);
|
| 232 |
-
border-radius: 4px;
|
| 233 |
-
}
|
| 234 |
|
| 235 |
/* ββ Welcome State ββββββββββββββββββββββββββββββββββββββββββ */
|
| 236 |
.welcome {
|
|
@@ -239,84 +192,46 @@
|
|
| 239 |
padding: 20px;
|
| 240 |
max-width: 500px;
|
| 241 |
}
|
| 242 |
-
|
| 243 |
-
.welcome-icon {
|
| 244 |
-
font-size: 3.5rem;
|
| 245 |
-
margin-bottom: 16px;
|
| 246 |
-
filter: drop-shadow(0 0 20px rgba(201, 153, 58, .4));
|
| 247 |
-
}
|
| 248 |
-
|
| 249 |
.welcome h2 {
|
| 250 |
font-family: 'IM Fell English', serif;
|
| 251 |
font-style: italic;
|
| 252 |
-
font-size: 1.
|
| 253 |
color: var(--gold-light);
|
| 254 |
-
margin-bottom:
|
| 255 |
-
}
|
| 256 |
-
|
| 257 |
-
.welcome p {
|
| 258 |
-
font-size: .95rem;
|
| 259 |
-
color: var(--muted);
|
| 260 |
-
line-height: 1.8;
|
| 261 |
-
}
|
| 262 |
-
|
| 263 |
-
.suggested-queries {
|
| 264 |
-
margin-top: 24px;
|
| 265 |
-
display: flex;
|
| 266 |
-
flex-direction: column;
|
| 267 |
-
gap: 8px;
|
| 268 |
}
|
|
|
|
| 269 |
|
|
|
|
| 270 |
.suggested-queries button {
|
| 271 |
background: var(--surface);
|
| 272 |
border: 1px solid var(--border);
|
| 273 |
color: var(--cream);
|
| 274 |
-
padding:
|
| 275 |
border-radius: 8px;
|
| 276 |
font-family: 'Cormorant Garamond', serif;
|
| 277 |
-
font-size: .
|
| 278 |
font-style: italic;
|
| 279 |
cursor: pointer;
|
| 280 |
transition: all .2s;
|
| 281 |
text-align: left;
|
| 282 |
}
|
| 283 |
-
|
| 284 |
-
.suggested-queries button:hover {
|
| 285 |
-
border-color: var(--gold);
|
| 286 |
-
color: var(--gold-pale);
|
| 287 |
-
background: var(--surface-2);
|
| 288 |
-
}
|
| 289 |
|
| 290 |
/* ββ Messages βββββββββββββββββββββββββββββββββββββββββββββββ */
|
| 291 |
.message {
|
| 292 |
display: flex;
|
| 293 |
flex-direction: column;
|
| 294 |
-
gap:
|
| 295 |
animation: fadeUp .4s ease both;
|
| 296 |
}
|
|
|
|
| 297 |
|
| 298 |
-
|
| 299 |
-
|
| 300 |
-
opacity: 0;
|
| 301 |
-
transform: translateY(12px);
|
| 302 |
-
}
|
| 303 |
-
|
| 304 |
-
to {
|
| 305 |
-
opacity: 1;
|
| 306 |
-
transform: translateY(0);
|
| 307 |
-
}
|
| 308 |
-
}
|
| 309 |
-
|
| 310 |
-
.message-user {
|
| 311 |
-
align-items: flex-end;
|
| 312 |
-
}
|
| 313 |
-
|
| 314 |
-
.message-assistant {
|
| 315 |
-
align-items: flex-start;
|
| 316 |
-
}
|
| 317 |
|
| 318 |
.msg-label {
|
| 319 |
-
font-size: .
|
| 320 |
letter-spacing: .15em;
|
| 321 |
text-transform: uppercase;
|
| 322 |
color: var(--muted);
|
|
@@ -326,7 +241,7 @@
|
|
| 326 |
|
| 327 |
.msg-bubble {
|
| 328 |
max-width: 92%;
|
| 329 |
-
padding:
|
| 330 |
border-radius: 12px;
|
| 331 |
line-height: 1.75;
|
| 332 |
}
|
|
@@ -336,40 +251,40 @@

    border: 1px solid var(--border);
    color: var(--cream);
    font-style: italic;
-    font-size:
    border-bottom-right-radius: 4px;
  }

  .message-assistant .msg-bubble {
-    background: linear-gradient(135deg, var(--surface) 0%, rgba(30,
-    border: 1px solid rgba(201,
    color: var(--cream);
-    font-size:
    border-bottom-left-radius: 4px;
-    box-shadow: 0 4px 24px rgba(0,
-  }
-
-  .msg-bubble p {
-    margin-bottom: 1em;
  }

-  .msg-bubble p:last-child {
-    margin-bottom: 0;
-  }

-  .msg-bubble strong {
-    color: var(--gold-light);
-    font-weight: 600;
  }

  /* ── Sources Panel ────────────────────────────────────────── */
-  .sources {
-    max-width: 92%;
-    margin-top: 4px;
-  }
-
  .sources-label {
-    font-size: .
    letter-spacing: .12em;
    text-transform: uppercase;
    color: var(--muted);
@@ -378,27 +293,12 @@

    align-items: center;
    gap: 6px;
  }

-  .sources-label::before,
-  .sources-label::after {
-    content: '';
-    flex: 1;
-    height: 1px;
-    background: var(--border);
-  }
-
-  .sources-label::before {
-    max-width: 20px;
-  }
-
-  .source-tags {
-    display: flex;
-    flex-wrap: wrap;
-    gap: 6px;
-  }
-
  .source-tag {
-    font-size: .
    padding: 4px 10px;
    border-radius: 6px;
    border: 1px solid;
@@ -406,101 +306,55 @@

    cursor: default;
    transition: all .2s;
  }

-  .source-tag:hover {
-
-
-  }
-
-  .source-gita {
-    color: var(--gita);
-    border-color: rgba(224, 123, 59, .4);
-    background: rgba(224, 123, 59, .08);
-  }
-
-  .source-quran {
-    color: var(--quran);
-    border-color: rgba(59, 186, 133, .4);
-    background: rgba(59, 186, 133, .08);
-  }
-
-  .source-bible {
-    color: var(--bible);
-    border-color: rgba(91, 140, 224, .4);
-    background: rgba(91, 140, 224, .08);
-  }
-
-  .source-granth {
-    color: var(--granth);
-    border-color: rgba(176, 124, 224, .4);
-    background: rgba(176, 124, 224, .08);
-  }
-
-  .source-other {
-    color: var(--gold-light);
-    border-color: rgba(201, 153, 58, .4);
-    background: rgba(201, 153, 58, .08);
-  }

  /* ── Loading ──────────────────────────────────────────────── */
  .loading {
    display: flex;
    align-items: center;
-    gap:
-    padding:
-    border: 1px solid rgba(201,
    border-radius: 12px;
    background: var(--surface);
    width: fit-content;
    max-width: 280px;
  }
-
-  .loading-dots {
-    display: flex;
-    gap: 5px;
-  }
-
  .loading-dots span {
-    width: 6px;
-    height: 6px;
    border-radius: 50%;
    background: var(--gold);
    animation: dot-pulse 1.4s ease-in-out infinite;
  }
-
-  .loading-dots span:nth-child(2) {
-    animation-delay: .2s;
-  }
-
-  .loading-dots span:nth-child(3) {
-    animation-delay: .4s;
-  }
-
  @keyframes dot-pulse {
-
-    0%,
-    80%,
-    100% {
-      opacity: .2;
-      transform: scale(.8);
-    }
-
-    40% {
-      opacity: 1;
-      transform: scale(1.1);
-    }
  }

-
-  .loading-text {
-
-
  }

  /* ── Error ────────────────────────────────────────────────── */
  .error-bubble {
-    background: rgba(180,
-    border: 1px solid rgba(180,
    color: #e08080;
    padding: 12px 16px;
    border-radius: 10px;
@@ -509,52 +363,38 @@

  }

  /* ── Input Area ───────────────────────────────────────────── */
-  .input-area {
-
-    border-top: 1px solid var(--border);
-  }
-
-  .input-row {
-    display: flex;
-    gap: 10px;
-    align-items: flex-end;
-  }

  textarea {
    flex: 1;
    background: var(--surface);
    border: 1px solid var(--border);
    color: var(--cream);
-    padding:
    border-radius: 12px;
    font-family: 'Cormorant Garamond', serif;
-    font-size:
    line-height: 1.6;
    resize: none;
-    min-height:
-    max-height:
    outline: none;
    transition: border-color .2s, box-shadow .2s;
  }
-
-  textarea::placeholder {
-    color: var(--muted);
-    font-style: italic;
-  }
-
  textarea:focus {
-    border-color: rgba(201,
-    box-shadow: 0 0 0 3px rgba(201,
  }

  .send-btn {
-    width:
-    height: 52px;
    border-radius: 12px;
-    border: 1px solid rgba(201,
-    background: linear-gradient(135deg, rgba(201,
    color: var(--gold);
-    font-size: 1.
    cursor: pointer;
    transition: all .2s;
    display: flex;
@@ -562,36 +402,15 @@

    justify-content: center;
    flex-shrink: 0;
  }
-
  .send-btn:hover:not(:disabled) {
-    background: linear-gradient(135deg, rgba(201,
    border-color: var(--gold);
    transform: translateY(-1px);
-    box-shadow: 0 4px 16px rgba(201,
  }

-  .send-btn:disabled {
-    opacity: .3;
-    cursor: not-allowed;
-    transform: none;
-  }
-
-  .input-hint {
-    font-size: .72rem;
-    color: var(--muted);
-    margin-top: 8px;
-    text-align: center;
-    font-style: italic;
-  }
-
-  /* ── Divider line ─────────────────────────────────────────── */
-  .ornament {
-    text-align: center;
-    color: var(--border);
-    font-size: .8rem;
-    letter-spacing: .4em;
-    margin: 4px 0;
-  }
  </style>
  </head>
@@ -609,6 +428,16 @@

    <span class="badge badge-bible">Bible</span>
    <span class="badge badge-granth">Guru Granth Sahib</span>
  </div>
  </header>

  <!-- Chat Window -->
@@ -616,15 +445,18 @@

  <div class="welcome" id="welcomePane">
    <div class="welcome-icon">🕯️</div>
    <h2>"Seek, and it shall be given unto you"</h2>
-    <p>Ask any spiritual or philosophical question. Answers are drawn exclusively from the
-
    <div class="suggested-queries">
      <button onclick="askSuggested(this)">What do the scriptures say about forgiveness?</button>
      <button onclick="askSuggested(this)">How should one face fear and death?</button>
      <button onclick="askSuggested(this)">What is the purpose of prayer and worship?</button>
      <button onclick="askSuggested(this)">What is the nature of the soul according to each religion?</button>
-      <button onclick="askSuggested(this)">What do the scriptures teach about humility and selfless
-        service?</button>
    </div>
  </div>
  </div>
@@ -632,26 +464,88 @@

  <!-- Input -->
  <div class="input-area">
    <div class="input-row">
-      <textarea id="questionInput"
-
-
-
-
    </div>
-    <p class="input-hint">
-      texts</p>
  </div>

  </div>

  <script>
    const API_BASE = window.location.origin;
-    let isLoading
-
    function getSourceClass(book) {
      const b = book.toLowerCase();
-      if (b.includes("gita")) return "source-gita";
      if (b.includes("quran") || b.includes("koran")) return "source-quran";
      if (b.includes("bible") || b.includes("testament")) return "source-bible";
      if (b.includes("granth") || b.includes("guru")) return "source-granth";
@@ -670,23 +564,28 @@

    function autoResize(el) {
      el.style.height = "auto";
-      el.style.height = Math.min(el.scrollHeight,
    }

    function formatAnswer(text) {
-      // Convert markdown-ish bold (**text**) to <strong>
      text = text.replace(/\*\*(.*?)\*\*/g, "<strong>$1</strong>");
-      // Wrap paragraphs
      return text.split(/\n\n+/).filter(p => p.trim()).map(p => `<p>${p.trim()}</p>`).join("");
    }

-
-    function appendUserMessage(question) {
      const w = document.getElementById("chatWindow");
      const div = document.createElement("div");
      div.className = "message message-user";
      div.innerHTML = `
-        <span class="msg-label">You</span>
        <div class="msg-bubble">${escapeHtml(question)}</div>
      `;
      w.appendChild(div);
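The `formatAnswer` helper shown in context above is small enough to exercise standalone; here is a copy of it with a worked example (runnable under Node, outside the page):

```javascript
// Copy of the page's formatAnswer helper: converts markdown-ish
// **bold** spans to <strong>, then wraps blank-line-separated runs
// of text in <p> paragraphs.
function formatAnswer(text) {
  text = text.replace(/\*\*(.*?)\*\*/g, "<strong>$1</strong>");
  return text
    .split(/\n\n+/)
    .filter(p => p.trim())
    .map(p => `<p>${p.trim()}</p>`)
    .join("");
}

const html = formatAnswer("**Dharma** is duty.\n\nAct without attachment.");
// → "<p><strong>Dharma</strong> is duty.</p><p>Act without attachment.</p>"
```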
@@ -710,63 +609,46 @@

      return div;
    }

-    function
-      const
-
-      // Build source tags
-      const sourceTags = (data.sources || []).map(s => {
        const cls = getSourceClass(s.book);
-        return `<span class="source-tag ${cls}" title="
      }).join("");
-
-
-
-
-
-
-
-
-      loadingEl.innerHTML = `
-        <span class="msg-label">Sacred Texts</span>
-        <div class="msg-bubble">${formatAnswer(data.answer)}</div>
-        ${sourcesHtml}
-      `;
-      scrollToBottom();
-    }
-
-    function replaceLoadingWithError(loadingEl, msg) {
-      loadingEl.innerHTML = `
-        <span class="msg-label">Error</span>
-        <div class="error-bubble">⚠️ ${escapeHtml(msg)}</div>
-      `;
-      scrollToBottom();
-    }
-
-    function escapeHtml(str) {
-      return str.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
    }

-    // ──
    async function sendQuestion() {
      if (isLoading) return;
-      const input
      const question = input.value.trim();
      if (!question) return;

      hideWelcome();
      isLoading = true;
      document.getElementById("sendBtn").disabled = true;
      input.value = "";
      input.style.height = "auto";

-      appendUserMessage(question);
      const loadingEl = appendLoading();

      try {
        const res = await fetch(`${API_BASE}/ask`, {
-          method: "POST",
          headers: { "Content-Type": "application/json" },
-          body:
        });
@@ -774,36 +656,36 @@

        throw new Error(err.detail || "Server error");
      }

-      //
-      const
-
-      let fullAnswer = "";
-      let buffer = "";

-      //
      loadingEl.innerHTML = `
-
-
-
-
-
-
-
      const sourcesContainer = document.getElementById("currentStreamingSources");
-      let firstTokenReceived = false;

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

-        // Append new data to the buffer
        buffer += decoder.decode(value, { stream: true });
-
-        // Split by newline
        const lines = buffer.split("\n");
-
-        buffer = lines.pop();
-
        for (const line of lines) {
          if (!line.trim()) continue;
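The read loop above hinges on one invariant: a network chunk can end in the middle of an NDJSON line, so the trailing partial line must stay in `buffer` until the next chunk completes it. A standalone sketch of that buffering (the helper name is hypothetical, not part of the page):

```javascript
// Hypothetical helper mirroring the client's buffering: accumulate
// chunks, split on newlines, and keep the last (possibly incomplete)
// line in the buffer until a later chunk completes it.
function parseNDJSONChunks(chunks) {
  let buffer = "";
  const events = [];
  for (const chunk of chunks) {
    buffer += chunk;
    const lines = buffer.split("\n");
    buffer = lines.pop(); // may be a partial line; completed on the next read
    for (const line of lines) {
      if (!line.trim()) continue;
      events.push(JSON.parse(line));
    }
  }
  if (buffer.trim()) events.push(JSON.parse(buffer)); // flush the tail
  return events;
}

// A token event split across two chunks still parses once completed.
const events = parseNDJSONChunks([
  '{"type":"tok',
  'en","data":"Om"}\n',
  '{"type":"done"}',
]);
// → [{ type: "token", data: "Om" }, { type: "done" }]
```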
@@ -811,20 +693,13 @@

          const parsed = JSON.parse(line);

          if (parsed.type === "token") {
-
-            if (!firstTokenReceived) {
-              bubble.innerHTML = "";
-              firstTokenReceived = true;
-            }
-
            fullAnswer += parsed.data;
-            // Dynamically update the bubble with formatted markdown/paragraphs
            bubble.innerHTML = formatAnswer(fullAnswer);
            scrollToBottom();
          }
          else if (parsed.type === "sources") {
-
-            renderSourcesInPlace(sourcesContainer, sourcesData);
          }
          else if (parsed.type === "cache") {
            bubble.innerHTML = formatAnswer(parsed.data.answer);
@@ -832,18 +707,24 @@

            scrollToBottom();
          }
        } catch (e) {
-          console.
        }
      }
    }

-      //
      bubble.removeAttribute("id");
      sourcesContainer.removeAttribute("id");

    } catch (err) {
-
-
    } finally {
      isLoading = false;
      document.getElementById("sendBtn").disabled = false;
@@ -851,27 +732,9 @@

      }
    }

-    // Helper to render sources inside the streaming flow
-    function renderSourcesInPlace(container, sources) {
-      const sourceTags = (sources || []).map(s => {
-        const cls = getSourceClass(s.book);
-        // Use verse citations as the primary text
-        return `<span class="source-tag ${cls}" title="${s.snippet}">📖 ${s.book}</span>`;
-      }).join("");
-
-      if (sourceTags) {
-        container.innerHTML = `
-          <div class="sources">
-            <div class="sources-label">Citations</div>
-            <div class="source-tags">${sourceTags}</div>
-          </div>
-        `;
-      }
-    }
-
    function askSuggested(btn) {
      const input = document.getElementById("questionInput");
-      input.value = btn.textContent;
      autoResize(input);
      sendQuestion();
    }
@@ -882,7 +745,9 @@

        sendQuestion();
      }
    }
  </script>
  </body>
-
  </html>
  <style>
  /* ── Reset & Base ─────────────────────────────────────────── */
+ *, *::before, *::after { box-sizing: border-box; margin: 0; padding: 0; }

  :root {
    --bg: #0d0b07;

    --cream: #f0e6cc;
    --muted: #7a6a4a;
    --gita: #e07b3b;
    --quran: #3bba85;
    --bible: #5b8ce0;
    --granth: #b07ce0;
+   --danger: #e06060;
  }

+ html, body {
    height: 100%;
    background: var(--bg);
    color: var(--cream);

    overflow: hidden;
  }

  body::before {
    content: '';
    position: fixed;
    inset: 0;
    background:
+     radial-gradient(ellipse 80% 60% at 20% 10%, rgba(201,153,58,.07) 0%, transparent 60%),
+     radial-gradient(ellipse 60% 80% at 80% 90%, rgba(91,140,224,.05) 0%, transparent 60%),
+     radial-gradient(ellipse 50% 50% at 50% 50%, rgba(176,124,224,.04) 0%, transparent 60%),
      url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='400' height='400'%3E%3Cfilter id='n'%3E%3CfeTurbulence type='fractalNoise' baseFrequency='0.75' numOctaves='4' stitchTiles='stitch'/%3E%3CfeColorMatrix type='saturate' values='0'/%3E%3C/filter%3E%3Crect width='400' height='400' filter='url(%23n)' opacity='0.04'/%3E%3C/svg%3E");
    pointer-events: none;
    z-index: 0;
  /* ── Header ───────────────────────────────────────────────── */
  header {
+   padding: 20px 0 14px;
    text-align: center;
    border-bottom: 1px solid var(--border);
+   position: relative;
  }

  .mandala {
+   font-size: 1.8rem;
    letter-spacing: .5rem;
    color: var(--gold);
    opacity: .6;
+   margin-bottom: 6px;
    animation: spin 60s linear infinite;
    display: inline-block;
  }
+ @keyframes spin { to { transform: rotate(360deg); } }

  h1 {
    font-family: 'Cinzel Decorative', serif;
+   font-size: clamp(1.1rem, 3vw, 1.7rem);
    font-weight: 400;
    color: var(--gold-pale);
    letter-spacing: .12em;
+   text-shadow: 0 0 40px rgba(201,153,58,.3);
  }

  .subtitle {
    font-family: 'IM Fell English', serif;
    font-style: italic;
+   font-size: .9rem;
    color: var(--muted);
+   margin-top: 3px;
  }

  .badges {
    display: flex;
    justify-content: center;
+   gap: 10px;
+   margin-top: 10px;
    flex-wrap: wrap;
  }

  .badge {
+   font-size: .7rem;
    letter-spacing: .1em;
    text-transform: uppercase;
+   padding: 2px 9px;
    border-radius: 20px;
    border: 1px solid;
    font-family: 'Cormorant Garamond', serif;
    font-weight: 600;
  }
+ .badge-gita { color: var(--gita); border-color: var(--gita); background: rgba(224,123,59,.1); }
+ .badge-quran { color: var(--quran); border-color: var(--quran); background: rgba(59,186,133,.1); }
+ .badge-bible { color: var(--bible); border-color: var(--bible); background: rgba(91,140,224,.1); }
+ .badge-granth { color: var(--granth); border-color: var(--granth); background: rgba(176,124,224,.1); }
+ /* ── Session bar ──────────────────────────────────────────── */
+ .session-bar {
+   display: none; /* hidden until a conversation starts */
+   align-items: center;
+   justify-content: space-between;
+   gap: 8px;
+   margin-top: 10px;
+   padding: 5px 10px;
+   border: 1px solid var(--border);
+   border-radius: 8px;
+   background: var(--surface);
+   font-size: .75rem;
+   color: var(--muted);
  }

+ .session-bar.visible { display: flex; }
+
+ .session-turn-count {
+   font-family: 'Cormorant Garamond', serif;
+   font-style: italic;
  }

+ .session-turn-count span {
+   color: var(--gold-light);
+   font-weight: 600;
  }

+ .new-convo-btn {
+   display: flex;
+   align-items: center;
+   gap: 5px;
+   background: none;
+   border: 1px solid var(--border);
+   color: var(--muted);
+   padding: 3px 10px;
+   border-radius: 6px;
+   font-family: 'Cormorant Garamond', serif;
+   font-size: .75rem;
+   cursor: pointer;
+   transition: all .2s;
+ }
+ .new-convo-btn:hover {
+   border-color: var(--danger);
+   color: var(--danger);
  }

  /* ── Chat Window ──────────────────────────────────────────── */
  .chat-window {
    overflow-y: auto;
+   padding: 24px 0;
    display: flex;
    flex-direction: column;
    gap: 24px;
    scrollbar-width: thin;
    scrollbar-color: var(--border) transparent;
  }
+ .chat-window::-webkit-scrollbar { width: 4px; }
+ .chat-window::-webkit-scrollbar-thumb { background: var(--border); border-radius: 4px; }
  /* ── Welcome State ────────────────────────────────────────── */
  .welcome {

    padding: 20px;
    max-width: 500px;
  }
+ .welcome-icon { font-size: 3.2rem; margin-bottom: 14px; filter: drop-shadow(0 0 20px rgba(201,153,58,.4)); }

  .welcome h2 {
    font-family: 'IM Fell English', serif;
    font-style: italic;
+   font-size: 1.4rem;
    color: var(--gold-light);
+   margin-bottom: 8px;
  }
+ .welcome p { font-size: .92rem; color: var(--muted); line-height: 1.8; }

+ .suggested-queries { margin-top: 20px; display: flex; flex-direction: column; gap: 7px; }
  .suggested-queries button {
    background: var(--surface);
    border: 1px solid var(--border);
    color: var(--cream);
+   padding: 9px 14px;
    border-radius: 8px;
    font-family: 'Cormorant Garamond', serif;
+   font-size: .92rem;
    font-style: italic;
    cursor: pointer;
    transition: all .2s;
    text-align: left;
  }
+ .suggested-queries button:hover { border-color: var(--gold); color: var(--gold-pale); background: var(--surface-2); }

  /* ── Messages ─────────────────────────────────────────────── */
  .message {
    display: flex;
    flex-direction: column;
+   gap: 6px;
    animation: fadeUp .4s ease both;
  }
+ @keyframes fadeUp { from { opacity: 0; transform: translateY(10px); } to { opacity: 1; transform: translateY(0); } }

+ .message-user { align-items: flex-end; }
+ .message-assistant { align-items: flex-start; }
  .msg-label {
+   font-size: .68rem;
    letter-spacing: .15em;
    text-transform: uppercase;
    color: var(--muted);

  .msg-bubble {
    max-width: 92%;
+   padding: 14px 18px;
    border-radius: 12px;
    line-height: 1.75;
  }

    border: 1px solid var(--border);
    color: var(--cream);
    font-style: italic;
+   font-size: .97rem;
    border-bottom-right-radius: 4px;
  }

  .message-assistant .msg-bubble {
+   background: linear-gradient(135deg, var(--surface) 0%, rgba(30,26,17,.95) 100%);
+   border: 1px solid rgba(201,153,58,.2);
    color: var(--cream);
+   font-size: .97rem;
    border-bottom-left-radius: 4px;
+   box-shadow: 0 4px 24px rgba(0,0,0,.4), inset 0 1px 0 rgba(201,153,58,.1);
  }

+ .msg-bubble p { margin-bottom: 1em; }
+ .msg-bubble p:last-child { margin-bottom: 0; }
+ .msg-bubble strong { color: var(--gold-light); font-weight: 600; }

+ /* Follow-up continuation pill */
+ .followup-pill {
+   font-size: .68rem;
+   padding: 2px 8px;
+   border-radius: 10px;
+   background: rgba(201,153,58,.08);
+   border: 1px solid rgba(201,153,58,.2);
+   color: var(--muted);
+   margin-left: 6px;
+   font-style: italic;
+   vertical-align: middle;
  }

  /* ── Sources Panel ────────────────────────────────────────── */
+ .sources { max-width: 92%; margin-top: 4px; }
  .sources-label {
+   font-size: .7rem;
    letter-spacing: .12em;
    text-transform: uppercase;
    color: var(--muted);

    align-items: center;
    gap: 6px;
  }
+ .sources-label::before, .sources-label::after { content: ''; flex: 1; height: 1px; background: var(--border); }
+ .sources-label::before { max-width: 20px; }

+ .source-tags { display: flex; flex-wrap: wrap; gap: 6px; }

  .source-tag {
+   font-size: .76rem;
    padding: 4px 10px;
    border-radius: 6px;
    border: 1px solid;

    cursor: default;
    transition: all .2s;
  }
+ .source-tag:hover { transform: translateY(-1px); filter: brightness(1.2); }

+ .source-gita { color: var(--gita); border-color: rgba(224,123,59,.4); background: rgba(224,123,59,.08); }
+ .source-quran { color: var(--quran); border-color: rgba(59,186,133,.4); background: rgba(59,186,133,.08); }
+ .source-bible { color: var(--bible); border-color: rgba(91,140,224,.4); background: rgba(91,140,224,.08); }
+ .source-granth { color: var(--granth); border-color: rgba(176,124,224,.4); background: rgba(176,124,224,.08); }
+ .source-other { color: var(--gold-light); border-color: rgba(201,153,58,.4); background: rgba(201,153,58,.08); }
  /* ── Loading ──────────────────────────────────────────────── */
  .loading {
    display: flex;
    align-items: center;
+   gap: 10px;
+   padding: 12px 16px;
+   border: 1px solid rgba(201,153,58,.15);
    border-radius: 12px;
    background: var(--surface);
    width: fit-content;
    max-width: 280px;
  }
+ .loading-dots { display: flex; gap: 5px; }
  .loading-dots span {
+   width: 6px; height: 6px;
    border-radius: 50%;
    background: var(--gold);
    animation: dot-pulse 1.4s ease-in-out infinite;
  }
+ .loading-dots span:nth-child(2) { animation-delay: .2s; }
+ .loading-dots span:nth-child(3) { animation-delay: .4s; }

  @keyframes dot-pulse {
+   0%,80%,100% { opacity: .2; transform: scale(.8); }
+   40% { opacity: 1; transform: scale(1.1); }
  }
+ .loading-text { font-size: .82rem; font-style: italic; color: var(--muted); }

+ /* ── Thinking dots (streaming) ────────────────────────────── */
+ .thinking-dots { display: inline-flex; gap: 4px; margin-left: 4px; }
+ .thinking-dots span {
+   width: 4px; height: 4px;
+   background: var(--gold);
+   border-radius: 50%;
+   animation: bounce 1.4s infinite ease-in-out;
  }
+ @keyframes bounce { 0%,80%,100% { transform: scale(0); } 40% { transform: scale(1); } }

  /* ── Error ────────────────────────────────────────────────── */
  .error-bubble {
+   background: rgba(180,60,60,.1);
+   border: 1px solid rgba(180,60,60,.3);
    color: #e08080;
    padding: 12px 16px;
    border-radius: 10px;

  }
  /* ── Input Area ───────────────────────────────────────────── */
+ .input-area { padding: 14px 0 22px; border-top: 1px solid var(--border); }
+ .input-row { display: flex; gap: 10px; align-items: flex-end; }

  textarea {
    flex: 1;
    background: var(--surface);
    border: 1px solid var(--border);
    color: var(--cream);
+   padding: 13px 15px;
    border-radius: 12px;
    font-family: 'Cormorant Garamond', serif;
+   font-size: .97rem;
    line-height: 1.6;
    resize: none;
+   min-height: 50px;
+   max-height: 130px;
    outline: none;
    transition: border-color .2s, box-shadow .2s;
  }
+ textarea::placeholder { color: var(--muted); font-style: italic; }
  textarea:focus {
+   border-color: rgba(201,153,58,.5);
+   box-shadow: 0 0 0 3px rgba(201,153,58,.08);
  }

  .send-btn {
+   width: 50px; height: 50px;
    border-radius: 12px;
+   border: 1px solid rgba(201,153,58,.4);
+   background: linear-gradient(135deg, rgba(201,153,58,.2), rgba(201,153,58,.05));
    color: var(--gold);
+   font-size: 1.25rem;
    cursor: pointer;
    transition: all .2s;
    display: flex;

    justify-content: center;
    flex-shrink: 0;
  }

  .send-btn:hover:not(:disabled) {
+   background: linear-gradient(135deg, rgba(201,153,58,.35), rgba(201,153,58,.15));
    border-color: var(--gold);
    transform: translateY(-1px);
+   box-shadow: 0 4px 16px rgba(201,153,58,.2);
  }
+ .send-btn:disabled { opacity: .3; cursor: not-allowed; transform: none; }

+ .input-hint { font-size: .7rem; color: var(--muted); margin-top: 7px; text-align: center; font-style: italic; }

  </style>
  </head>
      <span class="badge badge-bible">Bible</span>
      <span class="badge badge-granth">Guru Granth Sahib</span>
    </div>
+
+   <!-- Session status bar – visible once conversation starts -->
+   <div class="session-bar" id="sessionBar">
+     <span class="session-turn-count" id="turnCountLabel">
+       Turn <span id="turnCount">0</span>
+     </span>
+     <button class="new-convo-btn" onclick="startNewConversation()" title="Clear history and start fresh">
+       ↺ New Conversation
+     </button>
+   </div>
  </header>

  <!-- Chat Window -->

    <div class="welcome" id="welcomePane">
      <div class="welcome-icon">🕯️</div>
      <h2>"Seek, and it shall be given unto you"</h2>
+     <p>Ask any spiritual or philosophical question. Answers are drawn exclusively from the
+       Bhagavad Gita, Quran, Bible, and Guru Granth Sahib.<br><br>
+       <em style="color:var(--gold-light); font-size:.9rem;">
+         You can now ask follow-up questions – the guide remembers the conversation.
+       </em>
+     </p>
      <div class="suggested-queries">
        <button onclick="askSuggested(this)">What do the scriptures say about forgiveness?</button>
        <button onclick="askSuggested(this)">How should one face fear and death?</button>
        <button onclick="askSuggested(this)">What is the purpose of prayer and worship?</button>
        <button onclick="askSuggested(this)">What is the nature of the soul according to each religion?</button>
+       <button onclick="askSuggested(this)">What do the scriptures teach about humility and selfless service?</button>
      </div>
    </div>
  </div>

  <!-- Input -->
  <div class="input-area">
    <div class="input-row">
+     <textarea id="questionInput"
+       placeholder="Ask a question, or follow up on the previous answer…"
+       rows="1"
+       onkeydown="handleKey(event)"
+       oninput="autoResize(this)"></textarea>
+     <button class="send-btn" id="sendBtn" onclick="sendQuestion()" title="Ask (Enter)">✦</button>
    </div>
+   <p class="input-hint">Enter to ask · Shift+Enter for new line · Follow-ups like "elaborate on point 2" work!</p>
  </div>

  </div>
<script>
|
| 480 |
const API_BASE = window.location.origin;
|
| 481 | + let isLoading = false;
| 482 | + let sessionId = null;  // persisted across the page session
| 483 | + let turnCount = 0;     // how many full turns this session
| 484 | +
| 485 | + // ββ Session helpers ββββββββββββββββββββββββββββββββββββββββ
| 486 | + function loadSession() {
| 487 | +   sessionId = localStorage.getItem("rag_session_id") || null;
| 488 | + }
| 489 | +
| 490 | + function saveSession(id) {
| 491 | +   sessionId = id;
| 492 | +   localStorage.setItem("rag_session_id", id);
| 493 | + }
| 494 |
| 495 | + function updateSessionBar() {
| 496 | +   const bar = document.getElementById("sessionBar");
| 497 | +   const count = document.getElementById("turnCount");
| 498 | +   if (turnCount > 0) {
| 499 | +     bar.classList.add("visible");
| 500 | +     count.textContent = turnCount;
| 501 | +   } else {
| 502 | +     bar.classList.remove("visible");
| 503 | +   }
| 504 | + }
| 505 | +
| 506 | + async function startNewConversation() {
| 507 | +   if (!sessionId) return;
| 508 | +   if (turnCount > 0 && !confirm("Start a new conversation? This will clear all history.")) return;
| 509 | +
| 510 | +   try {
| 511 | +     await fetch(`${API_BASE}/clear`, {
| 512 | +       method: "POST",
| 513 | +       headers: { "Content-Type": "application/json" },
| 514 | +       body: JSON.stringify({ session_id: sessionId }),
| 515 | +     });
| 516 | +   } catch (_) {}
| 517 | +
| 518 | +   // Reset everything
| 519 | +   sessionId = null;
| 520 | +   turnCount = 0;
| 521 | +   localStorage.removeItem("rag_session_id");
| 522 | +   updateSessionBar();
| 523 | +
| 524 | +   const chatWindow = document.getElementById("chatWindow");
| 525 | +   chatWindow.innerHTML = `
| 526 | +     <div class="welcome" id="welcomePane">
| 527 | +       <div class="welcome-icon">ποΈ</div>
| 528 | +       <h2>"Seek, and it shall be given unto you"</h2>
| 529 | +       <p>Ask any spiritual or philosophical question. Answers are drawn exclusively from the
| 530 | +       Bhagavad Gita, Quran, Bible, and Guru Granth Sahib.<br><br>
| 531 | +       <em style="color:var(--gold-light); font-size:.9rem;">
| 532 | +         You can now ask follow-up questions β the guide remembers the conversation.
| 533 | +       </em>
| 534 | +       </p>
| 535 | +       <div class="suggested-queries">
| 536 | +         <button onclick="askSuggested(this)">What do the scriptures say about forgiveness?</button>
| 537 | +         <button onclick="askSuggested(this)">How should one face fear and death?</button>
| 538 | +         <button onclick="askSuggested(this)">What is the purpose of prayer and worship?</button>
| 539 | +         <button onclick="askSuggested(this)">What is the nature of the soul according to each religion?</button>
| 540 | +         <button onclick="askSuggested(this)">What do the scriptures teach about humility and selfless service?</button>
| 541 | +       </div>
| 542 | +     </div>`;
| 543 | + }
| 544 | +
| 545 | + // ββ DOM Helpers ββββββββββββββββββββββββββββββββββββββββββββ
| 546 | function getSourceClass(book) {
| 547 |   const b = book.toLowerCase();
| 548 | +   if (b.includes("gita")) return "source-gita";
| 549 |   if (b.includes("quran") || b.includes("koran")) return "source-quran";
| 550 |   if (b.includes("bible") || b.includes("testament")) return "source-bible";
| 551 |   if (b.includes("granth") || b.includes("guru")) return "source-granth";
| 564 |
| 565 | function autoResize(el) {
| 566 |   el.style.height = "auto";
| 567 | +   el.style.height = Math.min(el.scrollHeight, 130) + "px";
| 568 | }
| 569 |
| 570 | function formatAnswer(text) {
| 571 |   text = text.replace(/\*\*(.*?)\*\*/g, "<strong>$1</strong>");
| 572 |   return text.split(/\n\n+/).filter(p => p.trim()).map(p => `<p>${p.trim()}</p>`).join("");
| 573 | }
| 574 |
| 575 | + function escapeHtml(str) {
| 576 | +   return str.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
| 577 | + }
| 578 | +
| 579 | + // ββ Message rendering ββββββββββββββββββββββββββββββββββββββ
| 580 | + function appendUserMessage(question, isFollowup) {
| 581 |   const w = document.getElementById("chatWindow");
| 582 |   const div = document.createElement("div");
| 583 |   div.className = "message message-user";
| 584 | +   const pill = isFollowup
| 585 | +     ? `<span class="followup-pill">follow-up</span>`
| 586 | +     : "";
| 587 |   div.innerHTML = `
| 588 | +     <span class="msg-label">You${pill}</span>
| 589 |     <div class="msg-bubble">${escapeHtml(question)}</div>
| 590 |   `;
| 591 |   w.appendChild(div);
| 609 |   return div;
| 610 | }
| 611 |
| 612 | + function renderSourcesInPlace(container, sources) {
| 613 | +   const sourceTags = (sources || []).map(s => {
| 614 |     const cls = getSourceClass(s.book);
| 615 | +     return `<span class="source-tag ${cls}" title="${escapeHtml(s.snippet || '')}">π ${escapeHtml(s.book)}</span>`;
| 616 |   }).join("");
| 617 | +   if (sourceTags) {
| 618 | +     container.innerHTML = `
| 619 | +       <div class="sources">
| 620 | +         <div class="sources-label">Citations</div>
| 621 | +         <div class="source-tags">${sourceTags}</div>
| 622 | +       </div>`;
| 623 | +   }
| 624 | }
| 625 |
| 626 | + // ββ Core send flow βββββββββββββββββββββββββββββββββββββββββ
| 627 | async function sendQuestion() {
| 628 |   if (isLoading) return;
| 629 | +   const input = document.getElementById("questionInput");
| 630 |   const question = input.value.trim();
| 631 |   if (!question) return;
| 632 |
| 633 |   hideWelcome();
| 634 | +   const isFollowup = turnCount > 0;
| 635 | +
| 636 |   isLoading = true;
| 637 |   document.getElementById("sendBtn").disabled = true;
| 638 |   input.value = "";
| 639 |   input.style.height = "auto";
| 640 |
| 641 | +   appendUserMessage(question, isFollowup);
| 642 |   const loadingEl = appendLoading();
| 643 |
| 644 |   try {
| 645 | +     const payload = { question };
| 646 | +     if (sessionId) payload.session_id = sessionId;
| 647 | +
| 648 |     const res = await fetch(`${API_BASE}/ask`, {
| 649 | +       method: "POST",
| 650 |       headers: { "Content-Type": "application/json" },
| 651 | +       body: JSON.stringify(payload),
| 652 |     });
| 653 |
| 654 |     if (!res.ok) {
| 656 |       throw new Error(err.detail || "Server error");
| 657 |     }
| 658 |
| 659 | +     // Capture session ID returned by the server
| 660 | +     const returnedSession = res.headers.get("X-Session-Id");
| 661 | +     if (returnedSession) saveSession(returnedSession);
| 662 |
| 663 | +     // Set up streaming bubble
| 664 |     loadingEl.innerHTML = `
| 665 | +       <span class="msg-label">Sacred Texts</span>
| 666 | +       <div class="msg-bubble" id="currentStreamingMsg">
| 667 | +         <div class="loading-text">The scriptures are being revealed
| 668 | +           <span class="thinking-dots"><span></span><span></span><span></span></span>
| 669 | +         </div>
| 670 | +       </div>
| 671 | +       <div id="currentStreamingSources"></div>`;
| 672 | +
| 673 | +     const bubble = document.getElementById("currentStreamingMsg");
| 674 |     const sourcesContainer = document.getElementById("currentStreamingSources");
| 675 | +     let fullAnswer = "";
| 676 | +     let buffer = "";
| 677 | +     let firstToken = false;
| 678 | +
| 679 | +     const reader = res.body.getReader();
| 680 | +     const decoder = new TextDecoder();
| 681 |
| 682 |     while (true) {
| 683 |       const { done, value } = await reader.read();
| 684 |       if (done) break;
| 685 |
| 686 |       buffer += decoder.decode(value, { stream: true });
| 687 |       const lines = buffer.split("\n");
| 688 | +       buffer = lines.pop(); // keep incomplete line in buffer
| 689 |
| 690 |       for (const line of lines) {
| 691 |         if (!line.trim()) continue;
| 693 |           const parsed = JSON.parse(line);
| 694 |
| 695 |           if (parsed.type === "token") {
| 696 | +             if (!firstToken) { bubble.innerHTML = ""; firstToken = true; }
| 697 |             fullAnswer += parsed.data;
| 698 |             bubble.innerHTML = formatAnswer(fullAnswer);
| 699 |             scrollToBottom();
| 700 |           }
| 701 |           else if (parsed.type === "sources") {
| 702 | +             renderSourcesInPlace(sourcesContainer, parsed.data);
| 703 |           }
| 704 |           else if (parsed.type === "cache") {
| 705 |             bubble.innerHTML = formatAnswer(parsed.data.answer);
| 707 |             scrollToBottom();
| 708 |           }
| 709 |         } catch (e) {
| 710 | +           console.warn("Stream parse error:", e);
| 711 |         }
| 712 |       }
| 713 |     }
| 714 |
| 715 | +     // Increment turn counter
| 716 | +     turnCount++;
| 717 | +     updateSessionBar();
| 718 | +
| 719 | +     // Clean up streaming IDs
| 720 |     bubble.removeAttribute("id");
| 721 |     sourcesContainer.removeAttribute("id");
| 722 |
| 723 |   } catch (err) {
| 724 | +     loadingEl.innerHTML = `
| 725 | +       <span class="msg-label">Error</span>
| 726 | +       <div class="error-bubble">β οΈ ${escapeHtml(err.message)}</div>`;
| 727 | +     scrollToBottom();
| 728 |   } finally {
| 729 |     isLoading = false;
| 730 |     document.getElementById("sendBtn").disabled = false;
| 732 |   }
| 733 | }
| 735 | function askSuggested(btn) {
| 736 |   const input = document.getElementById("questionInput");
| 737 | +   input.value = btn.textContent.trim();
| 738 |   autoResize(input);
| 739 |   sendQuestion();
| 740 | }
| 745 |     sendQuestion();
| 746 |   }
| 747 | }
| 748 | +
| 749 | + // ββ Init βββββββββββββββββββββββββββββββββββββββββββββββββββ
| 750 | + loadSession();
| 751 | </script>
| 752 | </body>
| 753 | </html>
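The frontend loop above reassembles the backend's newline-delimited JSON stream chunk by chunk, keeping any incomplete trailing line buffered. The same logic in Python (the simulated byte chunks below are illustrative, not captured server output):

```python
import json


def parse_ndjson_chunks(chunks):
    """Reassemble newline-delimited JSON events from arbitrary byte chunks,
    keeping an incomplete trailing line buffered until it completes."""
    buffer = ""
    events = []
    for chunk in chunks:
        buffer += chunk.decode("utf-8")
        lines = buffer.split("\n")
        buffer = lines.pop()  # keep incomplete line in buffer
        for line in lines:
            if line.strip():
                events.append(json.loads(line))
    return events


# One event arrives whole; the second is split across two network chunks.
chunks = [b'{"type": "token", "data": "He"}\n{"type": "to',
          b'ken", "data": "llo"}\n']
events = parse_ndjson_chunks(chunks)
print([e["data"] for e in events])  # ['He', 'llo']
```

This is the same buffering trick as the JS `buffer = lines.pop()` line: only complete lines are parsed, so a JSON object torn across chunk boundaries is never fed to the parser half-finished.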
rag_chain.py
CHANGED

@@ -1,43 +1,38 @@
| 1 | """
| 2 | - rag_chain.py β Core RAG chain using LangChain +
| 3 | -
| 4 | - KEY
| 5 | -
| 6 | -
| 7 | -
| 8 | -
| 9 | -
| 10 | -
| 11 | -
| 12 | -
| 13 | -         "answer": "...",
| 14 | -         "sources": [
| 15 | -             {"book": "Bhagavad Gita", "page": 42, "snippet": "..."},
| 16 | -             ...
| 17 | -         ]
| 18 | -     }
| 19 | """
| 20 |
| 21 | import os
| 22 | -
| 23 | from dotenv import load_dotenv
| 24 | from langchain_nvidia_ai_endpoints import NVIDIAEmbeddings, ChatNVIDIA, NVIDIARerank
| 25 | from langchain_chroma import Chroma
| 26 | - from langchain_core.prompts import ChatPromptTemplate
| 27 | from langchain_core.output_parsers import StrOutputParser
| 28 | from langchain_community.retrievers import BM25Retriever
| 29 | from langchain_classic.retrievers import EnsembleRetriever, ContextualCompressionRetriever
| 30 | -
| 31 | - import json
| 32 |
| 33 | -
| 34 | - CHROMA_DB_PATH = os.getenv("CHROMA_DB_PATH", "./chroma_db")
| 35 | - COLLECTION_NAME = os.getenv("COLLECTION_NAME", "sacred_texts")
| 36 |
| 37 | -
| 38 | -
| 39 |
| 40 | - # All books currently in the knowledge base β add new books here as you ingest them
| 41 | KNOWN_BOOKS = [
| 42 |     "Bhagavad Gita",
| 43 |     "Quran",
@@ -45,8 +40,32 @@ KNOWN_BOOKS = [
| 45 |     "Guru Granth Sahib",
| 46 | ]
| 47 |
| 48 | - #
| 49 | -
| 50 |
| 51 | # βββ System Prompt ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 52 |
@@ -62,6 +81,10 @@ STRICT RULES you must ALWAYS follow:
| 62 |     address EACH of those books separately, then synthesise the common thread.
| 63 | 6. Be respectful and neutral toward all faiths β treat each text with equal reverence.
| 64 | 7. Do NOT speculate, invent verses, or add information beyond the context.
| 65 |
| 66 | FORMAT your response as:
| 67 | - A clear, thoughtful answer (2β4 paragraphs)
@@ -73,8 +96,6 @@ Context passages from the sacred texts (guaranteed passages from each book):
| 73 | ββββββββββββββββββββββββββββββββββββββββ
| 74 | """
| 75 |
| 76 | - HUMAN_PROMPT = "Question: {question}"
| 77 | -
| 78 |
| 79 | # βββ Embeddings & Vector Store ββββββββββββββββββββββββββββββββββββββββββββββββ
| 80 |
@@ -94,42 +115,17 @@ def get_vector_store(embeddings):
| 94 |     )
| 95 |
| 96 |
| 97 | - # βββ Per-Book Retrieval ββββββββββββββββββββββββββββββββββββββββββββββββ
| 98 | -
| 99 | - def get_reranked_retriever(base_retriever):
| 100 | -     """
| 101 | -     Wraps your Hybrid/Per-Book retriever with a Reranking layer.
| 102 | -     """
| 103 | -     # 1. Initialize the NVIDIA Reranker (NIM or API Catalog)
| 104 | -     # Using nvidia/llama-3.2-nv-rerankqa-1b-v2 or similar
| 105 | -     reranker = NVIDIARerank(
| 106 | -         model="nvidia/llama-3.2-nv-rerankqa-1b-v2",
| 107 | -         api_key=NVIDIA_API_KEY,
| 108 | -         top_n=5  # Only send the top 5 most relevant chunks to the LLM
| 109 | -     )
| 110 | -
| 111 | -     # 2. Wrap the base retriever
| 112 | -     compression_retriever = ContextualCompressionRetriever(
| 113 | -         base_compressor=reranker,
| 114 | -         base_retriever=base_retriever
| 115 | -     )
| 116 | -
| 117 | -     return compression_retriever
| 118 |
| 119 | def retrieve_per_book(question: str, vector_store: Chroma) -> list:
| 120 |     """
| 121 | -     Retrieve CHUNKS_PER_BOOK chunks from EACH known book independently
| 122 | -
| 123 | -     in the context β no book can be crowded out by higher-scoring chunks
| 124 | -     from another book.
| 125 |     """
| 126 |     all_candidates = []
| 127 | -
| 128 | -     # Detect if user is asking about a specific book
| 129 | -     target_books = []
| 130 |     question_lower = question.lower()
| 131 | -
| 132 | -
| 133 |     if any(kw in question_lower for kw in ["gita", "bhagavad", "hindu", "hinduism"]):
| 134 |         target_books.append("Bhagavad Gita")
| 135 |     if any(kw in question_lower for kw in ["quran", "koran", "islam", "muslim", "muhammad"]):
@@ -138,63 +134,52 @@ def retrieve_per_book(question: str, vector_store: Chroma) -> list:
| 138 |         target_books.append("Bible")
| 139 |     if any(kw in question_lower for kw in ["granth", "guru", "sikh", "sikhism", "nanak"]):
| 140 |         target_books.append("Guru Granth Sahib")
| 141 | -
| 142 | -     # If no specific book is detected, use all books
| 143 |     books_to_search = target_books if target_books else KNOWN_BOOKS
| 144 | -
| 145 |     print(f"π― Routing query to: {books_to_search}")
| 146 | -
| 147 |     for book in books_to_search:
| 148 |         try:
| 149 | -             # Increase k for the base retrieval to 10
| 150 | -             CANDIDATE_COUNT = 10
| 151 | -
| 152 | -             # Get the full collection of documents for this book to build BM25
| 153 | -             # For small demo, we can pull into memory; for larger corpora, consider a more efficient approach
| 154 |             book_data = vector_store.get(where={"book": book})
| 155 | -             book_docs = [
| 156 | -
| 157 | -
| 158 | -
| 159 |             if not book_docs:
| 160 |                 continue
| 161 | -
| 162 | -
| 163 | -             # Setup BM25
| 164 |             bm25_retriever = BM25Retriever.from_documents(book_docs)
| 165 |             bm25_retriever.k = CANDIDATE_COUNT
| 166 | -
| 167 | -
| 168 | -
| 169 | -
| 170 | -
| 171 | -
| 172 | -
| 173 | -
| 174 | -
| 175 | -
| 176 | -             book_candidates =
| 177 |             all_candidates.extend(book_candidates)
| 178 | -             print(f"  π¦ {book}:
| 179 | -
| 180 |         except Exception as e:
| 181 |             print(f"  β {book}: retrieval error β {e}")
| 182 | -
| 183 | -
| 184 | -     # Rerank the entire pool at once
| 185 |     if not all_candidates:
| 186 |         return []
| 187 | -
| 188 |     print(f"π Reranking {len(all_candidates)} total candidates...")
| 189 |     reranker = NVIDIARerank(
| 190 | -         model="nvidia/llama-3.2-nv-rerankqa-1b-v2",
| 191 |         api_key=NVIDIA_API_KEY,
| 192 | -         top_n=5
| 193 |     )
| 194 | -
| 195 | -
| 196 | -     final_docs = reranker.compress_documents(all_candidates, question)
| 197 | -
| 198 |     for i, doc in enumerate(final_docs):
| 199 |         score = doc.metadata.get("relevance_score", "N/A")
| 200 |         print(f"Rank {i+1} [{doc.metadata['book']}]: Score {score}")
@@ -205,11 +190,6 @@ def retrieve_per_book(question: str, vector_store: Chroma) -> list:
| 205 | # βββ Format Retrieved Docs ββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 206 |
| 207 | def format_docs(docs: list) -> str:
| 208 | -     """
| 209 | -     Format retrieved documents grouped by book for clarity.
| 210 | -     Each chunk is labelled with book and page number.
| 211 | -     """
| 212 | -     # Group by book to keep context readable
| 213 |     by_book: dict[str, list] = {}
| 214 |     for doc in docs:
| 215 |         book = doc.metadata.get("book", "Unknown")
@@ -220,19 +200,16 @@ def format_docs(docs: list) -> str:
| 220 |         header = f"βββ {book} βββ"
| 221 |         chunks = []
| 222 |         for i, doc in enumerate(book_docs, 1):
| 223 | -             page = doc.metadata.get("page", "?")
| 224 | -             ch = doc.metadata.get("chapter")
| 225 | -             vs = doc.metadata.get("verse")
| 226 |             ang = doc.metadata.get("ang")
| 227 | -
| 228 | -
| 229 |             if ang:
| 230 |                 citation = f"Ang {ang}"
| 231 |             elif ch and vs:
| 232 |                 citation = f"{ch}:{vs}"
| 233 |             else:
| 234 |                 citation = f"Page {doc.metadata.get('page', '?')}"
| 235 | -             chunks.append(f"  [{i}] ({citation}): {doc.page_content.strip()}")
| 236 |         sections.append(header + "\n" + "\n\n".join(chunks))
| 237 |
| 238 |     return "\n\n".join(sections)
@@ -241,8 +218,7 @@ def format_docs(docs: list) -> str:
| 241 | # βββ Build the RAG Chain ββββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 242 |
| 243 | def build_chain():
| 244 | -
| 245 | -     embeddings = get_embeddings()
| 246 |     vector_store = get_vector_store(embeddings)
| 247 |
| 248 |     llm = ChatNVIDIA(
@@ -253,137 +229,141 @@ def build_chain():
| 253 |         max_output_tokens=2048,
| 254 |     )
| 255 |
| 256 |     prompt = ChatPromptTemplate.from_messages([
| 257 |         ("system", SYSTEM_PROMPT),
| 258 | -         ("
| 259 |     ])
| 260 |
| 261 | -     # Chain: prompt β LLM β string output
| 262 | -     # (retrieval is handled manually in query_sacred_texts for per-book control)
| 263 |     llm_chain = prompt | llm | StrOutputParser()
| 264 | -
| 265 |     return llm_chain, vector_store
| 266 |
| 267 |
| 268 | - # βββ
| 269 |
| 270 | - _llm_chain
| 271 | _vector_store = None
| 272 |
| 273 |
| 274 | -
| 275 | -     """
| 276 | -     Query the sacred texts knowledge base with guaranteed per-book retrieval.
| 277 |
| 278 | -
| 279 | -
| 280 |
| 281 | -
| 282 | -         {
| 283 | -
| 284 | -
| 285 | -         }
| 286 |     """
| 287 |     global _llm_chain, _vector_store
| 288 |
| 289 |     if _llm_chain is None:
| 290 |         print("π§ Initialising RAG chain (first call)...")
| 291 |         _llm_chain, _vector_store = build_chain()
| 292 | -
| 293 | -     # --- Semantic cache check ---
| 294 | -     cache_coll = _vector_store._client.get_or_create_collection(CACHE_COLLECTION)
| 295 | -     cache_results = cache_coll.query(
| 296 | -         query_texts=[question],
| 297 | -         n_results=1
| 298 | -     )
| 299 |
| 300 | -
| 301 | -
| 302 | -
| 303 | -
| 304 | -
| 305 | -
| 306 | -
| 307 | -
| 308 | -
| 309 | -
| 310 | -
| 311 | -
| 312 |
| 313 |     if not source_docs:
| 314 |         yield json.dumps({"type": "token", "data": "No content found in the knowledge base."}) + "\n"
| 315 |         return
| 316 |
| 317 | -     #
| 318 | -     seen_sources = set()
| 319 |     sources = []
| 320 |     for doc in source_docs:
| 321 |         book = doc.metadata.get("book", "Unknown")
| 322 | -
| 323 | -
| 324 | -
| 325 | -
| 326 |         if ang:
| 327 |             cite_val = f"Ang {ang}"
| 328 |         elif ch and vs:
| 329 |             cite_val = f"{ch}:{vs}"
| 330 |         else:
| 331 |             cite_val = f"p. {doc.metadata.get('page', '?')}"
| 332 | -
| 333 |         display_name = f"{book} {cite_val}"
| 334 |         snippet = doc.page_content[:200].strip() + "..."
| 335 |         if display_name not in seen_sources:
| 336 |             seen_sources.add(display_name)
| 337 | -             print("Display name:", display_name)
| 338 | -             print("Page:", cite_val)
| 339 |             sources.append({"book": display_name, "page": cite_val, "snippet": snippet})
| 340 | -     # Step 2: Format context grouped by book
| 341 | -     context = format_docs(source_docs)
| 342 | -     full_answer = ""
| 343 |
| 344 | -
| 345 | -
| 346 |         full_answer += chunk
| 347 | -         yield json.dumps({"type": "token", "data": chunk}) + "\n"
| 348 | -
| 349 | -
| 350 | -
| 351 | -     final_sources = []
| 352 | -
| 353 | -
| 354 | -
| 355 | -
| 356 | -
| 357 | -
| 358 | -
| 359 | -
| 360 | -
| 361 | -
| 362 | -
| 363 | -
| 364 | -
| 365 | -
| 366 | -
| 367 | -
| 368 | -
| 369 | -
| 370 | -         metadatas=[{"response_json": json.dumps(result)}],
| 371 | -         ids=[question]
| 372 | -     )
| 373 | -
| 374 | -     # Send sources as a final message after the answer is fully streamed
| 375 |     yield json.dumps({"type": "sources", "data": sources}) + "\n"
| 376 | -
| 377 |
| 378 |
| 379 | # βββ Quick CLI Test βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 380 |
| 381 | if __name__ == "__main__":
| 382 | -     test_q = "
| 383 |     print(f"\nπ Test query: {test_q}\n")
| 384 | -
| 385 | -
| 386 | -
| 387 | -
| 388 | -
| 389 | -     print(f"  - {s['book']} (page {s['page']})")
| 1 | """
| 2 | + rag_chain.py β Core RAG chain using LangChain + NVIDIA.
| 3 | +
| 4 | + KEY FEATURES:
| 5 | +   - Per-book retrieval (guaranteed slots per scripture)
| 6 | +   - Hybrid BM25 + vector search with NVIDIA reranking
| 7 | +   - Semantic cache for repeated/similar questions
| 8 | +   - Multi-turn conversation memory (session-based ConversationBufferMemory)
| 9 | +
| 10 | + Public API:
| 11 | +     query_sacred_texts(question, session_id) -> Generator[str, None, None]
| 12 | +     clear_session(session_id)
| 13 | """
| 14 |
| 15 | import os
| 16 | + import json
| 17 | from dotenv import load_dotenv
| 18 | from langchain_nvidia_ai_endpoints import NVIDIAEmbeddings, ChatNVIDIA, NVIDIARerank
| 19 | from langchain_chroma import Chroma
| 20 | + from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
| 21 | from langchain_core.output_parsers import StrOutputParser
| 22 | + from langchain_core.messages import HumanMessage, AIMessage
| 23 | from langchain_community.retrievers import BM25Retriever
| 24 | from langchain_classic.retrievers import EnsembleRetriever, ContextualCompressionRetriever
| 25 | + from langchain_core.documents import Document
| 26 |
| 27 | + load_dotenv()
| 28 |
| 29 | + NVIDIA_API_KEY = os.getenv("NVIDIA_API_KEY")
| 30 | + CHROMA_DB_PATH = os.getenv("CHROMA_DB_PATH", "./chroma_db")
| 31 | + COLLECTION_NAME = os.getenv("COLLECTION_NAME", "sacred_texts")
| 32 | + CHUNKS_PER_BOOK = int(os.getenv("CHUNKS_PER_BOOK", "3"))
| 33 | + CACHE_COLLECTION = "semantic_cache"
| 34 | + MAX_HISTORY_TURNS = int(os.getenv("MAX_HISTORY_TURNS", "6"))  # last N human+AI pairs kept
| 35 |
| 36 | KNOWN_BOOKS = [
| 37 |     "Bhagavad Gita",
| 38 |     "Quran",
| 40 |     "Guru Granth Sahib",
| 41 | ]
| 42 |
| 43 | + # βββ In-memory session store ββββββββββββββββββββββββββββββββββββββββββββββββββ
| 44 | + # { session_id: [HumanMessage | AIMessage, ...] }
| 45 | + _session_store: dict[str, list] = {}
| 46 | +
| 47 | +
| 48 | + def get_history(session_id: str) -> list:
| 49 | +     return _session_store.get(session_id, [])
| 50 | +
| 51 | +
| 52 | + def append_turn(session_id: str, human_msg: str, ai_msg: str):
| 53 | +     history = _session_store.setdefault(session_id, [])
| 54 | +     history.append(HumanMessage(content=human_msg))
| 55 | +     history.append(AIMessage(content=ai_msg))
| 56 | +     # Trim to last MAX_HISTORY_TURNS pairs (each pair = 2 messages)
| 57 | +     if len(history) > MAX_HISTORY_TURNS * 2:
| 58 | +         _session_store[session_id] = history[-(MAX_HISTORY_TURNS * 2):]
| 59 | +
| 60 | +
| 61 | + def clear_session(session_id: str):
| 62 | +     """Wipe the conversation history for a session."""
| 63 | +     _session_store.pop(session_id, None)
| 64 | +
| 65 | +
| 66 | + def list_sessions() -> list[str]:
| 67 | +     return list(_session_store.keys())
| 68 | +
| 69 |
| 70 | # βββ System Prompt ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 71 |
| 81 |     address EACH of those books separately, then synthesise the common thread.
| 82 | 6. Be respectful and neutral toward all faiths β treat each text with equal reverence.
| 83 | 7. Do NOT speculate, invent verses, or add information beyond the context.
| 84 | + 8. You have access to the conversation history. Use it to:
| 85 | +    - Understand follow-up questions (e.g. "elaborate on the second point", "what about the Bible?")
| 86 | +    - Maintain continuity across turns without repeating yourself unnecessarily
| 87 | +    - Resolve pronouns and references ("it", "that teaching", "the verse you mentioned") from history
| 88 |
| 89 | FORMAT your response as:
| 90 | - A clear, thoughtful answer (2β4 paragraphs)
| 96 | ββββββββββββββββββββββββββββββββββββββββ
| 97 | """
| 98 |
| 99 |
| 100 | # βββ Embeddings & Vector Store ββββββββββββββββββββββββββββββββββββββββββββββββ
| 101 |
| 115 |     )
| 116 |
| 117 |
| 118 | + # βββ Per-Book Hybrid Retrieval ββββββββββββββββββββββββββββββββββββββββββββββββ
| 119 |
| 120 | def retrieve_per_book(question: str, vector_store: Chroma) -> list:
| 121 |     """
| 122 | +     Retrieve CHUNKS_PER_BOOK chunks from EACH known book independently using
| 123 | +     a hybrid BM25+vector ensemble, then rerank the pooled candidates.
| 124 |     """
| 125 |     all_candidates = []
| 126 |     question_lower = question.lower()
| 127 | +
| 128 | +     target_books = []
| 129 |     if any(kw in question_lower for kw in ["gita", "bhagavad", "hindu", "hinduism"]):
| 130 |         target_books.append("Bhagavad Gita")
| 131 |     if any(kw in question_lower for kw in ["quran", "koran", "islam", "muslim", "muhammad"]):
| 134 |         target_books.append("Bible")
| 135 |     if any(kw in question_lower for kw in ["granth", "guru", "sikh", "sikhism", "nanak"]):
| 136 |         target_books.append("Guru Granth Sahib")
| 137 | +
| 138 |     books_to_search = target_books if target_books else KNOWN_BOOKS
| 139 |     print(f"π― Routing query to: {books_to_search}")
| 140 | +
| 141 | +     CANDIDATE_COUNT = 10
| 142 | +
| 143 |     for book in books_to_search:
| 144 |         try:
| 145 |             book_data = vector_store.get(where={"book": book})
| 146 | +             book_docs = [
| 147 | +                 Document(page_content=d, metadata=m)
| 148 | +                 for d, m in zip(book_data["documents"], book_data["metadatas"])
| 149 | +             ]
| 150 |             if not book_docs:
| 151 |                 continue
| 152 | +
| 153 |             bm25_retriever = BM25Retriever.from_documents(book_docs)
| 154 |             bm25_retriever.k = CANDIDATE_COUNT
| 155 | +
| 156 | +             vector_retriever = vector_store.as_retriever(
| 157 | +                 search_kwargs={"k": CANDIDATE_COUNT, "filter": {"book": book}}
| 158 | +             )
| 159 | +
| 160 | +             ensemble = EnsembleRetriever(
| 161 | +                 retrievers=[bm25_retriever, vector_retriever],
| 162 | +                 weights=[0.5, 0.5],
| 163 | +             )
| 164 | +
| 165 | +             book_candidates = ensemble.invoke(question)
| 166 |             all_candidates.extend(book_candidates)
| 167 | +             print(f"  π¦ {book}: {len(book_candidates)} candidates")
| 168 | +
| 169 |         except Exception as e:
| 170 |             print(f"  β {book}: retrieval error β {e}")
| 171 | +
| 172 |     if not all_candidates:
| 173 |         return []
| 174 | +
| 175 |     print(f"π Reranking {len(all_candidates)} total candidates...")
| 176 |     reranker = NVIDIARerank(
| 177 | +         model="nvidia/llama-3.2-nv-rerankqa-1b-v2",
| 178 |         api_key=NVIDIA_API_KEY,
| 179 | +         top_n=5,
| 180 |     )
| 181 | +     final_docs = reranker.compress_documents(all_candidates, question)
| 182 | +
| 183 |     for i, doc in enumerate(final_docs):
| 184 |         score = doc.metadata.get("relevance_score", "N/A")
| 185 |         print(f"Rank {i+1} [{doc.metadata['book']}]: Score {score}")
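Under the hood, LangChain's `EnsembleRetriever` combines its child retrievers' ranked result lists with weighted Reciprocal Rank Fusion. A dependency-free sketch of that merge (the `c=60` constant follows the common RRF formulation, and the document IDs below are purely illustrative):

```python
def rrf_merge(result_lists, weights, c=60):
    """Weighted Reciprocal Rank Fusion: each list contributes
    weight / (c + rank) per document, summed across lists."""
    scores: dict[str, float] = {}
    for results, weight in zip(result_lists, weights):
        for rank, doc_id in enumerate(results, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + weight / (c + rank)
    return sorted(scores, key=scores.get, reverse=True)


# Illustrative per-book hit lists: keyword (BM25) vs. embedding (vector)
bm25_hits   = ["gita_2_47", "gita_3_19", "gita_18_66"]
vector_hits = ["gita_18_66", "gita_2_47", "gita_4_7"]

# gita_2_47 wins: it ranks high in BOTH lists, which RRF rewards
print(rrf_merge([bm25_hits, vector_hits], weights=[0.5, 0.5]))
```

This is why the hybrid ensemble surfaces chunks that are merely good in both signals ahead of chunks that top only one of them, before the NVIDIA reranker makes the final cut.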
| 190 | # βββ Format Retrieved Docs ββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 191 |
| 192 | def format_docs(docs: list) -> str:
| 193 |     by_book: dict[str, list] = {}
| 194 |     for doc in docs:
| 195 |         book = doc.metadata.get("book", "Unknown")
| 200 |         header = f"βββ {book} βββ"
| 201 |         chunks = []
| 202 |         for i, doc in enumerate(book_docs, 1):
| 203 |             ang = doc.metadata.get("ang")
| 204 | +             ch = doc.metadata.get("chapter")
| 205 | +             vs = doc.metadata.get("verse")
| 206 |             if ang:
| 207 |                 citation = f"Ang {ang}"
| 208 |             elif ch and vs:
| 209 |                 citation = f"{ch}:{vs}"
| 210 |             else:
| 211 |                 citation = f"Page {doc.metadata.get('page', '?')}"
| 212 | +             chunks.append(f"  [{i}] ({citation}): {doc.page_content.strip()}")
| 213 |         sections.append(header + "\n" + "\n\n".join(chunks))
| 214 |
| 215 |     return "\n\n".join(sections)
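The citation fallback inside `format_docs` (Ang for the Guru Granth Sahib, then chapter:verse, then the PDF page) can be isolated into a small helper; the metadata values below are illustrative:

```python
def make_citation(metadata: dict) -> str:
    # Prefer the most specific locator available: Ang (Guru Granth Sahib),
    # then chapter:verse, then fall back to the PDF page number.
    ang = metadata.get("ang")
    ch = metadata.get("chapter")
    vs = metadata.get("verse")
    if ang:
        return f"Ang {ang}"
    if ch and vs:
        return f"{ch}:{vs}"
    return f"Page {metadata.get('page', '?')}"


print(make_citation({"ang": 917}))                 # Ang 917
print(make_citation({"chapter": 2, "verse": 47}))  # 2:47
print(make_citation({"page": 103}))                # Page 103
```

Keeping the fallback ordered from most to least specific means ingesting a new scripture only requires tagging whichever locator its tradition actually uses.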
| 218 | # βββ Build the RAG Chain ββββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 219 |
| 220 | def build_chain():
| 221 | +     embeddings = get_embeddings()
| 222 |     vector_store = get_vector_store(embeddings)
| 223 |
| 224 |     llm = ChatNVIDIA(
| 229 |         max_output_tokens=2048,
| 230 |     )
| 231 |
| 232 | +     # Prompt now includes a chat-history placeholder so prior turns are visible
| 233 |     prompt = ChatPromptTemplate.from_messages([
| 234 |         ("system", SYSTEM_PROMPT),
| 235 | +         MessagesPlaceholder(variable_name="history"),  # β injected per-request
| 236 | +         ("human", "{question}"),
| 237 |     ])
| 238 |
| 239 |     llm_chain = prompt | llm | StrOutputParser()
| 240 |     return llm_chain, vector_store
| 241 |
| 242 |
| 243 | + # βββ Singleton init βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 244 |
| 245 | + _llm_chain = None
| 246 | _vector_store = None
| 247 |
| 248 |
| 249 | + # βββ Public API βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
| 250 |
| 251 | + def query_sacred_texts(question: str, session_id: str = "default"):
| 252 | +     """
| 253 | +     Stream an answer grounded in the sacred texts, maintaining per-session
| 254 | +     conversation history for natural follow-up questions.
| 255 |
| 256 | +     Yields JSON-lines of the form:
| 257 | +         {"type": "token", "data": "<chunk>"}
| 258 | +         {"type": "sources", "data": [...]}
| 259 | +         {"type": "cache", "data": {"answer": "...", "sources": [...]}}
| 260 |     """
| 261 |     global _llm_chain, _vector_store
| 262 |
| 263 |     if _llm_chain is None:
| 264 |         print("π§ Initialising RAG chain (first call)...")
| 265 |         _llm_chain, _vector_store = build_chain()
    # ── Semantic cache check (skip for follow-ups that reference history) ──
    history = get_history(session_id)
    is_followup = len(history) > 0

    if not is_followup:
        cache_coll = _vector_store._client.get_or_create_collection(CACHE_COLLECTION)
        cache_results = cache_coll.query(query_texts=[question], n_results=1)

        THRESHOLD = 0.35
        if cache_results["ids"] and cache_results["ids"][0]:
            distance = cache_results["distances"][0][0]
            if distance < THRESHOLD:
                print(f"⚡️ Semantic Cache Hit! (Distance: {distance:.4f})")
                cached = json.loads(cache_results["metadatas"][0][0]["response_json"])
                # Store this cache hit in session memory too
                append_turn(session_id, question, cached["answer"])
                yield json.dumps({"type": "cache", "data": cached}) + "\n"
                return

    # ── Retrieval ──────────────────────────────────────────────────────────
    # For follow-ups, augment the question with the last human turn for better
    # semantic search (the follow-up itself may be too short/vague)
    retrieval_query = question
    if is_followup and len(question.split()) < 8:
        last_human = next(
            (m.content for m in reversed(history) if isinstance(m, HumanMessage)), ""
        )
        retrieval_query = f"{last_human} {question}".strip()
        print(f"🔄 Follow-up detected → augmented retrieval query: '{retrieval_query}'")

    print(f"\n🔍 Retrieving chunks for: '{retrieval_query}'")
    source_docs = retrieve_per_book(retrieval_query, _vector_store)

    if not source_docs:
        yield json.dumps({"type": "token", "data": "No content found in the knowledge base."}) + "\n"
        return

    # ── Build sources list ─────────────────────────────────────────────────
    seen_sources: set[str] = set()
    sources = []
    for doc in source_docs:
        book = doc.metadata.get("book", "Unknown")
        ang = doc.metadata.get("ang")
        ch = doc.metadata.get("chapter")
        vs = doc.metadata.get("verse")

        if ang:
            cite_val = f"Ang {ang}"
        elif ch and vs:
            cite_val = f"{ch}:{vs}"
        else:
            cite_val = f"p. {doc.metadata.get('page', '?')}"

        display_name = f"{book} {cite_val}"
        snippet = doc.page_content[:200].strip() + "..."
        if display_name not in seen_sources:
            seen_sources.add(display_name)
            sources.append({"book": display_name, "page": cite_val, "snippet": snippet})

    context = format_docs(source_docs)
    full_answer = ""

    # ── Stream LLM response (history injected here) ────────────────────────
    for chunk in _llm_chain.stream({
        "context": context,
        "question": question,
        "history": history,  # ← the conversation so far
    }):
        full_answer += chunk
        yield json.dumps({"type": "token", "data": chunk}) + "\n"

    # ── Filter sources to those actually cited in the answer ───────────────
    answer_lower = full_answer.lower()
    final_sources = [s for s in sources if s["book"].lower() in answer_lower]

    # ── Persist this turn into session memory ──────────────────────────────
    append_turn(session_id, question, full_answer)
    print(f"💾 Session '{session_id}': {len(get_history(session_id)) // 2} turn(s) stored")

    # ── Cache first-turn answers only ──────────────────────────────────────
    if not is_followup:
        result_to_cache = {"answer": full_answer, "sources": final_sources}
        try:
            cache_coll = _vector_store._client.get_or_create_collection(CACHE_COLLECTION)
            cache_coll.add(
                documents=[question],
                metadatas=[{"response_json": json.dumps(result_to_cache)}],
                ids=[question],
            )
        except Exception as e:
            print(f"⚠️ Cache write failed: {e}")

    yield json.dumps({"type": "sources", "data": sources}) + "\n"

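Because `query_sacred_texts` yields newline-delimited JSON events, any consumer (the FastAPI endpoint in `app.py`, or the browser client) can dispatch on the `type` field line by line. A minimal sketch of such a consumer, using a stubbed generator in place of the real chain:

```python
import json

def fake_stream():
    # Stand-in for query_sacred_texts(): yields the same NDJSON event shapes.
    yield json.dumps({"type": "token", "data": "Forgiveness "}) + "\n"
    yield json.dumps({"type": "token", "data": "is central."}) + "\n"
    yield json.dumps({"type": "sources", "data": [{"book": "Bible 18:21"}]}) + "\n"

def consume(stream):
    """Accumulate tokens and collect sources from an NDJSON event stream."""
    answer, sources = "", []
    for line in stream:
        event = json.loads(line)
        if event["type"] == "token":
            answer += event["data"]          # streamed answer fragment
        elif event["type"] == "sources":
            sources = event["data"]          # citation list, sent last
        elif event["type"] == "cache":
            # Cache hits arrive as one complete payload instead of tokens.
            answer, sources = event["data"]["answer"], event["data"]["sources"]
    return answer, sources
```

One JSON object per line keeps the protocol trivially parseable mid-stream, since each `\n` marks a complete, independently decodable event.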

# ─── Quick CLI Test ───────────────────────────────────────────────────────────

if __name__ == "__main__":
    test_q = "What do the scriptures say about forgiveness?"
    print(f"\n🙏 Test query: {test_q}\n")
    for line in query_sacred_texts(test_q, session_id="cli-test"):
        obj = json.loads(line)
        if obj["type"] == "token":
            print(obj["data"], end="", flush=True)
    print("\n")
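The semantic cache in `query_sacred_texts` hinges on one decision: reuse a stored answer only when the nearest cached question sits within `THRESHOLD = 0.35` embedding distance. That logic can be isolated and tested against the nested-list dict shape that Chroma's `Collection.query()` returns (the sample values below are fabricated for illustration):

```python
import json

THRESHOLD = 0.35  # same cutoff as in query_sacred_texts

def cache_hit(cache_results, threshold=THRESHOLD):
    """Return the cached payload if the nearest neighbour is close enough, else None."""
    if cache_results["ids"] and cache_results["ids"][0]:
        distance = cache_results["distances"][0][0]
        if distance < threshold:
            return json.loads(cache_results["metadatas"][0][0]["response_json"])
    return None

# Fabricated results in the shape chromadb's Collection.query() returns:
# one inner list per query text, nearest neighbour first.
near = {
    "ids": [["q1"]],
    "distances": [[0.12]],
    "metadatas": [[{"response_json": json.dumps({"answer": "Cached.", "sources": []})}]],
}
far = {
    "ids": [["q1"]],
    "distances": [[0.80]],
    "metadatas": [[{"response_json": "{}"}]],
}
```

Tuning the threshold trades freshness for speed: too low and near-duplicate phrasings miss the cache; too high and semantically different questions get a stale answer.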