# AI InterviewMentor — Architecture Overview

## System Architecture

```
Browser (React SPA)
        │
        ▼
HF Spaces Docker Container (port 7860)
├── FastAPI (Python 3.11)
│   ├── /api/auth/*       → Custom JWT auth (jose + bcrypt)
│   ├── /api/batches/*    → Batch/class management
│   ├── /api/topics/*     → Topic CRUD + unlock control
│   ├── /api/upload       → CSV question bank ingestion
│   ├── /api/sessions/*   → Session detail + reports
│   ├── /api/instructor/* → Dashboard + student analytics
│   ├── /api/student/*    → Student dashboard
│   ├── /interview/*      → LangGraph interview engine
│   └── / (catch-all)     → React static files (Vite build)
│
├── LangGraph Engine
│   ├── ask_question      → Picks next question from bank
│   ├── evaluate_answer   → Scores: strong | shallow | wrong
│   ├── counter_question  → Probing follow-up for shallow answers
│   ├── summarize         → Compresses conversation every 4 turns
│   └── generate_report   → Final scored feedback JSON
│
└── External Services
    ├── NeonDB (PostgreSQL)
    │   ├── public schema → App data (users, batches, topics, questions, sessions)
    │   └── checkpointer  → LangGraph state (auto-managed)
    └── OpenRouter API → MiniMax 2.7 model
```

## Tech Stack

| Layer | Technology |
|-------|------------|
| Frontend | React 18 + Vite + TypeScript + Tailwind CSS v3 (dark theme only) |
| Backend | FastAPI (Python 3.11) |
| AI Orchestration | LangGraph (state machine with checkpointing) |
| Database | NeonDB (PostgreSQL) — app data + LangGraph checkpoints |
| AI Gateway | OpenRouter → MiniMax 2.7 |
| Auth | Custom JWT (jose + bcrypt) — no third-party auth |
| Deployment | Single Docker container on Hugging Face Spaces (port 7860) |

## Key Design Decisions

1. **Single container deployment** — React static build served by FastAPI, no split deployment
2. **Same-origin API calls** — no CORS needed; the browser hits FastAPI directly
3. **Two DB schemas** — `public` for app data, `checkpointer` for LangGraph state
4. **Token budget control** — summarize every 4 turns, ~700-token ceiling per LLM call
5. **session_id = thread_id** — natural scoping between DB sessions and LangGraph state
6. **CSV-to-DB question ingestion** — no file storage needed (no S3/R2)
7. **Custom JWT only** — Supabase/Firebase explicitly banned (hackathon rules)

## Database Schema

### Tables

- **users** — id, full_name, email, password_hash, role (student|instructor), batch_id
- **refresh_tokens** — id, user_id, token_hash, expires_at
- **batches** — id, name, instructor_id, class_code (auto-generated)
- **topics** — id, batch_id, name, is_unlocked, order_index
- **questions** — id, topic_id, question_text, difficulty (easy|medium|hard)
- **interview_sessions** — id, student_id, topic_id, status, score, feedback (JSONB)

### Relationships

```
instructor (user) → creates batches → has topics → has questions
student (user)    → joins batch via class_code → takes interview_sessions on topics
```

## Auth Flow

- **Access token**: HS256 JWT, 15-minute expiry, sent in the `Authorization: Bearer` header
- **Refresh token**: 7-day expiry, bcrypt-hashed in the DB, sent as an httpOnly cookie
- **Route protection**: the `get_current_user` dependency extracts the JWT → `require_instructor` / `require_student` role guards

## LangGraph Interview Flow

```
START → ask_question → [wait for student answer] → evaluate_answer
                                    │
          ┌─────────────────────────┼──────────────────┐
          ▼                         ▼                  ▼
  counter_question            ask_question       generate_report → END
          │                   (or summarize
          ▼                    every 4 turns)
  evaluate_answer
```

- **8 turns max** or question bank exhausted → generate_report
- **Counter-questions** fire once per shallow answer (no double-countering)
- **Summarize node** compresses messages every 4 turns to keep token usage flat

## Frontend Architecture

- **Routing**: React Router with a `ProtectedRoute` wrapper
- **State**: Zustand stores — `authStore` (JWT + user), `interviewStore` (session state)
- **API layer**: single `apiFetch` wrapper with auto-refresh on 401
- **Pages**: Login, Signup, StudentDashboard, Interview, Report, InstructorDashboard, Upload, StudentDetail
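The auth flow can be sketched end to end with the standard library alone. This is an illustration of the HS256 signing and verification that the jose library performs for the access token; the secret, claim names, and helper functions here are hypothetical, not the app's real code.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # illustrative only; a real key would come from env config


def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def make_access_token(user_id: int, role: str, ttl: int = 15 * 60) -> str:
    """Build an HS256 JWT like the 15-minute access token issued by /api/auth/*."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(
        {"sub": str(user_id), "role": role, "exp": int(time.time()) + ttl}
    ).encode())
    sig = hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    return f"{header}.{payload}.{_b64url(sig)}"


def verify_access_token(token: str) -> dict:
    """What a get_current_user-style dependency checks: signature, then expiry."""
    header, payload, sig = token.split(".")
    expected = hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(_b64url(expected), sig):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims


token = make_access_token(42, "student")
claims = verify_access_token(token)
print(claims["role"])  # → student
```

A role guard like `require_instructor` would then just compare `claims["role"]` against the required role and raise a 403 on mismatch.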
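The interview flow's routing rules (8-turn cap, at most one counter-question per shallow answer, summarize every 4 turns) can be sketched as a single routing function. The state fields and node names below are assumptions that mirror the diagram, not the engine's actual LangGraph graph definition.

```python
# Hypothetical sketch of the turn-routing rules the interview engine encodes.
# State fields (turn, question_bank, last_score, countered_current) are assumed
# names, not the real state schema; node names mirror the flow diagram.
MAX_TURNS = 8
SUMMARIZE_EVERY = 4


def route_after_evaluate(state: dict) -> str:
    """Pick the node that follows evaluate_answer."""
    # Hard stops: turn budget spent or question bank exhausted → final report.
    if state["turn"] >= MAX_TURNS or not state["question_bank"]:
        return "generate_report"
    # A shallow answer earns exactly one probing follow-up (no double-countering).
    if state["last_score"] == "shallow" and not state["countered_current"]:
        return "counter_question"
    # Every 4th turn, compress the conversation first to keep token usage flat.
    if state["turn"] % SUMMARIZE_EVERY == 0:
        return "summarize"
    return "ask_question"


print(route_after_evaluate(
    {"turn": 3, "question_bank": ["q4"], "last_score": "shallow", "countered_current": False}
))  # → counter_question
```

In the real graph this kind of function would sit behind a conditional edge out of `evaluate_answer`, with the checkpointer persisting the state between turns under `thread_id = session_id`.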
## Scope Boundary (Not In V1)

Voice input, email notifications, leaderboard, certificates, question bank editor UI, light theme, multi-instructor batches.