---
title: Enterprise Loan AI
emoji: 🏦
colorFrom: yellow
colorTo: gray
sdk: docker
pinned: false
license: mit
---

# 🏦 Enterprise Loan AI: Adaptive Predictive Ecosystem

**Enterprise-grade financial decision engine combining Hybrid ML Logic with Mistral Large 3 Generative Insights.**

---

## 🏗️ 1. System Architecture

The ecosystem operates as a high-fidelity diagnostic pipeline, ensuring mathematical rigor before AI interpretation.

```mermaid
graph TD
    User((User)) -->|Submit Application| UI(React/Vite Frontend)
    UI -->|API POST /predict| API(FastAPI Backend)
    subgraph "Hybrid Inference Engine"
        API -->|1. Deterministic Map| ML(sklearn Random Forest)
        ML -->|Probabilities| ADVISOR(NVIDIA NIM Mistral-3)
        ADVISOR -->|2. Structured Narrative| RESPONSE(Final JSON Packet)
    end
    RESPONSE -->|Persistence| DB[(SQLite / PostgreSQL)]
    RESPONSE -->|Visual Analytics| DASH(Radar & Distribution Charts)
    DASH -->|Render| User
```

---

## 📂 2. Repository Structure

```text
├── backend/              # FastAPI Application Source
│   ├── db/               # Persistence & Data Models
│   ├── logic/            # Deterministic ML Engine (sklearn)
│   ├── services/         # LLM Advisor (NVIDIA NIM Integration)
│   ├── main.py           # API Entry Point & Startup Logic
│   └── requirements.txt  # Python Dependencies
├── data/                 # Training Datasets (CSV)
├── frontend/             # React (Vite) Application
│   ├── src/
│   │   ├── components/   # Modular UI Components (Charts, Form, Sidebar)
│   │   ├── App.jsx       # State-Based Navigation & Layout
│   │   └── index.css     # Premium UI Design System
│   └── package.json      # Node.js Dependencies
├── Dockerfile            # Multi-stage Production Build
├── .dockerignore         # Docker Build Exclusions
├── .gitignore            # Git Excluded Files
└── .env.example          # Environment Management Template
```

---

## 🚀 3. End-to-End Setup Guide

### **A. Local Development**

#### **1. Backend (Python 3.11+)**

```powershell
# Create Virtual Environment
python -m venv venv
.\venv\Scripts\activate

# Install Project as Editable Package
pip install -e .
# Start Server
uvicorn backend.main:app --reload --port 8000
```

#### **2. Frontend (Node.js 20+)**

```powershell
cd frontend
npm install
npm run dev
```

### **B. Environment Configuration**

Create a `.env` file in the root directory:

```env
# Required for AI Analysis
NVIDIA_API_KEY=your_key_here

# Required for Persistent Cloud Backups (Optional)
HF_TOKEN=your_huggingface_write_token
HF_REPO_ID=your_username/your_space_name
```

### **C. Cloud Synchronization (New)**

The system now features an **Automated Cloud Sync** service.

- When running in a Docker/Hugging Face environment with a valid `HF_TOKEN`, the system automatically backs up your assessment history to your Space's repository every time you make a new prediction or clear the history.
- This ensures your data persists even if the Space's ephemeral container is restarted.

---

## 🐳 4. Production Deployment (Hugging Face)

This project is optimized for deployment as a **Hugging Face Space** using the **Docker** runtime.

### **1. Build & Run Locally**

```powershell
# Build Image
docker build -t loan-prediction-app .

# Run Container (history persistence requires mounting a volume at /app/data)
docker run -d -p 7860:7860 -v ${PWD}/data:/app/data --name loan-app loan-prediction-app
```

### **2. Deploy to Hugging Face**

1. Create a new Space on [huggingface.co](https://huggingface.co/spaces), selecting the **Docker** SDK.
2. In the Space **Settings**:
   - Add your `NVIDIA_API_KEY` as a **Secret**.
   - (Optional) Enable **Persistent Storage** and mount it to `/app/data`.
3. Push your code. The Space will automatically build and launch your dashboard.

---

## ✅ 5. Platform Features

- **Deterministic Math**: Validated Random Forest scoring.
- **AI Narrative Sub-Cards**: Readable, point-by-point financial insights.
- **Radar Comparisons**: Real-time benchmarking against successful profiles.
- **ChatGPT-Style History Sidebar**: Persistent task tracking with "Clear History" support.

---

*Built for High-Trust Lending Environments.*
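## 📡 Appendix: Calling the Prediction API

Once the backend is running, applications are submitted as JSON to the `POST /predict` endpoint shown in the architecture diagram. The sketch below is a minimal Python client; the field names in the payload are illustrative assumptions, so check the auto-generated FastAPI docs at `http://localhost:8000/docs` for the authoritative request and response schema.

```python
# Minimal client sketch for the POST /predict endpoint (stdlib only).
# NOTE: the payload fields below are hypothetical examples, not the
# confirmed request model -- verify them against /docs before use.
import json
import urllib.request

API_URL = "http://localhost:8000/predict"  # default local backend port

# Hypothetical applicant payload -- adjust to the real request model.
application = {
    "applicant_income": 5400,
    "loan_amount": 128000,
    "credit_history": 1,
    "employment_years": 6,
}

def predict(payload: dict, url: str = API_URL) -> dict:
    """POST the application JSON and return the decoded response packet."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Usage (with the backend running):
#   result = predict(application)
#   print(json.dumps(result, indent=2))
```

The response packet should contain the Random Forest probabilities alongside the Mistral-generated narrative; its exact keys depend on the backend's response model.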