---
title: Enterprise Loan AI
emoji: 🏦
colorFrom: yellow
colorTo: gray
sdk: docker
pinned: false
license: mit
---
# 🏦 Enterprise Loan AI: Adaptive Predictive Ecosystem
Enterprise-grade financial decision engine combining Hybrid ML Logic with Mistral Large 3 Generative Insights.
## 🏗️ 1. System Architecture
The ecosystem operates as a high-fidelity diagnostic pipeline, ensuring mathematical rigor before AI interpretation.
```mermaid
graph TD
    User((User)) -->|Submit Application| UI(React/Vite Frontend)
    UI -->|API POST /predict| API(FastAPI Backend)

    subgraph "Hybrid Inference Engine"
        API -->|1. Deterministic Map| ML(sklearn Random Forest)
        ML -->|Probabilities| ADVISOR(NVIDIA NIM Mistral-3)
        ADVISOR -->|2. Structured Narrative| RESPONSE(Final JSON Packet)
    end

    RESPONSE -->|Persistence| DB[(SQLite / PostgreSQL)]
    RESPONSE -->|Visual Analytics| DASH(Radar & Distribution Charts)
    DASH -->|Render| User
```
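The two-stage handoff above can be sketched in plain Python. The function and field names below are illustrative assumptions, not the actual backend API; the scoring and advisor stages are stand-ins so the example stays self-contained:

```python
# Sketch of the hybrid inference handoff: deterministic scoring first,
# generative narrative second. All names here are hypothetical.

def score_application(features: dict) -> dict:
    """Stage 1 (stand-in for the sklearn Random Forest): derive class
    probabilities deterministically so the sketch runs without a model."""
    ratio = features["loan_amount"] / max(features["income"], 1)
    p_approve = max(0.0, min(1.0, 1.0 - ratio))
    return {"approve": round(p_approve, 2), "reject": round(1 - p_approve, 2)}

def advise(probabilities: dict) -> str:
    """Stage 2 (stand-in for the Mistral advisor): the real service would
    call NVIDIA NIM here; we template a narrative from the probabilities."""
    verdict = "favorable" if probabilities["approve"] >= 0.5 else "unfavorable"
    return f"Outlook is {verdict} (approval probability {probabilities['approve']:.0%})."

def predict(features: dict) -> dict:
    """Final JSON packet combining both stages, as in the diagram."""
    probs = score_application(features)
    return {"probabilities": probs, "narrative": advise(probs)}

print(predict({"income": 80_000, "loan_amount": 20_000}))
# → {'probabilities': {'approve': 0.75, 'reject': 0.25},
#    'narrative': 'Outlook is favorable (approval probability 75%).'}
```

The key point the diagram makes is ordering: the LLM never produces the numbers, it only narrates probabilities the deterministic stage has already fixed.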
## 📂 2. Repository Structure
```
├── backend/              # FastAPI Application Source
│   ├── db/               # Persistence & Data Models
│   ├── logic/            # Deterministic ML Engine (sklearn)
│   ├── services/         # LLM Advisor (NVIDIA NIM Integration)
│   ├── main.py           # API Entry Point & Startup Logic
│   └── requirements.txt  # Python Dependencies
├── data/                 # Training datasets (CSV)
├── frontend/             # React (Vite) Application
│   ├── src/
│   │   ├── components/   # Modular UI Components (Charts, Form, Sidebar)
│   │   ├── App.jsx       # State-Based Navigation & Layout
│   │   └── index.css     # Premium UI Design System
│   └── package.json      # Node.js Dependencies
├── Dockerfile            # Multi-stage Production Build
├── .dockerignore         # Docker Build Exclusions
├── .gitignore            # Git Excluded Files
└── .env.example          # Environment Management Template
```
## 🚀 3. End-to-End Setup Guide
### A. Local Development

#### 1. Backend (Python 3.11+)

```bash
# Create and activate a virtual environment
python -m venv venv
.\venv\Scripts\activate      # Windows
# source venv/bin/activate   # macOS / Linux

# Install the project as an editable package
pip install -e .

# Start the server
uvicorn backend.main:app --reload --port 8000
```
#### 2. Frontend (Node.js 20+)

```bash
cd frontend
npm install
npm run dev
```
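In local development the Vite dev server and the FastAPI backend run on different ports, so API calls from the UI need to reach `http://localhost:8000`. One common way to bridge them is a Vite dev-server proxy; the snippet below is a sketch of such a config, not necessarily what this project ships:

```javascript
// vite.config.js — hypothetical dev proxy so /predict calls reach FastAPI.
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      // Forward API calls to the backend started with uvicorn on port 8000.
      '/predict': 'http://localhost:8000',
    },
  },
});
```

With a proxy like this the frontend can call `/predict` with a relative URL in both development and the single-container production build.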
### B. Environment Configuration

Create a `.env` file in the root directory:

```bash
# Required for AI analysis
NVIDIA_API_KEY=your_key_here

# Optional: enables persistent cloud backups
HF_TOKEN=your_huggingface_write_token
HF_REPO_ID=your_username/your_space_name
```
### C. Cloud Synchronization (New)

The system now features an automated Cloud Sync service:

- When running in a Docker/Hugging Face environment with a valid `HF_TOKEN`, the system automatically backs up your assessment history to your Space's repository every time you make a new prediction or clear the history.
- This ensures your data persists even if the Space's ephemeral container is restarted.
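The shape of such a sync hook might look like the sketch below. The function name and signature are assumptions about the service layer in `backend/services/`; only the `huggingface_hub` call it wraps is a real API:

```python
import os

def sync_history(db_path: str, uploader=None) -> bool:
    """Hypothetical sync hook: back up the local assessment history to the
    Space repo when credentials are present; a no-op otherwise. `uploader`
    is injectable for testing -- in production it would wrap
    huggingface_hub's HfApi().upload_file."""
    token = os.environ.get("HF_TOKEN")
    repo_id = os.environ.get("HF_REPO_ID")
    if not (token and repo_id and os.path.exists(db_path)):
        return False  # local-only run: keep data on disk, skip the upload
    if uploader is None:
        from huggingface_hub import HfApi  # imported lazily; only needed here

        def uploader():
            HfApi(token=token).upload_file(
                path_or_fileobj=db_path,
                path_in_repo=os.path.basename(db_path),
                repo_id=repo_id,
                repo_type="space",
            )
    uploader()
    return True
```

Calling a hook like this after every write to the history store is what makes the ephemeral container safe to restart.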
## 🐳 4. Production Deployment (Hugging Face)
This project is optimized for deployment as a Hugging Face Space using the Docker runtime.
### 1. Build & Run Locally

```bash
# Build the image
docker build -t loan-prediction-app .

# Run the container; mounting /app/data (host path here is an example)
# keeps assessment history across container restarts
docker run -d -p 7860:7860 -v "$(pwd)/data:/app/data" --name loan-app loan-prediction-app
```
### 2. Deploy to Hugging Face

- Create a new Space on huggingface.co, selecting the Docker SDK.
- In the Space Settings:
  - Add your `NVIDIA_API_KEY` as a Secret.
  - (Optional) Enable Persistent Storage and mount it to `/app/data`.
- Push your code. The Space will automatically build and launch your dashboard.
## ✅ 5. Platform Features

- **Deterministic Math**: Validated Random Forest scoring.
- **AI Narrative Sub-Cards**: Readable, point-by-point financial insights.
- **Radar Comparisons**: Real-time benchmarking against successful profiles.
- **ChatGPT-Style History Sidebar**: Persistent task tracking with "Clear History" support.
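A radar comparison like the one listed above presumably plots each applicant metric as a fraction of a benchmark drawn from approved profiles. A hedged sketch of that normalization, with invented metric names and benchmark values:

```python
def radar_axes(applicant: dict, benchmark: dict) -> dict:
    """Scale each metric relative to the approved-profile benchmark,
    capped at 1.0, ready to feed a radar chart. Names are hypothetical."""
    return {
        metric: round(min(applicant.get(metric, 0) / target, 1.0), 2)
        for metric, target in benchmark.items()
        if target > 0
    }

# Hypothetical benchmark built from historically approved applications.
benchmark = {"income": 75_000, "credit_score": 720, "years_employed": 5}
applicant = {"income": 60_000, "credit_score": 700, "years_employed": 8}
print(radar_axes(applicant, benchmark))
# → {'income': 0.8, 'credit_score': 0.97, 'years_employed': 1.0}
```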
Built for High-Trust Lending Environments.