---
title: Enterprise Loan AI
emoji: 🏦
colorFrom: yellow
colorTo: gray
sdk: docker
pinned: false
license: mit
---

# 🏦 Enterprise Loan AI: Adaptive Predictive Ecosystem

**Enterprise-grade financial decision engine combining Hybrid ML Logic with Mistral Large 3 Generative Insights.**

---

## πŸ—οΈ 1. System Architecture

The system runs as a two-stage pipeline: a deterministic ML model scores each application first, and the LLM interprets that result afterward, so the numerical verdict never depends on generative output.

```mermaid
graph TD
    User((User)) -->|Submit Application| UI(React/Vite Frontend)
    UI -->|API POST /predict| API(FastAPI Backend)
    
    subgraph "Hybrid Inference Engine"
        API -->|1. Deterministic Map| ML(sklearn Random Forest)
        ML -->|Probabilities| ADVISOR(NVIDIA NIM Mistral-3)
        ADVISOR -->|2. Structured Narrative| RESPONSE(Final JSON Packet)
    end
    
    RESPONSE -->|Persistence| DB[(SQLite / PostgreSQL)]
    RESPONSE -->|Visual Analytics| DASH(Radar & Distribution Charts)
    DASH -->|Render| User
```
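The two-stage flow in the diagram can be sketched in plain Python. This is a minimal illustration, not the project's actual code: `score_application` stands in for the sklearn Random Forest and `generate_narrative` for the NVIDIA NIM Mistral-3 call, and both names (plus the toy scoring heuristic) are hypothetical.

```python
# Sketch of the hybrid inference flow (hypothetical names, stdlib only).
# Stage 1 is deterministic scoring; stage 2 wraps the score in a narrative.

def score_application(features: dict) -> float:
    """Stand-in for the Random Forest: returns an approval probability."""
    # Toy heuristic in place of the trained model.
    ratio = features["income"] / max(features["loan_amount"], 1)
    return min(1.0, round(ratio / 5, 2))

def generate_narrative(probability: float) -> str:
    """Stand-in for the NVIDIA NIM Mistral-3 advisor."""
    verdict = "likely approval" if probability >= 0.5 else "elevated risk"
    return f"Model probability {probability:.0%} suggests {verdict}."

def predict(features: dict) -> dict:
    """Final JSON packet combining both stages."""
    p = score_application(features)
    return {"probability": p, "narrative": generate_narrative(p)}

packet = predict({"income": 90_000, "loan_amount": 30_000})
```

The key design point mirrored here is the ordering: the narrative receives the already-computed probability and can only describe it, never change it.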

---

## 📂 2. Repository Structure

```text
├── backend/              # FastAPI Application Source
│   ├── db/               # Persistence & Data Models
│   ├── logic/            # Deterministic ML Engine (sklearn)
│   ├── services/         # LLM Advisor (NVIDIA NIM Integration)
│   ├── main.py           # API Entry Point & Startup Logic
│   └── requirements.txt  # Python Dependencies
├── data/                 # Training datasets (CSV)
├── frontend/             # React (Vite) Application
│   ├── src/
│   │   ├── components/   # Modular UI Components (Charts, Form, Sidebar)
│   │   ├── App.jsx       # State-Based Navigation & Layout
│   │   └── index.css     # Premium UI Design System
│   └── package.json      # Node.js Dependencies
├── Dockerfile            # Multi-stage Production Build
├── .dockerignore         # Docker Build Exclusions
├── .gitignore            # Git Excluded Files
└── .env.example          # Environment Management Template
```

---

## 🚀 3. End-to-End Setup Guide

### **A. Local Development**

#### **1. Backend (Python 3.11+)**
```powershell
# Create Virtual Environment
python -m venv venv
.\venv\Scripts\activate

# Install Project as Editable Package
pip install -e .

# Start Server
uvicorn backend.main:app --reload --port 8000
```

#### **2. Frontend (Node.js 20+)**
```powershell
cd frontend
npm install
npm run dev
```

### **B. Environment Configuration**
Create a `.env` file in the root directory:
```env
# Required for AI Analysis
NVIDIA_API_KEY=your_key_here

# Optional: enables persistent cloud backups
HF_TOKEN=your_huggingface_write_token
HF_REPO_ID=your_username/your_space_name
```
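On the backend, these variables are typically read once at startup. The helper below is an illustrative sketch using only the standard library, not the project's actual startup code; it fails fast on the one required key and treats the backup settings as optional.

```python
import os

def load_config() -> dict:
    """Read the .env-provided variables, failing fast on the required key."""
    api_key = os.getenv("NVIDIA_API_KEY")
    if not api_key:
        raise RuntimeError("NVIDIA_API_KEY is required for AI analysis")
    return {
        "nvidia_api_key": api_key,
        # Optional cloud-backup settings; sync stays off without them.
        "hf_token": os.getenv("HF_TOKEN"),
        "hf_repo_id": os.getenv("HF_REPO_ID"),
    }

os.environ["NVIDIA_API_KEY"] = "demo-key"  # simulate a populated .env
config = load_config()
```

Failing fast at startup surfaces a missing API key immediately, rather than at the first `/predict` call.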

### **C. Cloud Synchronization (New)**
The system includes an **Automated Cloud Sync** service.
- When running in a Docker/Hugging Face environment with a valid `HF_TOKEN`, the system automatically backs up your assessment history to your Space's repository whenever you make a new prediction or clear the history.
- This means your data persists even if the Space's ephemeral container is restarted.

---

## 🐳 4. Production Deployment (Hugging Face)

This project is optimized for deployment as a **Hugging Face Space** using the **Docker** runtime.

### **1. Build & Run Locally**
```powershell
# Build Image
docker build -t loan-prediction-app .

# Run Container (mount a host folder at /app/data to persist history; host path is an example)
docker run -d -p 7860:7860 -v ${PWD}/data:/app/data --name loan-app loan-prediction-app
```

### **2. Deploy to Hugging Face**
1. Create a new Space on [huggingface.co](https://huggingface.co/spaces) selecting the **Docker** SDK.
2. In the Space **Settings**:
   - Add your `NVIDIA_API_KEY` as a **Secret**.
   - (Optional) Enable **Persistent Storage** and mount to `/app/data`.
3. Push your code. The Space will automatically build and launch your dashboard.

---

## ✅ 5. Platform Features
- **Deterministic Math**: Validated Random Forest scoring.
- **AI Narrative Sub-Cards**: Readable, point-by-point financial insights.
- **Radar Comparisons**: Real-time benchmarking against successful profiles.
- **ChatGPT-Style History Sidebar**: Persistent task tracking with "Clear History" support.

---
*Built for High-Trust Lending Environments.*