ysharma HF Staff commited on
Commit
1de5dd0
·
verified ·
1 Parent(s): 54fc6f2

Upload app_v5.py

Files changed (1)
  1. app_v5.py +1711 -0
app_v5.py ADDED
@@ -0,0 +1,1711 @@
+ """
+ PII Reveal - Document Privacy Explorer (v5)
+ ============================================
+ Changes from v4 (feedback):
+
+ 1. Light theme refresh
+    Crisper, higher-contrast neutral palette: bright white cards on a very
+    light grey body, near-black body text, and subdued text bumped one
+    stop darker so labels stop disappearing. Category highlight alpha
+    bumped from 12% to 16% on light, 15% on dark.
+
+ 2. PDF redaction export
+    New POST /api/redact-pdf endpoint. Accepts the original PDF plus the
+    list of spans + active labels the client is viewing, applies true
+    PyMuPDF redactions (black fill, underlying text removed; not just a
+    visual overlay), and streams the result back. The inspector gets an
+    "Export PDF" primary button when the input file is a PDF.
+
+ 3. Performance
+    Two code-level fixes for the 100k-token slowness on T4:
+
+    a) predict_text: dropped the unbind(0) -> list -> stack(0) roundtrip
+       in favour of a single torch.cat. It was allocating 100k separate
+       33-wide tensors and re-stacking them for no reason.
+
+    b) Decoder.decode: the Viterbi loop is inherently sequential and
+       launches O(seq_len) CUDA kernels; on Turing (T4, compute 7.5)
+       kernel-launch overhead dominated because the state space is tiny
+       (33 classes). It now runs on CPU, which is bandwidth-bound on a
+       33x33 matrix and completes in a couple of seconds for 100k tokens.
+
+    Also cached the Decoder itself with lru_cache (was being rebuilt
+    per request).
+
+    Hardware note: T4 is pre-Ampere and has no native bf16 support, so
+    every attention matmul is emulated. Code-level changes help the
+    decoder, but the model's attention pass will still be faster on L4 /
+    A10 / A100 by a large factor. The v5 fixes were validated against the
+    "is it code or hardware" question: both, but the decoder path was the
+    dominant contribution for a 100k document.
+ """
+
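Item 3a of the changelog is easy to demonstrate in isolation. A minimal sketch with synthetic per-chunk log-prob tensors (the shapes here are made up, not the real model's):

```python
import torch

# Synthetic stand-ins for per-chunk log-probs: 3 chunks of 4 tokens x 33 labels.
chunks = [torch.randn(4, 33) for _ in range(3)]

# v4-style roundtrip: explode every row into its own tensor, then re-stack.
rows = [r for c in chunks for r in c.unbind(0)]  # one small tensor per token
v4 = torch.stack(rows, dim=0)

# v5-style: a single concatenation along the sequence dimension.
v5 = torch.cat(chunks, dim=0)

assert torch.equal(v4, v5)  # identical result, one allocation instead of thousands
```

Both paths produce the same (12, 33) tensor; the difference is purely allocation count, which is what dominated on 100k-token inputs.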
+ # ── stdlib ───────────────────────────────────────────────────────
+ import dataclasses
+ import functools
+ import io
+ import json
+ import math
+ import os
+ import re
+ import tempfile
+ from bisect import bisect_left, bisect_right
+ from collections.abc import Sequence
+ from dataclasses import dataclass
+ from pathlib import Path
+ from typing import Final
+
+ # ── third-party ──────────────────────────────────────────────────
+ import gradio as gr
+ import spaces
+ import tiktoken
+ import torch
+ import torch.nn.functional as F
+ from fastapi import File, Form, UploadFile
+ from fastapi.responses import HTMLResponse, JSONResponse, StreamingResponse
+ from huggingface_hub import snapshot_download
+ from safetensors import safe_open
+
+ # ── configuration ────────────────────────────────────────────────
+ MODEL_REPO = os.getenv("MODEL_ID", "charles-first-org/second-model")
+ HF_TOKEN = os.getenv("HF_TOKEN", None)
+ MODEL_DIR = Path(snapshot_download(MODEL_REPO, token=HF_TOKEN))
+
+ CATEGORIES_META = {
+     "private_person": {"color": "#E24B4A", "cls": "hp", "label": "Person", "mono": False},
+     "private_date": {"color": "#7F77DD", "cls": "hd", "label": "Date", "mono": True},
+     "private_address": {"color": "#1D9E75", "cls": "ha", "label": "Address", "mono": False},
+     "private_email": {"color": "#378ADD", "cls": "he", "label": "Email", "mono": True},
+     "account_number": {"color": "#BA7517", "cls": "hac", "label": "Account", "mono": True},
+     "private_url": {"color": "#D85A30", "cls": "hu", "label": "URL", "mono": True},
+     "secret": {"color": "#D4537E", "cls": "hs", "label": "Secret", "mono": True},
+     "private_phone": {"color": "#639922", "cls": "hph", "label": "Phone", "mono": True},
+ }
+
+ # =====================================================================
+ # MODEL ARCHITECTURE + INFERENCE (from reference implementation)
+ # =====================================================================
+
+ PRIVACY_FILTER_MODEL_TYPE: Final[str] = "privacy_filter"
+ REQUIRED_MODEL_CONFIG_KEYS: Final[tuple[str, ...]] = (
+     "model_type", "encoding", "num_hidden_layers", "num_experts",
+     "experts_per_token", "vocab_size", "num_labels", "hidden_size",
+     "intermediate_size", "head_dim", "num_attention_heads",
+     "num_key_value_heads", "sliding_window", "bidirectional_context",
+     "bidirectional_left_context", "bidirectional_right_context",
+     "default_n_ctx", "initial_context_length", "rope_theta",
+     "rope_scaling_factor", "rope_ntk_alpha", "rope_ntk_beta", "param_dtype",
+ )
+ BACKGROUND_CLASS_LABEL: Final[str] = "O"
+ BOUNDARY_PREFIXES: Final[tuple[str, ...]] = ("B", "I", "E", "S")
+ SPAN_CLASS_NAMES: Final[tuple[str, ...]] = (
+     BACKGROUND_CLASS_LABEL,
+     "account_number", "private_address", "private_date", "private_email",
+     "private_person", "private_phone", "private_url", "secret",
+ )
+ NER_CLASS_NAMES: Final[tuple[str, ...]] = (BACKGROUND_CLASS_LABEL,) + tuple(
+     f"{prefix}-{base}"
+     for base in SPAN_CLASS_NAMES if base != BACKGROUND_CLASS_LABEL
+     for prefix in BOUNDARY_PREFIXES
+ )
+ VITERBI_TRANSITION_BIAS_KEYS: Final[tuple[str, ...]] = (
+     "transition_bias_background_stay", "transition_bias_background_to_start",
+     "transition_bias_inside_to_continue", "transition_bias_inside_to_end",
+     "transition_bias_end_to_background", "transition_bias_end_to_start",
+ )
+ DEFAULT_VITERBI_CALIBRATION_PRESET: Final[str] = "default"
+
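The label-space arithmetic encoded in the constants above can be checked standalone; a sketch that re-derives (rather than imports) the values:

```python
# Re-derive the BIES label space: 1 background label plus one
# (B, I, E, S) boundary tag per PII category gives 1 + 4 * 8 = 33,
# which is where the num_labels == 33 checkpoint contract comes from.
prefixes = ("B", "I", "E", "S")
categories = (
    "account_number", "private_address", "private_date", "private_email",
    "private_person", "private_phone", "private_url", "secret",
)
labels = ("O",) + tuple(f"{p}-{c}" for c in categories for p in prefixes)

assert len(labels) == 1 + len(prefixes) * len(categories) == 33
```

This matches the `num_labels != 33` guard in `validate_model_config_contract` below: the model head has exactly one logit per BIES-tagged class plus the background class.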
+
+ def validate_model_config_contract(cfg: dict, *, context: str) -> None:
+     missing = [k for k in REQUIRED_MODEL_CONFIG_KEYS if k not in cfg]
+     if missing:
+         raise ValueError(f"{context} missing keys: {', '.join(missing)}")
+     if cfg.get("model_type") != PRIVACY_FILTER_MODEL_TYPE:
+         raise ValueError(f"{context} model_type must be {PRIVACY_FILTER_MODEL_TYPE!r}")
+     if cfg.get("bidirectional_context") is not True:
+         raise ValueError(f"{context} must use bidirectional_context=true")
+     lc, rc = cfg.get("bidirectional_left_context"), cfg.get("bidirectional_right_context")
+     if not isinstance(lc, int) or not isinstance(rc, int) or lc != rc or lc < 0:
+         raise ValueError(f"{context} bidirectional context must be equal non-negative ints")
+     sw = cfg.get("sliding_window")
+     if sw != 2 * lc + 1:
+         raise ValueError(f"{context} sliding_window must equal 2*context+1")
+     if cfg["num_labels"] != 33:
+         raise ValueError(f"{context} num_labels must be 33")
+     if cfg["param_dtype"] != "bfloat16":
+         raise ValueError(f"{context} param_dtype must be bfloat16")
+
+
+ # ── model helpers ────────────────────────────────────────────────
+
+ def expert_linear(x: torch.Tensor, weight: torch.Tensor, bias: torch.Tensor | None) -> torch.Tensor:
+     n, e, k = x.shape
+     _, _, _, o = weight.shape
+     out = torch.bmm(x.reshape(n * e, 1, k), weight.reshape(n * e, k, o)).reshape(n, e, o)
+     return out + bias if bias is not None else out
+
+
+ @dataclass
+ class ModelConfig:
+     num_hidden_layers: int; num_experts: int; experts_per_token: int
+     vocab_size: int; num_labels: int; hidden_size: int; intermediate_size: int
+     head_dim: int; num_attention_heads: int; num_key_value_heads: int
+     bidirectional_context_size: int; initial_context_length: int
+     rope_theta: float; rope_scaling_factor: float; rope_ntk_alpha: float; rope_ntk_beta: float
+
+     @classmethod
+     def from_checkpoint_config(cls, cfg: dict, *, context: str) -> "ModelConfig":
+         cfg = dict(cfg)
+         cfg["bidirectional_context_size"] = cfg["bidirectional_left_context"]
+         fields = {f.name for f in dataclasses.fields(cls)}
+         return cls(**{k: v for k, v in cfg.items() if k in fields})
+
+
+ class RMSNorm(torch.nn.Module):
+     def __init__(self, n: int, eps: float = 1e-5, device=None):
+         super().__init__()
+         self.eps = eps
+         self.scale = torch.nn.Parameter(torch.ones(n, device=device, dtype=torch.float32))
+
+     def forward(self, x):
+         t = x.float()
+         return (t * torch.rsqrt(t.pow(2).mean(-1, keepdim=True) + self.eps) * self.scale).to(x.dtype)
+
+ def apply_rope(x, cos, sin):
+     cos = cos.unsqueeze(-2).to(x.dtype); sin = sin.unsqueeze(-2).to(x.dtype)
+     x1, x2 = x[..., ::2], x[..., 1::2]
+     return torch.stack((x1 * cos - x2 * sin, x2 * cos + x1 * sin), dim=-1).reshape(x.shape)
+
+
+ class RotaryEmbedding(torch.nn.Module):
+     def __init__(self, head_dim, base, dtype, *, initial_context_length=4096,
+                  scaling_factor=1.0, ntk_alpha=1.0, ntk_beta=32.0, device=None):
+         super().__init__()
+         self.head_dim, self.base, self.dtype = head_dim, base, dtype
+         self.initial_context_length = initial_context_length
+         self.scaling_factor, self.ntk_alpha, self.ntk_beta = scaling_factor, ntk_alpha, ntk_beta
+         self.device = device
+         mp = max(int(initial_context_length * scaling_factor), initial_context_length)
+         self.max_position_embeddings = mp
+         cos, sin = self._compute(mp, device=torch.device("cpu"))
+         target = device or torch.device("cpu")
+         self.register_buffer("cos_cache", cos.to(target), persistent=False)
+         self.register_buffer("sin_cache", sin.to(target), persistent=False)
+
+     def _inv_freq(self, device=None):
+         device = device or self.device
+         freq = self.base ** (torch.arange(0, self.head_dim, 2, dtype=torch.float, device=device) / self.head_dim)
+         if self.scaling_factor > 1.0:
+             d_half = self.head_dim / 2
+             low = d_half * math.log(self.initial_context_length / (self.ntk_beta * 2 * math.pi)) / math.log(self.base)
+             high = d_half * math.log(self.initial_context_length / (self.ntk_alpha * 2 * math.pi)) / math.log(self.base)
+             interp = 1.0 / (self.scaling_factor * freq)
+             extrap = 1.0 / freq
+             ramp = (torch.arange(d_half, dtype=torch.float32, device=device) - low) / (high - low)
+             mask = 1 - ramp.clamp(0, 1)
+             return interp * (1 - mask) + extrap * mask
+         return 1.0 / freq
+
+     def _compute(self, n, device=None):
+         inv_freq = self._inv_freq(device)
+         t = torch.arange(n, dtype=torch.float32, device=device or self.device)
+         freqs = torch.einsum("i,j->ij", t, inv_freq)
+         c = 0.1 * math.log(self.scaling_factor) + 1.0 if self.scaling_factor > 1.0 else 1.0
+         return (freqs.cos() * c).to(self.dtype), (freqs.sin() * c).to(self.dtype)
+
+     def forward(self, q, k):
+         n = q.shape[0]
+         if n > self.cos_cache.shape[0]:
+             cos, sin = self._compute(n, torch.device("cpu"))
+             self.cos_cache, self.sin_cache = cos.to(q.device), sin.to(q.device)
+         cc = self.cos_cache.to(q.device) if self.cos_cache.device != q.device else self.cos_cache
+         sc = self.sin_cache.to(q.device) if self.sin_cache.device != q.device else self.sin_cache
+         cos, sin = cc[:n], sc[:n]
+         q = apply_rope(q.view(n, -1, self.head_dim), cos, sin).reshape(q.shape)
+         k = apply_rope(k.view(n, -1, self.head_dim), cos, sin).reshape(k.shape)
+         return q, k
+
+
+ def sdpa(Q, K, V, S, sm_scale, ctx):
+     n, nh, qm, hd = Q.shape
+     w = 2 * ctx + 1
+     Kp = F.pad(K, (0, 0, 0, 0, ctx, ctx)); Vp = F.pad(V, (0, 0, 0, 0, ctx, ctx))
+     Kw = Kp.unfold(0, w, 1).permute(0, 3, 1, 2); Vw = Vp.unfold(0, w, 1).permute(0, 3, 1, 2)
+     idx = torch.arange(w, device=Q.device) - ctx
+     pos = torch.arange(n, device=Q.device)[:, None] + idx[None, :]
+     valid = (pos >= 0) & (pos < n)
+     scores = torch.einsum("nhqd,nwhd->nhqw", Q, Kw).float() * sm_scale
+     scores = scores.masked_fill(~valid[:, None, None, :], -float("inf"))
+     sink = (S * math.log(2.0)).reshape(nh, qm)[None, :, :, None].expand(n, -1, -1, 1)
+     scores = torch.cat([scores, sink], dim=-1)
+     wt = torch.softmax(scores, dim=-1)[..., :-1].to(V.dtype)
+     return torch.einsum("nhqw,nwhd->nhqd", wt, Vw).reshape(n, -1)
+
+
+ class AttentionBlock(torch.nn.Module):
+     def __init__(self, cfg: ModelConfig, device=None):
+         super().__init__()
+         dt = torch.bfloat16
+         self.head_dim, self.nah, self.nkv = cfg.head_dim, cfg.num_attention_heads, cfg.num_key_value_heads
+         self.ctx = int(cfg.bidirectional_context_size)
+         self.sinks = torch.nn.Parameter(torch.empty(cfg.num_attention_heads, device=device, dtype=torch.float32))
+         self.norm = RMSNorm(cfg.hidden_size, device=device)
+         qkv_d = cfg.head_dim * (cfg.num_attention_heads + 2 * cfg.num_key_value_heads)
+         self.qkv = torch.nn.Linear(cfg.hidden_size, qkv_d, device=device, dtype=dt)
+         self.out = torch.nn.Linear(cfg.head_dim * cfg.num_attention_heads, cfg.hidden_size, device=device, dtype=dt)
+         self.qk_scale = 1 / math.sqrt(math.sqrt(cfg.head_dim))
+         self.rope = RotaryEmbedding(cfg.head_dim, int(cfg.rope_theta), torch.float32,
+                                     initial_context_length=cfg.initial_context_length,
+                                     scaling_factor=cfg.rope_scaling_factor,
+                                     ntk_alpha=cfg.rope_ntk_alpha, ntk_beta=cfg.rope_ntk_beta, device=device)
+
+     def forward(self, x):
+         t = self.norm(x).to(self.qkv.weight.dtype)
+         qkv = F.linear(t, self.qkv.weight, self.qkv.bias)
+         hd, nah, nkv = self.head_dim, self.nah, self.nkv
+         q = qkv[:, :nah * hd].contiguous()
+         k = qkv[:, nah * hd:(nah + nkv) * hd].contiguous()
+         v = qkv[:, (nah + nkv) * hd:(nah + 2 * nkv) * hd].contiguous()
+         q, k = self.rope(q, k)
+         q, k = q * self.qk_scale, k * self.qk_scale
+         n = q.shape[0]
+         q = q.view(n, nkv, nah // nkv, hd); k = k.view(n, nkv, hd); v = v.view(n, nkv, hd)
+         ao = sdpa(q, k, v, self.sinks, 1.0, self.ctx).to(self.out.weight.dtype)
+         return x + F.linear(ao, self.out.weight, self.out.bias).to(x.dtype)
+
+
+ def swiglu(x, alpha=1.702, limit=7.0):
+     g, l = x.chunk(2, dim=-1)
+     g, l = g.clamp(max=limit), l.clamp(-limit, limit)
+     return g * torch.sigmoid(alpha * g) * (l + 1)
+
+
+ class MLPBlock(torch.nn.Module):
+     def __init__(self, cfg: ModelConfig, device=None):
+         super().__init__()
+         dt = torch.bfloat16
+         self.ne, self.ept = cfg.num_experts, cfg.experts_per_token
+         self.norm = RMSNorm(cfg.hidden_size, device=device)
+         self.gate = torch.nn.Linear(cfg.hidden_size, cfg.num_experts, device=device, dtype=dt)
+         self.mlp1_weight = torch.nn.Parameter(torch.empty(cfg.num_experts, cfg.hidden_size, cfg.intermediate_size * 2, device=device, dtype=dt))
+         self.mlp1_bias = torch.nn.Parameter(torch.empty(cfg.num_experts, cfg.intermediate_size * 2, device=device, dtype=dt))
+         self.mlp2_weight = torch.nn.Parameter(torch.empty(cfg.num_experts, cfg.intermediate_size, cfg.hidden_size, device=device, dtype=dt))
+         self.mlp2_bias = torch.nn.Parameter(torch.empty(cfg.num_experts, cfg.hidden_size, device=device, dtype=dt))
+
+     def forward(self, x):
+         t = self.norm(x)
+         gs = F.linear(t.float(), self.gate.weight.float(), self.gate.bias.float())
+         top = torch.topk(gs, k=self.ept, dim=-1, sorted=True)
+         ew = torch.softmax(top.values, dim=-1) / self.ept
+         ei = top.indices
+         ept = self.ept
+
+         def _chunk(tc, eic, ewc):
+             o = expert_linear(tc.float().unsqueeze(1).expand(-1, eic.shape[1], -1),
+                               self.mlp1_weight[eic].float(), self.mlp1_bias[eic].float())
+             o = swiglu(o)
+             o = expert_linear(o.float(), self.mlp2_weight[eic].float(), self.mlp2_bias[eic].float())
+             return (torch.einsum("bec,be->bc", o.to(ewc.dtype), ewc) * ept).to(x.dtype)
+
+         cs = 32
+         if t.shape[0] > cs:
+             parts = [_chunk(t[s:s+cs], ei[s:s+cs], ew[s:s+cs]) for s in range(0, t.shape[0], cs)]
+             return x + torch.cat(parts, 0)
+         return x + _chunk(t, ei, ew)
+
+
+ class TransformerBlock(torch.nn.Module):
+     def __init__(self, cfg, device=None):
+         super().__init__()
+         self.attn = AttentionBlock(cfg, device=device)
+         self.mlp = MLPBlock(cfg, device=device)
+
+     def forward(self, x):
+         return self.mlp(self.attn(x))
+
+
+ class Checkpoint:
+     @staticmethod
+     def build_param_name_map(n):
+         return ({f"block.{i}.mlp.mlp1_bias": f"block.{i}.mlp.swiglu.bias" for i in range(n)}
+                 | {f"block.{i}.mlp.mlp1_weight": f"block.{i}.mlp.swiglu.weight" for i in range(n)}
+                 | {f"block.{i}.mlp.mlp2_bias": f"block.{i}.mlp.out.bias" for i in range(n)}
+                 | {f"block.{i}.mlp.mlp2_weight": f"block.{i}.mlp.out.weight" for i in range(n)})
+
+     def __init__(self, path, device, num_hidden_layers):
+         self.pnm = self.build_param_name_map(num_hidden_layers)
+         self.ds = device.type if device.index is None else f"{device.type}:{device.index}"
+         files = [os.path.join(path, f) for f in os.listdir(path) if f.endswith(".safetensors")]
+         self.map = {}
+         for sf in files:
+             with safe_open(sf, framework="pt", device=self.ds) as h:
+                 for k in h.keys():
+                     self.map[k] = sf
+
+     def get(self, name):
+         mapped = self.pnm.get(name, name)
+         with safe_open(self.map[mapped], framework="pt", device=self.ds) as h:
+             return h.get_tensor(mapped)
+
+
+ class Transformer(torch.nn.Module):
+     def __init__(self, cfg, device):
+         super().__init__()
+         dt = torch.bfloat16
+         self.embedding = torch.nn.Embedding(cfg.vocab_size, cfg.hidden_size, device=device, dtype=dt)
+         self.block = torch.nn.ModuleList([TransformerBlock(cfg, device=device) for _ in range(cfg.num_hidden_layers)])
+         self.norm = RMSNorm(cfg.hidden_size, device=device)
+         self.unembedding = torch.nn.Linear(cfg.hidden_size, cfg.num_labels, bias=False, device=device, dtype=dt)
+
+     def forward(self, token_ids):
+         x = self.embedding(token_ids)
+         for blk in self.block:
+             x = blk(x)
+         return F.linear(self.norm(x), self.unembedding.weight, None)
+
+     @classmethod
+     def from_checkpoint(cls, checkpoint_dir, *, device):
+         torch.backends.cuda.matmul.allow_tf32 = False
+         torch.backends.cudnn.allow_tf32 = False
+         torch.set_float32_matmul_precision("highest")
+         cp = json.loads((Path(checkpoint_dir) / "config.json").read_text())
+         validate_model_config_contract(cp, context=str(checkpoint_dir))
+         cfg = ModelConfig.from_checkpoint_config(cp, context=str(checkpoint_dir))
+         ckpt = Checkpoint(checkpoint_dir, device, cfg.num_hidden_layers)
+         m = cls(cfg, device); m.eval()
+         for name, param in m.named_parameters():
+             loaded = ckpt.get(name)
+             if param.shape != loaded.shape:
+                 raise ValueError(f"Shape mismatch {name}: {param.shape} vs {loaded.shape}")
+             param.data.copy_(loaded)
+         return m
+
+
+ # ── label info + span decoding ───────────────────────────────────
+
+ @dataclass(frozen=True)
+ class LabelInfo:
+     boundary_label_lookup: dict[str, dict[str, int]]
+     token_to_span_label: dict[int, int]
+     token_boundary_tags: dict[int, str | None]
+     span_class_names: tuple[str, ...]
+     span_label_lookup: dict[str, int]
+     background_token_label: int
+     background_span_label: int
+
+
+ def labels_to_spans(labels_by_index, label_info):
+     spans, cur_label, start_idx, prev_idx = [], None, None, None
+     bg = label_info.background_span_label
+     for ti in sorted(labels_by_index):
+         lid = labels_by_index[ti]
+         sl = label_info.token_to_span_label.get(lid)
+         bt = label_info.token_boundary_tags.get(lid)
+         if prev_idx is not None and ti != prev_idx + 1:
+             if cur_label is not None and start_idx is not None:
+                 spans.append((cur_label, start_idx, prev_idx + 1))
+             cur_label = start_idx = None
+         if sl is None:
+             prev_idx = ti; continue
+         if sl == bg:
+             if cur_label is not None and start_idx is not None:
+                 spans.append((cur_label, start_idx, ti))
+             cur_label = start_idx = None; prev_idx = ti; continue
+         if bt == "S":
+             if cur_label is not None and start_idx is not None and prev_idx is not None:
+                 spans.append((cur_label, start_idx, prev_idx + 1))
+             spans.append((sl, ti, ti + 1)); cur_label = start_idx = None
+         elif bt == "B":
+             if cur_label is not None and start_idx is not None and prev_idx is not None:
+                 spans.append((cur_label, start_idx, prev_idx + 1))
+             cur_label, start_idx = sl, ti
+         elif bt == "I":
+             if cur_label is None or cur_label != sl:
+                 if cur_label is not None and start_idx is not None and prev_idx is not None:
+                     spans.append((cur_label, start_idx, prev_idx + 1))
+                 cur_label, start_idx = sl, ti
+         elif bt == "E":
+             if cur_label is None or cur_label != sl or start_idx is None:
+                 if cur_label is not None and start_idx is not None and prev_idx is not None:
+                     spans.append((cur_label, start_idx, prev_idx + 1))
+                 spans.append((sl, ti, ti + 1)); cur_label = start_idx = None
+             else:
+                 spans.append((cur_label, start_idx, ti + 1)); cur_label = start_idx = None
+         else:
+             if cur_label is not None and start_idx is not None and prev_idx is not None:
+                 spans.append((cur_label, start_idx, prev_idx + 1))
+             cur_label = start_idx = None
+         prev_idx = ti
+     if cur_label is not None and start_idx is not None and prev_idx is not None:
+         spans.append((cur_label, start_idx, prev_idx + 1))
+     return spans
+
+
+ def token_spans_to_char_spans(spans, cs, ce):
+     out = []
+     for li, ts, te in spans:
+         if not (0 <= ts < te <= len(cs)):
+             continue
+         s, e = cs[ts], ce[te - 1]
+         if e > s:
+             out.append((li, s, e))
+     return out
+
+
+ def trim_char_spans_whitespace(spans, text):
+     out = []
+     for li, s, e in spans:
+         if not (0 <= s < e <= len(text)):
+             continue
+         while s < e and text[s].isspace(): s += 1
+         while e > s and text[e - 1].isspace(): e -= 1
+         if e > s:
+             out.append((li, s, e))
+     return out
+
+
+ # ── viterbi decoder ──────────────────────────────────────────────
+
+ @functools.lru_cache(maxsize=1)
+ def get_viterbi_transition_biases():
+     cp = MODEL_DIR / "viterbi_calibration.json"
+     default = {k: 0.0 for k in VITERBI_TRANSITION_BIAS_KEYS}
+     if not cp.is_file():
+         return default
+     payload = json.loads(cp.read_text())
+     raw = payload
+     ops = payload.get("operating_points")
+     if isinstance(ops, dict):
+         preset = ops.get(DEFAULT_VITERBI_CALIBRATION_PRESET)
+         if isinstance(preset, dict):
+             raw = preset.get("biases", raw)
+     if not isinstance(raw, dict):
+         return default
+     return {k: float(raw.get(k, 0.0)) for k in VITERBI_TRANSITION_BIAS_KEYS}
+
+
+ class Decoder:
+     def __init__(self, label_info):
+         nc = len(label_info.token_to_span_label)
+         self._start = torch.full((nc,), -1e9, dtype=torch.float32)
+         self._end = torch.full((nc,), -1e9, dtype=torch.float32)
+         self._trans = torch.full((nc, nc), -1e9, dtype=torch.float32)
+         biases = get_viterbi_transition_biases()
+         bg_tok, bg_sp = label_info.background_token_label, label_info.background_span_label
+         ttsl, tbt = label_info.token_to_span_label, label_info.token_boundary_tags
+         for i in range(nc):
+             tag, sl = tbt.get(i), ttsl.get(i)
+             if tag in {"B", "S"} or i == bg_tok: self._start[i] = 0.0
+             if tag in {"E", "S"} or i == bg_tok: self._end[i] = 0.0
+             for j in range(nc):
+                 nt, ns = tbt.get(j), ttsl.get(j)
+                 if self._valid(tag, sl, nt, ns, bg_tok, bg_sp, j):
+                     self._trans[i, j] = self._bias(tag, sl, nt, ns, bg_sp, biases)
+
+     @staticmethod
+     def _valid(pt, ps, nt, ns, bti, bsi, ni):
+         nb = ns == bsi or ni == bti
+         if (ns is None or nt is None) and not nb: return False
+         if pt is None or ps is None: return nb or nt in {"B", "S"}
+         if ps == bsi or pt in {"E", "S"}: return nb or nt in {"B", "S"}
+         if pt in {"B", "I"}: return ps == ns and nt in {"I", "E"}
+         return False
+
+     @staticmethod
+     def _bias(pt, ps, nt, ns, bsi, b):
+         nb, pb = ns == bsi, ps == bsi
+         if pb: return b["transition_bias_background_stay"] if nb else b["transition_bias_background_to_start"]
+         if pt in {"B", "I"}: return b["transition_bias_inside_to_continue"] if nt == "I" else b["transition_bias_inside_to_end"]
+         return b["transition_bias_end_to_background"] if nb else b["transition_bias_end_to_start"]
+
+     def decode(self, lp):
+         # Sequential Viterbi over a tiny (33-class) state space. On T4 the
+         # per-step CUDA kernel launches dominated runtime for 100k+ tokens,
+         # so run on CPU unconditionally; it's bandwidth-bound on 33x33 and
+         # avoids one CUDA sync per timestep.
+         if lp.is_cuda:
+             lp = lp.to("cpu", dtype=torch.float32, non_blocking=True)
+         else:
+             lp = lp.to(dtype=torch.float32)
+         sl, nc = lp.shape
+         if sl == 0: return []
+         st, en, tr = self._start, self._end, self._trans  # already CPU/fp32
+         scores = lp[0] + st
+         bp = torch.empty((sl - 1, nc), dtype=torch.int64)
+         for i in range(1, sl):
+             t = scores.unsqueeze(1) + tr
+             bs, bi = t.max(dim=0)
+             scores = bs + lp[i]; bp[i - 1] = bi
+         if not torch.isfinite(scores).any(): return lp.argmax(dim=1).tolist()
+         scores = scores + en
+         path = torch.empty(sl, dtype=torch.int64)
+         path[-1] = scores.argmax()
+         for i in range(sl - 2, -1, -1): path[i] = bp[i, path[i + 1]]
+         return path.tolist()
+
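The decode loop above reduces to a standard max-plus recurrence. A toy two-state version (with a made-up uniform transition matrix, not the real biased 33-class one) shows the shape of the work done once per token:

```python
import torch

# Toy emissions for 3 timesteps over 2 states, plus a uniform transition
# matrix, so the best path simply follows the per-step argmax.
lp = torch.log(torch.tensor([[0.9, 0.1], [0.1, 0.9], [0.7, 0.3]]))
tr = torch.full((2, 2), 0.5).log()

scores = lp[0]
bp = torch.empty((lp.shape[0] - 1, 2), dtype=torch.int64)
for i in range(1, lp.shape[0]):
    best, idx = (scores.unsqueeze(1) + tr).max(dim=0)  # best predecessor per state
    scores = best + lp[i]
    bp[i - 1] = idx

path = [int(scores.argmax())]
for i in range(lp.shape[0] - 2, -1, -1):  # backtrace through stored pointers
    path.append(int(bp[i, path[-1]]))
path.reverse()
assert path == [0, 1, 0]
```

Each iteration touches only a num_classes x num_classes matrix, which is why the CPU wins here: the arithmetic per step is trivial next to the cost of a CUDA kernel launch.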
546
+
547
+ # ── runtime singleton ────────────────────────────────────────────
548
+
549
+ @dataclass(frozen=True)
550
+ class InferenceRuntime:
551
+ model: Transformer; encoding: tiktoken.Encoding; label_info: LabelInfo
552
+ device: torch.device; n_ctx: int
553
+
554
+
555
+ @functools.lru_cache(maxsize=1)
556
+ def get_runtime():
557
+ cp = MODEL_DIR
558
+ cfg = json.loads((cp / "config.json").read_text())
559
+ validate_model_config_contract(cfg, context=str(cp))
560
+ device = torch.device("cuda")
561
+ encoding = tiktoken.get_encoding(str(cfg["encoding"]).strip())
562
+ scn = [BACKGROUND_CLASS_LABEL]; sll = {BACKGROUND_CLASS_LABEL: 0}
563
+ bll, ttsl, tbt = {}, {}, {}
564
+ bg_idx = None
565
+ for idx, name in enumerate(NER_CLASS_NAMES):
566
+ if name == BACKGROUND_CLASS_LABEL:
567
+ bg_idx = idx; ttsl[idx] = 0; tbt[idx] = None; continue
568
+ bnd, base = name.split("-", 1)
569
+ si = sll.get(base)
570
+ if si is None:
571
+ si = len(scn); scn.append(base); sll[base] = si
572
+ ttsl[idx] = si; tbt[idx] = bnd
573
+ bll.setdefault(base, {})[bnd] = idx
574
+ li = LabelInfo(bll, ttsl, tbt, tuple(scn), sll, bg_idx, 0)
575
+ m = Transformer.from_checkpoint(str(cp), device=device)
576
+ return InferenceRuntime(m, encoding, li, device, int(cfg["default_n_ctx"]))
577
+ 
+ 
+ @functools.lru_cache(maxsize=1)
+ def get_decoder():
+     return Decoder(label_info=get_runtime().label_info)
+ 
+ 
+ @torch.inference_mode()
+ def predict_text(runtime, text, decoder):
+     tids = tuple(int(t) for t in runtime.encoding.encode(text, allowed_special="all"))
+     if not tids: return text, []
+     # Run the model per-chunk and concat once. The v4 code built a Python
+     # list via `.unbind(0)` and then rebuilt the same tensor via stack — a
+     # no-op that paid 100k small allocations on long inputs.
+     chunks = []
+     for s in range(0, len(tids), runtime.n_ctx):
+         e = min(s + runtime.n_ctx, len(tids))
+         wt = torch.tensor(tids[s:e], device=runtime.device, dtype=torch.int32)
+         lp = F.log_softmax(runtime.model(wt).float(), dim=-1)
+         chunks.append(lp)
+     stacked = chunks[0] if len(chunks) == 1 else torch.cat(chunks, dim=0)
+     dl = decoder.decode(stacked)  # Decoder pulls to CPU internally
+     if len(dl) != len(tids): dl = stacked.argmax(dim=1).tolist()
+     pli = {i: int(l) for i, l in enumerate(dl)}
+     pts = labels_to_spans(pli, runtime.label_info)
+     tb = [runtime.encoding.decode_single_token_bytes(t) for t in tids]
+     dt = b"".join(tb).decode("utf-8", errors="replace")
+     cbs, cbe = [], []
+     bc = 0
+     for ch in dt: cbs.append(bc); bc += len(ch.encode("utf-8")); cbe.append(bc)
+     cs, ce = [], []
+     tbc = 0
+     for rb in tb:
+         tbs = tbc; tbe = tbs + len(rb); tbc = tbe
+         cs.append(bisect_right(cbe, tbs)); ce.append(bisect_left(cbs, tbe))
+     pcs = token_spans_to_char_spans(pts, cs, ce)
+     pcs = trim_char_spans_whitespace(pcs, dt)
+     src = dt  # decoded text; equals `text` whenever the byte round-trip is lossless
+     detected = []
+     for li, s, e in pcs:
+         if 0 <= li < len(runtime.label_info.span_class_names):
+             lbl = runtime.label_info.span_class_names[li]
+         else:
+             lbl = f"label_{li}"
+         detected.append({"label": lbl, "start": s, "end": e, "text": src[s:e]})
+     return src, detected
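The offset machinery at the end of predict_text is the subtle part: spans come back in token/byte space and must land on character indices even when the text contains multi-byte UTF-8. A standalone sketch of the same bisect trick (the helper name is invented for the demo):

```python
from bisect import bisect_left, bisect_right

def byte_range_to_char_span(text: str, b_start: int, b_end: int) -> tuple[int, int]:
    # Per-character cumulative byte offsets, mirroring cbs/cbe above.
    starts, ends, bc = [], [], 0
    for ch in text:
        starts.append(bc)
        bc += len(ch.encode("utf-8"))
        ends.append(bc)
    # First char whose byte-end exceeds b_start; first char starting at/after b_end.
    return bisect_right(ends, b_start), bisect_left(starts, b_end)

text = "naïve café"                            # 'ï' and 'é' are 2 bytes each in UTF-8
span = byte_range_to_char_span(text, 7, 12)    # byte range covering "café"
```

Because "naïve " occupies 7 bytes but only 6 characters, a naive byte index would point one character too far; the bisect lookup lands on the right slice.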
+ 
+ 
+ # =====================================================================
+ # APPLICATION LAYER
+ # =====================================================================
+ 
+ def extract_text(file_path: str) -> str:
+     suffix = Path(file_path).suffix.lower()
+     if suffix == ".pdf":
+         import fitz
+         doc = fitz.open(file_path)
+         pages = [page.get_text() for page in doc]
+         doc.close()
+         return "\n\n".join(pages)
+     elif suffix in (".docx", ".doc"):
+         # python-docx only parses the .docx (OOXML) format; a legacy binary
+         # .doc file will raise here and surface as an API error.
+         from docx import Document
+         doc = Document(file_path)
+         return "\n\n".join(p.text for p in doc.paragraphs if p.text.strip())
+     raise ValueError(f"Unsupported file type: {suffix}")
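Because extract_text keys off `Path(...).suffix.lower()`, uppercase extensions are handled for free. A tiny illustration of the dispatch (the `route` helper is made up for the demo):

```python
from pathlib import Path

def route(file_path: str) -> str:
    # Same normalization as extract_text: suffix only, lowercased.
    suffix = Path(file_path).suffix.lower()
    if suffix == ".pdf":
        return "pdf"
    elif suffix in (".docx", ".doc"):
        return "word"
    return "unsupported"

kinds = [route(p) for p in ("Report.PDF", "notes.docx", "scan.tiff")]
```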
+ 
+ 
+ def compute_stats(text, spans):
+     total = len(text)
+     pii_chars = sum(s["end"] - s["start"] for s in spans)
+     by_cat = {}
+     for s in spans:
+         c = s["label"]
+         by_cat.setdefault(c, {"count": 0, "chars": 0})
+         by_cat[c]["count"] += 1; by_cat[c]["chars"] += s["end"] - s["start"]
+     return {
+         "total_chars": total, "pii_chars": pii_chars,
+         "pii_percentage": round(pii_chars / total * 100, 1) if total else 0,
+         "total_spans": len(spans), "categories": by_cat, "num_categories": len(by_cat),
+         "total_lines": text.count("\n") + 1 if total else 0,
+     }
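The percentage arithmetic in compute_stats is plain character counting; worked by hand on a short sentence:

```python
text = "Call Dr. Holloway at 555-0100 tomorrow."
spans = [
    {"label": "private_person", "start": 5, "end": 17},   # "Dr. Holloway"
    {"label": "private_phone", "start": 21, "end": 29},   # "555-0100"
]
pii_chars = sum(s["end"] - s["start"] for s in spans)     # 12 + 8 = 20
pii_percentage = round(pii_chars / len(text) * 100, 1)
```

Twenty of the 39 characters fall inside spans, so the endpoint would report 51.3% PII content for this line.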
+ 
+ 
+ def detect_speakers(text, spans):
+     patterns = [r"^([A-Z][a-zA-Z ]{1,30}):\s", r"^\[([^\]]{1,30})\]\s", r"^(Speaker\s*\d+):\s"]
+     line_sp, pos, cur = [], 0, None
+     for line in text.split("\n"):
+         for p in patterns:
+             m = re.match(p, line)
+             if m: cur = m.group(1).strip(); break
+         line_sp.append((pos, pos + len(line), cur)); pos += len(line) + 1
+     result = {}
+     for span in spans:
+         mid = (span["start"] + span["end"]) // 2
+         speaker = "Document"
+         for ls, le, sp in line_sp:
+             if ls <= mid <= le and sp: speaker = sp; break
+         result[speaker] = result.get(speaker, 0) + 1
+     return {} if list(result.keys()) == ["Document"] else result
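The three prefix patterns cover "Name:", "[Name]" and "Speaker N:" transcript styles. Exercised directly on sample lines:

```python
import re

patterns = [r"^([A-Z][a-zA-Z ]{1,30}):\s", r"^\[([^\]]{1,30})\]\s", r"^(Speaker\s*\d+):\s"]

def speaker_of(line):
    for p in patterns:
        m = re.match(p, line)
        if m:
            return m.group(1).strip()
    return None  # in detect_speakers, such a line keeps the previous speaker

found = [speaker_of(l) for l in (
    "Margaret: the transfer cleared.",
    "[Agent 12] copy that.",
    "Speaker 3: confirmed.",
    "no prefix on this line",
)]
```

Note that "Speaker 3:" falls through to the third pattern because the digit is not in the first pattern's `[a-zA-Z ]` class.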
+ 
+ 
+ @spaces.GPU
+ def run_pii_analysis(text: str):
+     """GPU-accelerated PII detection."""
+     runtime = get_runtime()
+     decoder = get_decoder()  # cached, not rebuilt per request
+     source_text, detected = predict_text(runtime, text, decoder)
+     return source_text, detected
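run_pii_analysis inherits predict_text's fixed-size windowing over the token ids; the `range`/`min` arithmetic is worth checking at the ragged tail:

```python
n_ctx = 4
tids = tuple(range(10))   # ten fake token ids

windows = []
for s in range(0, len(tids), n_ctx):
    e = min(s + n_ctx, len(tids))   # final window may be shorter than n_ctx
    windows.append((s, e))
```

Every token lands in exactly one window, so the concatenated log-probs line up one-to-one with the input ids.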
+ 
+ 
+ def build_redacted_pdf_bytes(pdf_path: str, pii_texts: list[str]) -> bytes:
+     """
+     True PyMuPDF redaction: draws a black fill rectangle over the target
+     text AND removes the underlying text stream. Longer strings are
+     redacted first so fuller matches win over their substrings.
+     """
+     import fitz
+     # Longest first: "Dr. Margaret Holloway" before "Margaret"
+     ordered = sorted({t.strip() for t in pii_texts if t and len(t.strip()) >= 2},
+                      key=len, reverse=True)
+     doc = fitz.open(pdf_path)
+     try:
+         for page in doc:
+             for needle in ordered:
+                 for rect in page.search_for(needle):
+                     page.add_redact_annot(rect, fill=(0, 0, 0))
+             page.apply_redactions()
+         buf = io.BytesIO()
+         doc.save(buf, garbage=4, deflate=True)
+         return buf.getvalue()
+     finally:
+         doc.close()
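The needle ordering in build_redacted_pdf_bytes (dedupe, drop blanks and one-character fragments, longest first) behaves like this:

```python
pii_texts = ["Margaret", "Dr. Margaret Holloway", "  ", "Margaret", "x"]

# Same expression as in the function above.
ordered = sorted({t.strip() for t in pii_texts if t and len(t.strip()) >= 2},
                 key=len, reverse=True)
```

The set comprehension collapses the duplicate "Margaret" and discards the blank and single-character entries, and the length sort guarantees the full name is painted before the bare first name can claim a sub-rectangle.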
+ 
+ 
+ # ── Gradio Server ────────────────────────────────────────────────
+ server = gr.Server()
+ 
+ 
+ @server.get("/", response_class=HTMLResponse)
+ async def homepage():
+     return FRONTEND_HTML
+ 
+ 
+ @server.post("/api/analyze")
+ async def analyze_document(file: UploadFile = File(...)):
+     suffix = Path(file.filename).suffix.lower()
+     if suffix not in (".pdf", ".doc", ".docx"):
+         return JSONResponse({"error": f"Unsupported: {suffix}. Use PDF, DOC, or DOCX."}, 400)
+     with tempfile.NamedTemporaryFile(delete=False, suffix=suffix) as tmp:
+         tmp.write(await file.read()); tmp_path = tmp.name
+     try:
+         text = extract_text(tmp_path)
+         if not text.strip():
+             return JSONResponse({"error": "No text content found."}, 400)
+         source_text, spans = run_pii_analysis(text)
+         stats = compute_stats(source_text, spans)
+         speakers = detect_speakers(source_text, spans)
+         return JSONResponse({
+             "filename": file.filename, "text": source_text, "spans": spans,
+             "stats": stats, "speakers": speakers,
+             "categories_meta": {k: {"color": v["color"], "cls": v["cls"],
+                                     "label": v["label"], "mono": v["mono"]}
+                                 for k, v in CATEGORIES_META.items()},
+         })
+     except Exception as e:
+         return JSONResponse({"error": str(e)}, 500)
+     finally:
+         if os.path.exists(tmp_path): os.unlink(tmp_path)
+ 
+ 
+ @server.post("/api/redact-pdf")
+ async def redact_pdf_endpoint(
+     file: UploadFile = File(...),
+     spans: str = Form(...),
+     active: str = Form(...),
+ ):
+     suffix = Path(file.filename).suffix.lower()
+     if suffix != ".pdf":
+         return JSONResponse({"error": "PDF redaction only accepts PDF input."}, 400)
+     try:
+         span_list = json.loads(spans)
+         active_set = set(json.loads(active))
+     except Exception as e:
+         return JSONResponse({"error": f"Invalid payload: {e}"}, 400)
+ 
+     pii_texts = [
+         s.get("text", "") for s in span_list
+         if s.get("label") in active_set
+     ]
+     if not pii_texts:
+         return JSONResponse({"error": "No active categories selected — nothing to redact."}, 400)
+ 
+     with tempfile.NamedTemporaryFile(delete=False, suffix=suffix) as tmp:
+         tmp.write(await file.read()); tmp_path = tmp.name
+     try:
+         pdf_bytes = build_redacted_pdf_bytes(tmp_path, pii_texts)
+         out_name = (Path(file.filename).stem or "document") + ".redacted.pdf"
+         return StreamingResponse(
+             io.BytesIO(pdf_bytes),
+             media_type="application/pdf",
+             headers={"Content-Disposition": f'attachment; filename="{out_name}"'},
+         )
+     except Exception as e:
+         return JSONResponse({"error": str(e)}, 500)
+     finally:
+         if os.path.exists(tmp_path): os.unlink(tmp_path)
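The download filename keeps the upload's stem and appends a fixed suffix; a quick check of the `Path.stem` behaviour it relies on (only the last extension is dropped):

```python
from pathlib import Path

def redacted_name(filename: str) -> str:
    # Mirrors the endpoint: keep the stem, fall back to "document".
    return (Path(filename).stem or "document") + ".redacted.pdf"

names = [redacted_name("report.pdf"), redacted_name("archive.tar.pdf")]
```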
+ 
+ 
+ @server.api(name="analyze_text")
+ def analyze_text_api(text: str) -> str:
+     """Gradio API: analyze raw text for PII."""
+     source_text, spans = run_pii_analysis(text)
+     stats = compute_stats(source_text, spans)
+     return json.dumps({"text": source_text, "spans": spans, "stats": stats}, ensure_ascii=False)
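analyze_text_api serializes with `ensure_ascii=False` so accented characters in detected spans come back as readable UTF-8 rather than `\uXXXX` escapes:

```python
import json

record = {"text": "café", "spans": []}
escaped = json.dumps(record)                       # default: ASCII-safe escapes
readable = json.dumps(record, ensure_ascii=False)  # what the API returns

roundtrip = json.loads(readable)                   # both forms decode identically
```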
+ 
+ 
+ # ── Frontend HTML (v5) ───────────────────────────────────────────
+ FRONTEND_HTML = r"""<!DOCTYPE html>
+ <html lang="en">
+ <head>
+ <meta charset="UTF-8">
+ <meta name="viewport" content="width=device-width,initial-scale=1">
+ <title>PII Reveal — Inspector</title>
+ <link rel="preconnect" href="https://fonts.googleapis.com">
+ <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
+ <link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&family=JetBrains+Mono:wght@400;500&family=Source+Serif+4:opsz,wght@8..60,400;8..60,500;8..60,600&display=swap" rel="stylesheet">
+ <style>
+ /* =========================================================
+    Theme tokens — light refresh: brighter cards, darker copy,
+    cleaner borders. Dark tokens unchanged from v4.
+    ========================================================= */
+ :root{
+ /* Light */
+ --body-background-fill: #f6f6f7;
+ --block-background-fill: #ffffff;
+ --block-background-fill-2: #f1f1f3;
+ --body-text-color: #0a0a0a;
+ --body-text-color-subdued: #3f3f46;
+ --body-text-color-faint: #6b7280;
+ --border-color-primary: #e4e4e7;
+ --border-color-accent: #d4d4d8;
+ --primary-bg: #18181b;
+ --primary-fg: #ffffff;
+ 
+ --h-alpha: 16%;
+ --shadow-xs: 0 1px 1.5px rgba(10,10,10,.04);
+ --shadow-sm: 0 1px 3px rgba(10,10,10,.06), 0 1px 2px rgba(10,10,10,.04);
+ --shadow-md: 0 4px 14px rgba(10,10,10,.07), 0 1px 3px rgba(10,10,10,.04);
+ 
+ --border-radius-lg: 10px;
+ --border-radius-md: 6px;
+ --border-radius-sm: 4px;
+ 
+ --font-sans: 'Inter', system-ui, -apple-system, 'Segoe UI', sans-serif;
+ --font-mono: 'JetBrains Mono', ui-monospace, SFMono-Regular, Menlo, Consolas, monospace;
+ --font-serif: 'Source Serif 4', 'Source Serif Pro', 'Iowan Old Style', Georgia, serif;
+ }
+ 
+ @media (prefers-color-scheme: dark){
+ :root{
+ --body-background-fill: #0e0e11;
+ --block-background-fill: #18181c;
+ --block-background-fill-2: #1f1f24;
+ --body-text-color: #e8e8ea;
+ --body-text-color-subdued: #a8a8ae;
+ --body-text-color-faint: #70707a;
+ --border-color-primary: rgba(255,255,255,0.08);
+ --border-color-accent: rgba(255,255,255,0.18);
+ --primary-bg: #f0f0f2;
+ --primary-fg: #0e0e11;
+ --h-alpha: 15%;
+ --shadow-xs: none;
+ --shadow-sm: none;
+ --shadow-md: none;
+ }
+ }
+ 
+ .dark, .dark :root, html.dark, body.dark{
+ --body-background-fill: #0e0e11;
+ --block-background-fill: #18181c;
+ --block-background-fill-2: #1f1f24;
+ --body-text-color: #e8e8ea;
+ --body-text-color-subdued: #a8a8ae;
+ --body-text-color-faint: #70707a;
+ --border-color-primary: rgba(255,255,255,0.08);
+ --border-color-accent: rgba(255,255,255,0.18);
+ --primary-bg: #f0f0f2;
+ --primary-fg: #0e0e11;
+ --h-alpha: 15%;
+ --shadow-xs: none;
+ --shadow-sm: none;
+ --shadow-md: none;
+ }
+ 
+ *,*::before,*::after{box-sizing:border-box;margin:0;padding:0}
+ html,body{height:100%}
+ body{
+ font-family:var(--font-sans);
+ background:var(--body-background-fill);
+ color:var(--body-text-color);
+ font-size:13.5px;line-height:1.5;
+ -webkit-font-smoothing:antialiased;
+ font-feature-settings:"cv11","ss01";
+ }
+ button{font:inherit;color:inherit;background:transparent;border:0;cursor:pointer}
+ .sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);white-space:nowrap;border:0}
+ 
+ /* shared small-caps label treatment */
+ .caps{
+ font-size:11px;font-weight:500;
+ letter-spacing:0.06em;text-transform:uppercase;
+ color:var(--body-text-color-subdued);
+ }
+ 
+ .shell{max-width:1080px;margin:0 auto;padding:40px 16px 48px}
+ 
+ /* =========================================================
+    Upload / landing view
+    ========================================================= */
+ #upload-view{min-height:100vh;display:flex;align-items:center;justify-content:center}
+ #upload-view .shell{width:100%}
+ 
+ .u-card{
+ display:grid;grid-template-columns:1.05fr 0.95fr;gap:0;
+ background:var(--block-background-fill);
+ border:0.5px solid var(--border-color-primary);
+ border-radius:var(--border-radius-lg);
+ overflow:hidden;
+ box-shadow:var(--shadow-md);
+ }
+ .u-left{padding:40px 36px 34px}
+ .u-right{
+ padding:40px 36px 34px;
+ background:var(--block-background-fill-2);
+ border-left:0.5px solid var(--border-color-primary);
+ display:flex;flex-direction:column;gap:14px;
+ }
+ .u-brand{display:flex;align-items:center;gap:10px;margin-bottom:24px}
+ .u-brand svg{color:var(--body-text-color)}
+ .u-brand-name{font-size:13.5px;font-weight:500}
+ .u-brand-name .sub{color:var(--body-text-color-faint);font-weight:400;margin-left:4px}
+ .u-title{
+ font-family:var(--font-serif);
+ font-size:30px;font-weight:500;letter-spacing:-0.018em;
+ line-height:1.15;margin-bottom:10px;
+ color:var(--body-text-color);
+ }
+ .u-sub{color:var(--body-text-color-subdued);font-size:14px;margin-bottom:20px;max-width:42ch}
+ 
+ .u-chips{display:flex;flex-wrap:wrap;gap:6px 12px;margin-bottom:24px}
+ .u-chip{
+ display:inline-flex;align-items:center;gap:6px;
+ font-size:12px;color:var(--body-text-color-subdued);font-weight:500;
+ }
+ .u-chip-dot{width:7px;height:7px;border-radius:2px}
+ 
+ .u-drop{
+ border:1px solid var(--border-color-primary);
+ background:color-mix(in srgb, var(--body-text-color) 2.5%, transparent);
+ border-radius:var(--border-radius-md);
+ padding:30px 20px;
+ cursor:pointer;text-align:center;
+ transition:background .15s,border-color .15s;
+ position:relative;
+ }
+ .u-drop:hover,.u-drop.dragover{
+ background:color-mix(in srgb, var(--body-text-color) 5%, transparent);
+ border-color:var(--border-color-accent);
+ }
+ .u-drop-icon{margin:0 auto 8px;color:var(--body-text-color-subdued)}
+ .u-drop-title{font-size:13.5px;font-weight:500;margin-bottom:3px;color:var(--body-text-color)}
+ .u-drop-sub{font-family:var(--font-mono);font-size:11px;color:var(--body-text-color-faint)}
+ .u-drop input{position:absolute;inset:0;opacity:0;cursor:pointer}
+ 
+ .u-meta{
+ display:flex;flex-wrap:wrap;align-items:center;margin-top:22px;
+ font-family:var(--font-mono);font-size:11px;color:var(--body-text-color-faint);
+ }
+ .u-meta > span{padding:0 12px;border-right:1px solid var(--border-color-primary);line-height:1}
+ .u-meta > span:first-child{padding-left:0}
+ .u-meta > span:last-child{border-right:0;padding-right:0}
+ 
+ .prev-h{margin-bottom:8px}
+ .prev-row{display:grid;grid-template-columns:1fr 16px 1fr;gap:10px;align-items:stretch}
+ .prev-arrow{align-self:center;color:var(--body-text-color-faint);font-family:var(--font-mono);font-size:12px;text-align:center}
+ .prev-card{
+ background:var(--block-background-fill);
+ border:0.5px solid var(--border-color-primary);
+ border-radius:var(--border-radius-md);
+ padding:14px 14px 12px;
+ font-family:var(--font-serif);
+ font-size:12.5px;line-height:1.65;
+ color:var(--body-text-color);
+ min-height:148px;
+ box-shadow:var(--shadow-xs);
+ }
+ .prev-label{
+ font-family:var(--font-sans);font-size:10px;font-weight:500;
+ letter-spacing:0.08em;text-transform:uppercase;
+ color:var(--body-text-color-faint);
+ display:block;margin-bottom:8px;
+ }
+ .prev-card p{margin:0 0 6px}
+ .prev-card p:last-child{margin-bottom:0}
+ .prev-bar{
+ display:inline-block;vertical-align:middle;
+ height:0.85em;border-radius:2px;
+ background:var(--body-text-color);opacity:.88;margin:0 1px;
+ }
+ .u-stat{
+ margin-top:auto;padding-top:14px;border-top:0.5px solid var(--border-color-primary);
+ display:flex;align-items:baseline;gap:8px;
+ color:var(--body-text-color-subdued);font-size:12px;
+ }
+ .u-stat b{
+ font-family:var(--font-serif);font-weight:500;font-size:18px;
+ color:var(--body-text-color);letter-spacing:-0.01em;
+ }
+ 
+ /* =========================================================
+    Results / inspector view
+    ========================================================= */
+ #results-view{display:none;min-height:100vh}
+ .pr-app{
+ font-family:var(--font-sans);
+ border:0.5px solid var(--border-color-primary);
+ border-radius:var(--border-radius-lg);
+ overflow:hidden;
+ background:var(--block-background-fill);
+ color:var(--body-text-color);
+ box-shadow:var(--shadow-md);
+ }
+ 
+ /* ── top bar ── */
+ .pr-top{
+ display:flex;align-items:center;gap:10px;flex-wrap:wrap;
+ padding:11px 14px;
+ border-bottom:0.5px solid var(--border-color-primary);
+ }
+ .pr-logo{display:flex;align-items:center;gap:8px}
+ .pr-name{font-size:13.5px;font-weight:500}
+ .pr-name-sub{color:var(--body-text-color-faint);font-weight:400;margin-left:4px}
+ .pr-file-chip{
+ font-family:var(--font-mono);font-size:11.5px;
+ color:var(--body-text-color-subdued);
+ padding:4px 8px;
+ background:var(--block-background-fill-2);
+ border:0.5px solid var(--border-color-primary);
+ border-radius:5px;margin-left:4px;
+ max-width:220px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap;
+ }
+ .pr-grow{flex:1}
+ .pr-status{font-size:11.5px;color:var(--body-text-color-subdued);display:flex;align-items:center;gap:6px}
+ .pr-status-dot{width:6px;height:6px;border-radius:50%;background:#1D9E75;box-shadow:0 0 0 3px color-mix(in srgb, #1D9E75 18%, transparent)}
+ 
+ .pr-top-actions{display:flex;align-items:center;gap:6px;flex-wrap:wrap}
+ .pr-btn{
+ font-size:12px;padding:6px 10px;
+ border:0.5px solid var(--border-color-accent);
+ border-radius:5px;
+ background:var(--block-background-fill);
+ color:var(--body-text-color);
+ cursor:pointer;
+ font-family:inherit;font-weight:500;
+ display:inline-flex;align-items:center;gap:6px;
+ transition:background .12s,border-color .12s,color .12s;
+ }
+ .pr-btn:hover:not(:disabled){background:color-mix(in srgb, var(--body-text-color) 4%, var(--block-background-fill));border-color:var(--body-text-color-subdued)}
+ .pr-btn:disabled{opacity:.5;cursor:not-allowed}
+ .pr-btn-ghost{border-color:var(--border-color-primary);color:var(--body-text-color-subdued);background:transparent;font-weight:400}
+ .pr-btn-ghost:hover:not(:disabled){color:var(--body-text-color);border-color:var(--border-color-accent);background:color-mix(in srgb, var(--body-text-color) 3%, transparent)}
+ .pr-btn-prim{
+ background:var(--primary-bg);color:var(--primary-fg);
+ border-color:var(--primary-bg);font-weight:500;
+ }
+ .pr-btn-prim:hover:not(:disabled){background:color-mix(in srgb, var(--primary-bg) 88%, var(--body-text-color));border-color:var(--primary-bg)}
+ .pr-btn-arr{font-family:var(--font-mono);font-size:11px;opacity:0.6}
+ 
+ /* ── stats ── */
+ .pr-stats{padding:18px 18px 16px;border-bottom:0.5px solid var(--border-color-primary)}
+ .pr-stats-row{display:flex;align-items:flex-end;gap:34px;margin-bottom:14px;flex-wrap:wrap}
+ .pr-hero{
+ font-size:34px;font-weight:600;line-height:1;letter-spacing:-0.028em;
+ font-variant-numeric:tabular-nums;color:var(--body-text-color);
+ }
+ .pr-hero-pct{font-size:18px;opacity:0.5;margin-left:1px;font-weight:400}
+ .pr-num{font-size:21px;font-weight:600;line-height:1;letter-spacing:-0.015em;font-variant-numeric:tabular-nums}
+ .pr-lab{margin-top:10px}
+ 
+ .pr-bar{display:flex;height:4px;gap:2px;margin-bottom:12px;border-radius:2px;overflow:hidden}
+ .pr-bar > span{display:block;height:100%;border-radius:1px;min-width:4px;transition:opacity .15s}
+ .pr-bar > span:hover{opacity:.82}
+ 
+ .pr-legend{display:flex;flex-wrap:wrap;gap:8px 14px;font-size:12px}
+ .pr-leg{display:flex;align-items:center;gap:6px;color:var(--body-text-color-subdued);cursor:pointer;user-select:none;font-weight:500}
+ .pr-leg-sw{width:8px;height:8px;border-radius:2px}
+ .pr-leg-ct{font-family:var(--font-mono);font-size:11px;color:var(--body-text-color-faint);margin-left:1px;font-weight:500}
+ .pr-leg.off{opacity:.4}
+ .pr-leg.off .pr-leg-sw{opacity:.3}
+ 
+ /* ── body ── */
+ .pr-body{display:grid;grid-template-columns:minmax(0,1fr) 220px}
+ 
+ /* ── doc pane ── */
+ .pr-doc-pane{
+ padding:20px 24px 28px;
+ border-right:0.5px solid var(--border-color-primary);
+ min-width:0;max-height:calc(100vh - 260px);overflow-y:auto;
+ }
+ .pr-doc-meta{
+ font-family:var(--font-mono);font-size:11px;color:var(--body-text-color-faint);
+ margin-bottom:16px;display:flex;gap:10px;flex-wrap:wrap;
+ }
+ .pr-doc-meta span + span::before{content:'·';margin-right:10px;color:var(--border-color-accent)}
+ 
+ .pr-text{
+ font-family:var(--font-serif);
+ font-size:15px;line-height:1.9;
+ color:var(--body-text-color);
+ white-space:pre-wrap;word-wrap:break-word;
+ font-feature-settings:"liga","calt";
+ }
+ 
+ .h{padding:1px 1px;border-bottom:1.5px solid;transition:background .15s,opacity .15s;cursor:pointer}
+ .h:hover{filter:brightness(0.96) saturate(1.12)}
+ .h.off{background:transparent !important;border-color:transparent !important;color:inherit;opacity:.9}
+ .hp {background:color-mix(in srgb, #E24B4A var(--h-alpha), transparent); border-color:#E24B4A}
+ .hd {background:color-mix(in srgb, #7F77DD var(--h-alpha), transparent); border-color:#7F77DD}
+ .ha {background:color-mix(in srgb, #1D9E75 var(--h-alpha), transparent); border-color:#1D9E75}
+ .he {background:color-mix(in srgb, #378ADD var(--h-alpha), transparent); border-color:#378ADD}
+ .hac {background:color-mix(in srgb, #BA7517 var(--h-alpha), transparent); border-color:#BA7517}
+ .hu {background:color-mix(in srgb, #D85A30 var(--h-alpha), transparent); border-color:#D85A30}
+ .hs {background:color-mix(in srgb, #D4537E var(--h-alpha), transparent); border-color:#D4537E}
+ .hph {background:color-mix(in srgb, #639922 var(--h-alpha), transparent); border-color:#639922}
+ .m{font-family:var(--font-mono);font-size:13px}
+ 
+ /* ── sidebar ── */
+ .pr-side{background:var(--block-background-fill-2);padding:16px 14px;display:flex;flex-direction:column;gap:20px;min-width:0}
+ .pr-side-head{display:flex;align-items:baseline;justify-content:space-between;gap:8px;margin-bottom:8px}
+ .pr-side-link{font-size:11px;color:var(--body-text-color-subdued);cursor:pointer;background:transparent;border:0;padding:0;font-family:inherit;font-weight:500}
+ .pr-side-link:hover{color:var(--body-text-color);text-decoration:underline}
+ 
+ .pr-cat{
+ position:relative;
+ display:grid;grid-template-columns:9px 1fr auto;
+ column-gap:8px;row-gap:4px;align-items:center;
+ padding:8px 10px 7px;
+ border-radius:var(--border-radius-sm);
+ background:color-mix(in srgb, var(--body-text-color) 3%, transparent);
+ border:0.5px solid transparent;
+ cursor:pointer;user-select:none;
+ transition:background .12s,border-color .12s,opacity .15s;
+ margin-bottom:4px;overflow:hidden;
+ }
+ .pr-cat:hover{border-color:var(--border-color-accent)}
+ .pr-cat-sw{width:9px;height:9px;border-radius:2px;flex-shrink:0;grid-row:1}
+ .pr-cat-nm{grid-row:1;color:var(--body-text-color);font-size:12.5px;font-weight:500}
+ .pr-cat-ct{grid-row:1;font-family:var(--font-mono);font-size:11px;color:var(--body-text-color-faint);text-align:right;font-weight:500}
+ .pr-cat-mini{grid-column:2/4;grid-row:2;height:1.5px;width:100%;background:color-mix(in srgb, var(--body-text-color) 6%, transparent);border-radius:1px;overflow:hidden}
+ .pr-cat-mini > span{display:block;height:100%;border-radius:1px;transition:width .2s,background .15s}
+ .pr-cat.on{background:color-mix(in srgb, var(--cat) 9%, transparent);box-shadow:inset 3px 0 0 0 var(--cat);padding-left:13px}
+ .pr-cat.on .pr-cat-nm{color:var(--body-text-color)}
+ .pr-cat.off{opacity:.42;filter:saturate(.35)}
+ .pr-cat.off .pr-cat-nm{text-decoration:line-through}
+ .pr-cat.off .pr-cat-mini > span{background:var(--body-text-color-faint) !important}
+ 
+ .pr-speakers .pr-cat{cursor:default;background:transparent;border-color:transparent;padding:4px 2px}
+ .pr-speakers .pr-cat:hover{background:transparent;border-color:transparent}
+ .pr-speakers .pr-cat-sw{background:var(--body-text-color-faint);opacity:.55}
+ .pr-speakers .pr-cat-mini{display:none}
+ 
+ .empty-rail{color:var(--body-text-color-faint);font-size:12px;font-style:italic}
+ 
+ /* loading */
+ #loading{
+ position:fixed;inset:0;
+ background:color-mix(in srgb, var(--body-background-fill) 88%, transparent);
+ backdrop-filter:blur(8px);
+ display:none;flex-direction:column;align-items:center;justify-content:center;gap:10px;z-index:9999;
+ }
+ .l-ring{width:26px;height:26px;border:1.5px solid var(--border-color-accent);border-top-color:var(--body-text-color);border-radius:50%;animation:sp .7s linear infinite}
+ @keyframes sp{to{transform:rotate(360deg)}}
+ .l-label{font-family:var(--font-mono);font-size:11.5px;color:var(--body-text-color-subdued)}
+ 
+ .error-banner{
+ margin:14px 18px 0;padding:10px 14px;
+ background:color-mix(in srgb, #E24B4A 10%, transparent);
+ border:0.5px solid color-mix(in srgb, #E24B4A 45%, transparent);
+ border-radius:var(--border-radius-md);
+ color:#C43A39;font-size:12.5px;display:none;font-weight:500;
+ }
+ 
+ .tip{
+ position:fixed;z-index:9998;
+ font-family:var(--font-mono);font-size:11px;
+ color:var(--primary-fg);background:var(--primary-bg);
+ padding:4px 8px;border-radius:4px;
+ pointer-events:none;white-space:nowrap;
+ max-width:420px;overflow:hidden;text-overflow:ellipsis;
+ }
+ 
+ @media(max-width:880px){
+ .u-card{grid-template-columns:1fr}
+ .u-right{border-left:0;border-top:0.5px solid var(--border-color-primary)}
+ .pr-body{grid-template-columns:1fr}
+ .pr-doc-pane{border-right:none;border-bottom:0.5px solid var(--border-color-primary);max-height:none}
+ }
+ @media(max-width:640px){
+ .shell{padding:24px 12px}
+ }
+ </style>
+ </head>
+ <body>
+ 
+ <!-- ============ UPLOAD VIEW ============ -->
+ <div id="upload-view">
+ <div class="shell">
+ <div class="u-card">
+ <div class="u-left">
+ <div class="u-brand">
+ <svg width="20" height="20" viewBox="0 0 20 20" fill="none">
+ <rect x="0" y="0" width="20" height="20" rx="5" fill="currentColor"/>
+ <circle cx="8.5" cy="8.5" r="3.2" stroke="var(--block-background-fill)" stroke-width="1.4" fill="none"/>
+ <line x1="11.2" y1="11.2" x2="14.2" y2="14.2" stroke="var(--block-background-fill)" stroke-width="1.4" stroke-linecap="round"/>
+ </svg>
+ <span class="u-brand-name">PII Reveal<span class="sub">/ inspector</span></span>
+ </div>
+ <h1 class="u-title">See what your documents are leaking.</h1>
+ <p class="u-sub">Find every PII span in a PDF, DOC or DOCX — names, accounts, secrets and five other entity types — then export a fully redacted copy.</p>
+ 
+ <div class="u-chips">
+ <span class="u-chip"><span class="u-chip-dot" style="background:#E24B4A"></span>Person</span>
+ <span class="u-chip"><span class="u-chip-dot" style="background:#378ADD"></span>Email</span>
+ <span class="u-chip"><span class="u-chip-dot" style="background:#7F77DD"></span>Date</span>
+ <span class="u-chip"><span class="u-chip-dot" style="background:#1D9E75"></span>Address</span>
+ <span class="u-chip"><span class="u-chip-dot" style="background:#BA7517"></span>Account</span>
+ <span class="u-chip"><span class="u-chip-dot" style="background:#D85A30"></span>URL</span>
+ <span class="u-chip"><span class="u-chip-dot" style="background:#639922"></span>Phone</span>
+ <span class="u-chip"><span class="u-chip-dot" style="background:#D4537E"></span>Secret</span>
+ </div>
+ 
+ <div class="u-drop" id="dropzone">
+ <div class="u-drop-icon">
+ <svg width="22" height="22" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round">
+ <path d="M12 3v13"/><path d="m6 9 6-6 6 6"/><path d="M4 17v2a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2v-2"/>
+ </svg>
+ </div>
+ <div class="u-drop-title">Drop a document, or click to browse</div>
+ <div class="u-drop-sub">pdf &middot; doc &middot; docx &middot; up to 128k tokens</div>
+ <input type="file" id="file-input" accept=".pdf,.doc,.docx">
+ </div>
+ 
+ <div class="u-meta">
+ <span>openai privacy filter</span>
+ <span>128k ctx</span>
+ <span>bfloat16</span>
+ <span>apache 2.0</span>
+ </div>
+ </div>
+ 
+ <div class="u-right" aria-hidden="true">
+ <div class="prev-h caps">Before &rarr; after</div>
+ <div class="prev-row">
+ <div class="prev-card">
+ <span class="prev-label">detected</span>
+ <p>Reporter: <span class="h hp">Dr. Margaret Holloway-Chen</span> called at <span class="h hd m">03:42 GMT</span>.</p>
+ <p>Email: <span class="h he m">margaret.h@protomail.co.uk</span>.</p>
+ <p>Token: <span class="h hs m">sk_live_T3sT4zN9pQ2v</span>.</p>
+ </div>
+ <div class="prev-arrow">&rarr;</div>
+ <div class="prev-card">
+ <span class="prev-label">redacted</span>
+ <p>Reporter: <span class="prev-bar" style="width:11em"></span> called at <span class="prev-bar" style="width:3.5em"></span>.</p>
+ <p>Email: <span class="prev-bar" style="width:9em"></span>.</p>
+ <p>Token: <span class="prev-bar" style="width:7em"></span>.</p>
+ </div>
+ </div>
+ <div class="u-stat">
+ <b>PDF-ready</b>
+ <span>export a redacted PDF or .txt with one click</span>
+ </div>
+ </div>
+ </div>
+ </div>
+ </div>
+ 
+ <!-- ============ RESULTS VIEW ============ -->
+ <div id="results-view">
+ <div class="shell">
+ <div class="pr-app" aria-label="PII Reveal inspector">
+ 
+ <div class="pr-top">
+ <div class="pr-logo">
+ <svg width="20" height="20" viewBox="0 0 20 20" fill="none" style="color: var(--body-text-color);">
+ <rect x="0" y="0" width="20" height="20" rx="5" fill="currentColor"/>
+ <circle cx="8.5" cy="8.5" r="3.2" stroke="var(--block-background-fill)" stroke-width="1.4" fill="none"/>
+ <line x1="11.2" y1="11.2" x2="14.2" y2="14.2" stroke="var(--block-background-fill)" stroke-width="1.4" stroke-linecap="round"/>
+ </svg>
+ <span class="pr-name">PII Reveal<span class="pr-name-sub">/ inspector</span></span>
+ </div>
+ <span class="pr-file-chip" id="file-chip"></span>
+ <span class="pr-status" id="scan-status"><span class="pr-status-dot"></span>Scan complete</span>
+ <div class="pr-grow"></div>
+ <div class="pr-top-actions">
+ <button class="pr-btn pr-btn-ghost" id="act-copy" title="Copy masked text to clipboard"><span>Copy masked</span></button>
+ <button class="pr-btn pr-btn-ghost" id="act-report" title="Download JSON report"><span>Report</span></button>
+ <button class="pr-btn" id="act-txt" title="Download sanitized .txt"><span>.txt</span></button>
+ <button class="pr-btn pr-btn-prim" id="act-pdf" title="Download redacted PDF"><span>Redact PDF</span><span class="pr-btn-arr">&rarr;</span></button>
+ <button class="pr-btn pr-btn-ghost" id="btn-new"><span>New file</span></button>
+ </div>
+ </div>
+ 
+ <div class="error-banner" id="error-banner"></div>
+ 
+ <div class="pr-stats">
+ <div class="pr-stats-row">
+ <div>
+ <div class="pr-hero"><span id="hero-val">0</span><span class="pr-hero-pct">%</span></div>
+ <div class="caps pr-lab">PII content</div>
+ </div>
+ <div>
+ <div class="pr-num" id="num-spans">0</div>
+ <div class="caps pr-lab">Spans detected</div>
+ </div>
+ <div>
+ <div class="pr-num" id="num-cats">0 / 8</div>
+ <div class="caps pr-lab">Categories present</div>
+ </div>
+ <div>
+ <div class="pr-num" id="num-speakers">0</div>
+ <div class="caps pr-lab">Speakers identified</div>
+ </div>
+ </div>
+ 
+ <div class="pr-bar" id="dist-bar"></div>
+ <div class="pr-legend" id="legend"></div>
+ </div>
+ 
+ <div class="pr-body">
+ <div class="pr-doc-pane">
+ <div class="pr-doc-meta" id="doc-meta"></div>
+ <div class="pr-text" id="doc-text"></div>
+ </div>
+ 
+ <aside class="pr-side">
+ <div>
+ <div class="pr-side-head">
+ <span class="caps">Filter categories</span>
+ <button class="pr-side-link" id="cat-toggle-all">Clear all</button>
+ </div>
+ <div id="cat-list"></div>
+ </div>
+ <div id="speakers-block" style="display:none">
+ <div class="pr-side-head"><span class="caps">Speakers</span></div>
+ <div class="pr-speakers" id="speakers-list"></div>
+ </div>
+ </aside>
+ </div>
+ </div>
+ </div>
+ </div>
+ 
+ <div id="loading">
+ <div class="l-ring"></div>
+ <div class="l-label" id="loading-label">scanning document&hellip;</div>
+ </div>
+ 
+ <div class="tip" id="tip" style="display:none"></div>
+ 
+ <script>
+ /* ===== state ===== */
+ const S = {
+ text:'', spans:[], stats:{}, speakers:{}, catMeta:{}, filename:'', file:null,
+ activeCats:new Set(), scanMs:0, sortedSpans:[],
+ };
+
+ const DEFAULT_META = {
+ private_person: {color:'#E24B4A', cls:'hp', label:'Person', mono:false},
+ private_date: {color:'#7F77DD', cls:'hd', label:'Date', mono:true},
+ private_address: {color:'#1D9E75', cls:'ha', label:'Address', mono:false},
+ private_email: {color:'#378ADD', cls:'he', label:'Email', mono:true},
+ account_number: {color:'#BA7517', cls:'hac', label:'Account', mono:true},
+ private_url: {color:'#D85A30', cls:'hu', label:'URL', mono:true},
+ secret: {color:'#D4537E', cls:'hs', label:'Secret', mono:true},
+ private_phone: {color:'#639922', cls:'hph', label:'Phone', mono:true},
+ };
+ const ORDER = ['private_person','private_address','private_email','private_phone',
+ 'private_url','private_date','account_number','secret'];
+
+ const metaFor = c => ({...(DEFAULT_META[c]||{color:'#999',cls:'',label:c,mono:false}), ...(S.catMeta[c]||{})});
+ const isPdf = () => (S.filename||'').toLowerCase().endsWith('.pdf');
+
+ /* ===== upload flow ===== */
+ const dz = document.getElementById('dropzone');
+ const fi = document.getElementById('file-input');
+ ['dragenter','dragover'].forEach(e => dz.addEventListener(e, ev => { ev.preventDefault(); dz.classList.add('dragover'); }));
+ ['dragleave','drop'].forEach(e => dz.addEventListener(e, ev => { ev.preventDefault(); dz.classList.remove('dragover'); }));
+ dz.addEventListener('drop', ev => { if (ev.dataTransfer.files[0]) uploadFile(ev.dataTransfer.files[0]); });
+ fi.addEventListener('change', ev => { if (ev.target.files[0]) uploadFile(ev.target.files[0]); });
+
1377
+ async function uploadFile(file){
+ const ext = file.name.split('.').pop().toLowerCase();
+ if (!['pdf','doc','docx'].includes(ext)) { showError('Unsupported file type.'); return; }
+ S.file = file; // keep for redact-pdf round-trip
+ document.getElementById('loading-label').textContent = 'scanning document…';
+ document.getElementById('loading').style.display='flex';
+ document.getElementById('upload-view').style.display='none';
+ const form = new FormData(); form.append('file', file);
+ const t0 = performance.now();
+ try{
+ const r = await fetch('/api/analyze', {method:'POST', body:form});
+ if (!r.ok) { showError('Analysis failed (HTTP ' + r.status + ').'); return; } // guard non-JSON error pages before parsing
+ const d = await r.json();
+ if (d.error) { showError(d.error); return; }
+ S.scanMs = performance.now() - t0;
+ S.text = d.text; S.spans = d.spans; S.stats = d.stats;
+ S.speakers = d.speakers||{}; S.catMeta = d.categories_meta||{};
+ S.filename = d.filename;
+ S.activeCats = new Set(Object.keys(d.stats.categories));
+ S.sortedSpans = [...S.spans].sort((a,b) => a.start - b.start);
+ renderResults();
+ } catch(e){ showError('Analysis failed: '+e.message); }
+ finally { document.getElementById('loading').style.display='none'; }
+ }
1400
+
1401
+ function showError(m){
1402
+ document.getElementById('loading').style.display='none';
1403
+ document.getElementById('upload-view').style.display='flex';
1404
+ document.getElementById('results-view').style.display='none';
1405
+ alert(m);
1406
+ }
1407
+
1408
+ function resetView(){
1409
+ document.getElementById('results-view').style.display='none';
1410
+ document.getElementById('upload-view').style.display='flex';
1411
+ fi.value = ''; S.file = null;
1412
+ }
1413
+ document.getElementById('btn-new').addEventListener('click', resetView);
1414
+
1415
+ /* ===== render ===== */
1416
+ function renderResults(){
1417
+ document.getElementById('results-view').style.display='block';
1418
+ document.getElementById('file-chip').textContent = S.filename;
1419
+ document.getElementById('scan-status').innerHTML =
1420
+ `<span class="pr-status-dot"></span>Scan complete &middot; ${(S.scanMs/1000).toFixed(1)}s`;
1421
+ renderStats();
1422
+ renderBar();
1423
+ renderLegend();
1424
+ renderDocMeta();
1425
+ renderDoc();
1426
+ renderCats();
1427
+ renderSpeakers();
1428
+ updateToggleAllLabel();
1429
+ updatePrimaryAction();
1430
+ }
1431
+
1432
+ function updatePrimaryAction(){
1433
+ // If the input is a PDF, "Redact PDF" is primary; otherwise hide it and
1434
+ // promote the .txt export to primary.
1435
+ const pdfBtn = document.getElementById('act-pdf');
1436
+ const txtBtn = document.getElementById('act-txt');
1437
+ if (isPdf()) {
1438
+ pdfBtn.style.display = '';
1439
+ pdfBtn.classList.add('pr-btn-prim');
1440
+ txtBtn.classList.remove('pr-btn-prim');
1441
+ } else {
1442
+ pdfBtn.style.display = 'none';
1443
+ pdfBtn.classList.remove('pr-btn-prim');
1444
+ txtBtn.classList.add('pr-btn-prim');
1445
+ }
1446
+ }
1447
+
1448
+ function renderStats(){
1449
+ const s = S.stats;
1450
+ document.getElementById('hero-val').textContent = (s.pii_percentage ?? 0).toFixed(1);
1451
+ document.getElementById('num-spans').textContent = s.total_spans;
1452
+ document.getElementById('num-cats').textContent = `${s.num_categories} / 8`;
1453
+ const n = Object.keys(S.speakers).length;
1454
+ document.getElementById('num-speakers').textContent = n || 'β€”';
1455
+ }
1456
+
1457
+ function renderBar(){
1458
+ const bar = document.getElementById('dist-bar');
1459
+ bar.innerHTML = '';
1460
+ const cats = S.stats.categories;
1461
+ const total = Object.values(cats).reduce((a,b) => a + b.chars, 0) || 1;
1462
+ const ordered = ORDER.filter(c => cats[c]);
1463
+ if (!ordered.length) {
1464
+ const span = document.createElement('span');
1465
+ span.style.cssText = 'flex:1;background:var(--border-color-primary);opacity:.4';
1466
+ bar.appendChild(span); return;
1467
+ }
1468
+ for (const c of ordered) {
1469
+ const m = metaFor(c);
1470
+ const span = document.createElement('span');
1471
+ span.style.background = m.color;
1472
+ span.style.flex = cats[c].chars / total;
1473
+ span.dataset.cat = c;
1474
+ span.addEventListener('mouseenter', ev => showTip(ev, `${m.label} Β· ${cats[c].count}`));
1475
+ span.addEventListener('mousemove', moveTip);
1476
+ span.addEventListener('mouseleave', hideTip);
1477
+ if (!S.activeCats.has(c)) span.style.opacity = '.25';
1478
+ bar.appendChild(span);
1479
+ }
1480
+ }
1481
+
1482
+ function renderLegend(){
1483
+ const leg = document.getElementById('legend');
1484
+ leg.innerHTML = '';
1485
+ const cats = S.stats.categories;
1486
+ const ordered = ORDER.filter(c => cats[c]);
1487
+ for (const c of ordered) {
1488
+ const m = metaFor(c);
1489
+ const el = document.createElement('span');
1490
+ el.className = 'pr-leg' + (S.activeCats.has(c) ? '' : ' off');
1491
+ el.dataset.cat = c;
1492
+ el.innerHTML = `<span class="pr-leg-sw" style="background:${m.color}"></span>${m.label}<span class="pr-leg-ct">${cats[c].count}</span>`;
1493
+ el.addEventListener('click', () => toggleCat(c));
1494
+ leg.appendChild(el);
1495
+ }
1496
+ }
1497
+
1498
+ function renderDocMeta(){
1499
+ const s = S.stats;
1500
+ const meta = document.getElementById('doc-meta');
1501
+ const parts = [
1502
+ `${s.total_chars.toLocaleString()} characters`,
1503
+ `${s.total_lines.toLocaleString()} lines`,
1504
+ `scanned in ${(S.scanMs/1000).toFixed(1)}s`,
1505
+ ];
1506
+ meta.innerHTML = parts.map(p => `<span>${p}</span>`).join('');
1507
+ }
1508
+
1509
+ function esc(s){ const d=document.createElement('div'); d.textContent=s; return d.innerHTML; }
1510
+
1511
+ function renderDoc(){
1512
+ const { text, sortedSpans, activeCats } = S;
1513
+ const el = document.getElementById('doc-text');
1514
+ let html = '', pos = 0;
1515
+ for (const sp of sortedSpans) {
1516
+ if (sp.start < pos) continue;
1517
+ if (sp.start > pos) html += esc(text.substring(pos, sp.start));
1518
+ const m = metaFor(sp.label);
1519
+ const cls = ['h', m.cls];
1520
+ if (m.mono) cls.push('m');
1521
+ if (!activeCats.has(sp.label)) cls.push('off');
1522
+ html += `<span class="${cls.join(' ')}" data-cat="${sp.label}">${esc(text.substring(sp.start, sp.end))}</span>`;
1523
+ pos = sp.end;
1524
+ }
1525
+ if (pos < text.length) html += esc(text.substring(pos));
1526
+ el.innerHTML = html;
1527
+
1528
+ el.querySelectorAll('.h').forEach(span => {
1529
+ const cat = span.dataset.cat, m = metaFor(cat);
1530
+ span.addEventListener('mouseenter', ev => showTip(ev, `${m.label}: ${span.textContent.trim()}`));
1531
+ span.addEventListener('mousemove', moveTip);
1532
+ span.addEventListener('mouseleave', hideTip);
1533
+ });
1534
+ }
1535
+
1536
+ function renderCats(){
1537
+ const box = document.getElementById('cat-list');
1538
+ box.innerHTML = '';
1539
+ const cats = S.stats.categories;
1540
+ const ordered = ORDER.filter(c => cats[c]);
1541
+ if (!ordered.length) { box.innerHTML = '<div class="empty-rail">No entities detected.</div>'; return; }
1542
+ const totalSpans = S.stats.total_spans || 1;
1543
+ for (const c of ordered) {
1544
+ const m = metaFor(c);
1545
+ const count = cats[c].count;
1546
+ const share = (count / totalSpans) * 100;
1547
+ const active = S.activeCats.has(c);
1548
+ const el = document.createElement('div');
1549
+ el.className = 'pr-cat' + (active ? ' on' : ' off');
1550
+ el.dataset.cat = c;
1551
+ el.style.setProperty('--cat', m.color);
1552
+ el.innerHTML = `
1553
+ <span class="pr-cat-sw" style="background:${m.color}"></span>
1554
+ <span class="pr-cat-nm">${m.label}</span>
1555
+ <span class="pr-cat-ct">${count}</span>
1556
+ <span class="pr-cat-mini"><span style="width:${share.toFixed(1)}%;background:${m.color}"></span></span>`;
1557
+ el.addEventListener('click', () => toggleCat(c));
1558
+ box.appendChild(el);
1559
+ }
1560
+ }
1561
+
1562
+ function renderSpeakers(){
1563
+ const names = Object.keys(S.speakers);
1564
+ const block = document.getElementById('speakers-block');
1565
+ const box = document.getElementById('speakers-list');
1566
+ if (!names.length) { block.style.display = 'none'; return; }
1567
+ block.style.display = 'block';
1568
+ box.innerHTML = '';
1569
+ for (const n of names) {
1570
+ const el = document.createElement('div');
1571
+ el.className = 'pr-cat';
1572
+ el.innerHTML = `<span class="pr-cat-sw"></span><span class="pr-cat-nm">${esc(n)}</span><span class="pr-cat-ct">${S.speakers[n]}</span>`;
1573
+ box.appendChild(el);
1574
+ }
1575
+ }
1576
+
1577
+ function toggleCat(c){
1578
+ if (S.activeCats.has(c)) S.activeCats.delete(c);
1579
+ else S.activeCats.add(c);
1580
+ const on = S.activeCats.has(c);
1581
+ document.querySelectorAll(`.pr-cat[data-cat="${c}"]`).forEach(el => { el.classList.toggle('on', on); el.classList.toggle('off', !on); });
1582
+ document.querySelectorAll(`.pr-leg[data-cat="${c}"]`).forEach(el => el.classList.toggle('off', !on));
1583
+ document.querySelectorAll(`.h[data-cat="${c}"]`).forEach(el => el.classList.toggle('off', !on));
1584
+ document.querySelectorAll(`.pr-bar span[data-cat="${c}"]`).forEach(el => el.style.opacity = on ? '1' : '.25');
1585
+ updateToggleAllLabel();
1586
+ }
1587
+
1588
+ function updateToggleAllLabel(){
1589
+ const btn = document.getElementById('cat-toggle-all');
1590
+ if (!btn) return;
1591
+ const all = Object.keys(S.stats.categories||{});
1592
+ const allOn = all.length > 0 && all.every(c => S.activeCats.has(c));
1593
+ btn.textContent = allOn ? 'Clear all' : 'Select all';
1594
+ }
1595
+ document.getElementById('cat-toggle-all').addEventListener('click', () => {
1596
+ const all = Object.keys(S.stats.categories||{});
1597
+ const allOn = all.every(c => S.activeCats.has(c));
1598
+ all.forEach(c => {
1599
+ const want = !allOn;
1600
+ if (want !== S.activeCats.has(c)) toggleCat(c);
1601
+ });
1602
+ });
1603
+
1604
+ /* tooltip */
1605
+ function showTip(ev, text){ const t = document.getElementById('tip'); t.textContent = text; t.style.display = 'block'; moveTip(ev); }
1606
+ function moveTip(ev){ const t = document.getElementById('tip'); t.style.left = (ev.clientX + 12) + 'px'; t.style.top = (ev.clientY - 26) + 'px'; }
1607
+ function hideTip(){ document.getElementById('tip').style.display = 'none'; }
1608
+
1609
+ /* ===== actions ===== */
+ function sanitizedText(){
+ const parts = []; let pos = 0;
+ for (const sp of S.sortedSpans) {
+ if (sp.start < pos) continue;
+ if (sp.start > pos) parts.push(S.text.substring(pos, sp.start));
+ const m = metaFor(sp.label);
+ parts.push(S.activeCats.has(sp.label) ? `[${m.label.toUpperCase()}]` : S.text.substring(sp.start, sp.end));
+ pos = sp.end;
+ }
+ if (pos < S.text.length) parts.push(S.text.substring(pos));
+ return parts.join('');
+ }
+
+ function download(name, content, type){
+ const blob = content instanceof Blob ? content : new Blob([content], { type: type || 'text/plain' });
+ const a = document.createElement('a');
+ a.href = URL.createObjectURL(blob); a.download = name;
+ document.body.appendChild(a); a.click(); a.remove();
+ setTimeout(() => URL.revokeObjectURL(a.href), 1000);
+ }
+
+ function baseName(){
+ const f = S.filename || 'document';
+ const i = f.lastIndexOf('.');
+ return i > 0 ? f.slice(0, i) : f;
+ }
+
+ document.getElementById('act-txt').addEventListener('click', () => {
+ download(baseName() + '.redacted.txt', sanitizedText(), 'text/plain');
+ flash('act-txt', 'Exported');
+ });
+ document.getElementById('act-copy').addEventListener('click', async () => {
+ try { await navigator.clipboard.writeText(sanitizedText()); flash('act-copy', 'Copied'); }
+ catch { flash('act-copy', 'Copy failed'); }
+ });
+ document.getElementById('act-report').addEventListener('click', () => {
+ const report = {
+ filename: S.filename,
+ scanned_in_ms: Math.round(S.scanMs),
+ stats: S.stats,
+ speakers: S.speakers,
+ active_categories: [...S.activeCats],
+ spans: S.spans,
+ };
+ download(baseName() + '.report.json', JSON.stringify(report, null, 2), 'application/json');
+ flash('act-report', 'Downloaded');
+ });
1657
+
1658
+ document.getElementById('act-pdf').addEventListener('click', async () => {
1659
+ if (!isPdf()) return;
1660
+ if (!S.file) {
1661
+ alert('Original PDF reference lost β€” upload again to export a redacted PDF.');
1662
+ return;
1663
+ }
1664
+ if (!S.activeCats.size) {
1665
+ alert('No categories selected β€” enable at least one category in the sidebar before redacting.');
1666
+ return;
1667
+ }
1668
+ const btn = document.getElementById('act-pdf');
1669
+ const labelSpan = btn.querySelector('span');
1670
+ const original = labelSpan.textContent;
1671
+ labelSpan.textContent = 'Generating…';
1672
+ btn.disabled = true;
1673
+ try {
1674
+ const form = new FormData();
1675
+ form.append('file', S.file);
1676
+ form.append('spans', JSON.stringify(S.spans));
1677
+ form.append('active', JSON.stringify([...S.activeCats]));
1678
+ const r = await fetch('/api/redact-pdf', { method:'POST', body: form });
1679
+ if (!r.ok) {
1680
+ let err = `Redaction failed (${r.status})`;
1681
+ try { const j = await r.json(); err = j.error || err; } catch {}
1682
+ throw new Error(err);
1683
+ }
1684
+ const blob = await r.blob();
1685
+ download(baseName() + '.redacted.pdf', blob, 'application/pdf');
1686
+ labelSpan.textContent = 'Downloaded';
1687
+ } catch (e) {
1688
+ labelSpan.textContent = 'Failed';
1689
+ alert(e.message || 'Redaction failed');
1690
+ } finally {
1691
+ btn.disabled = false;
1692
+ setTimeout(() => { labelSpan.textContent = original; }, 1500);
1693
+ }
1694
+ });
1695
+
1696
+ const _flashTimers = {};
1697
+ function flash(id, msg){
1698
+ const btn = document.getElementById(id);
1699
+ const span = btn.querySelector('span');
1700
+ const prev = span ? span.textContent : btn.textContent;
1701
+ if (span) span.textContent = msg; else btn.textContent = msg;
1702
+ clearTimeout(_flashTimers[id]);
1703
+ _flashTimers[id] = setTimeout(() => { if (span) span.textContent = prev; else btn.textContent = prev; }, 1300);
1704
+ }
1705
+ </script>
1706
+ </body>
1707
+ </html>"""
1708
+
1709
+ # ── launch ───────────────────────────────────────────────────────
1710
+ if __name__ == "__main__":
1711
+ server.launch(server_name="0.0.0.0", server_port=7860)