LucasLooTan committed
Commit 277d6c0 · 1 Parent(s): 0e615a1

docs(readme): tighten Why AMD + add For Deaf-led teams section


HF Space card now mirrors the walkthrough's substrate-not-product
framing and concrete MI300X-vs-H100 memory headroom story:
- 'Why AMD' replaced with a single tight paragraph: 192 GB HBM3 +
5.3 TB/s bandwidth fits the V2 70B-reasoner upgrade that NVIDIA
H100 cannot match without a 3-GPU cluster.
- New 'For Deaf-led teams' section between Privacy and Local dev,
pointing at docs/walkthrough.md → Deployment ethics for the full
Deaf-community engagement principles.
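The memory-headroom story this commit adds to the README can be sanity-checked with rough arithmetic. A minimal sketch, assuming bf16 weights (2 bytes per parameter), an XTTS-v2 size of roughly 0.5B parameters, and a hypothetical 20% runtime margin for KV cache and activations (none of these figures come from the repo):

```python
# Rough VRAM arithmetic behind the "Why AMD" memory-headroom claim.
# Assumptions (not from the repo): bf16 weights (2 bytes/param),
# XTTS-v2 ~0.5B params, ~20% allowance for KV cache and activations.

GIB = 1024**3

def weights_gib(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate weight footprint in GiB for a dense model."""
    return params_billion * 1e9 * bytes_per_param / GIB

# V1 pipeline: Qwen3-VL-8B + Llama-3.1-8B + XTTS-v2 (~0.5B, assumed).
v1 = sum(weights_gib(p) for p in (8, 8, 0.5))

# V2 plan: swap the 8B reasoner for a 70B reasoner.
v2 = sum(weights_gib(p) for p in (8, 70, 0.5))

overhead = 1.2  # hypothetical 20% runtime margin
print(f"V1 ~{v1 * overhead:.0f} GiB, V2 ~{v2 * overhead:.0f} GiB")
```

Under these assumptions the V1 stack lands well inside 192 GB of HBM3, while the V2 weights alone (~146 GiB before overhead) already exceed a single 80 GB H100, which is the shape of the claim in the commit message.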

Files changed (1): README.md +5 -1
README.md CHANGED
@@ -44,7 +44,7 @@ V1 is **one-way**: deaf signs → hearing hears. Reverse direction (speech → o
 
 ## Why AMD
 
-The MI300X's 192 GB HBM3 and 5.3 TB/s memory bandwidth let the entire multi-stage pipeline (sign classifier + Llama-3.1-8B + XTTS-v2) run concurrently on a single GPU. Bandwidth-bound streaming workload is the textbook MI300X use case. Practical accessibility tools running globally need the cost-and-availability profile that AMD enables.
+The MI300X's 192 GB HBM3 fits the entire pipeline (Qwen3-VL-8B + Llama-3.1-8B + XTTS-v2) on one GPU with margin. NVIDIA H100 (80 GB) requires sharding, and the V2 plan to upgrade to a 70B reasoner is impossible on H100 without a 3-GPU cluster. Single-GPU concurrency + 5.3 TB/s memory bandwidth is the actual AMD pitch — practical accessibility tools running globally need the cost-and-availability profile that AMD enables.
 
 ## Why this matters (business case)
 
@@ -54,6 +54,10 @@ Sign-language interpreters cost **$50–200 per hour** and are scarce. Courts, h
 
 Session-only. Frames and audio are processed in-memory and not persisted server-side beyond the WebSocket / HTTP session.
 
+## For Deaf-led teams
+
+SignBridge is open-source under MIT license and intentionally scoped to ASL-only V1. The pipeline is a substrate, not a finished product — Deaf-led organisations (schools-for-the-Deaf, NGOs, ministries) are the intended deployers. Other sign languages (BSL, MSL, CSL, ISL, +200 more) deserve their own teams, training data, and Deaf community leadership. See [`docs/walkthrough.md`](docs/walkthrough.md) → "Deployment ethics" for the design principles drawn from the Deaf-led academic literature.
+
 ## Local dev
 
 ```bash