JakgritB committed on
Commit d8bf65d · 1 Parent(s): a05a82e

docs: add Hugging Face Space landing page

Add static Space assets and upload instructions for the lablab.ai AMD Developer Hackathon public demo page.

Files changed (3):
  1. hf-space/README.md +52 -0
  2. hf-space/UPLOAD.md +36 -0
  3. hf-space/index.html +393 -0
hf-space/README.md ADDED
@@ -0,0 +1,52 @@
# ElevenClip.AI

ElevenClip.AI is an AI clip studio that turns long-form videos into personalized short-form clips for TikTok, YouTube Shorts, and Instagram Reels.

This Space is the public demo page for our AMD Developer Hackathon project on lablab.ai.

## Hackathon Track

**Track 3 - Vision & Multimodal AI**

ElevenClip.AI works across video, audio, text, and rendered subtitle output:

- **Audio**: Whisper Large V3 transcription through Hugging Face.
- **Text**: Qwen2.5 highlight scoring from timestamped transcripts.
- **Video**: optional Qwen2-VL visual analysis for reactions, scene changes, and on-screen text.
- **Rendered media**: ffmpeg exports vertical short-form clips with subtitles.
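The rendered-media step can be sketched as an ffmpeg command builder. This is a minimal illustration, not the project's actual renderer: the function name, file paths, and output size are hypothetical, and only standard ffmpeg options (`-ss`, `-to`, `-vf`, the `crop`/`scale`/`subtitles` filters) are used.

```python
def build_clip_cmd(src, start, end, out, subs=None, size=(1080, 1920)):
    """Build an ffmpeg argv list that trims [start, end] from src and
    exports a vertical (9:16) clip, optionally burning in subtitles."""
    width, height = size
    # Crop the centre of the frame to 9:16, then scale to the target size.
    vf = f"crop=ih*{width}/{height}:ih,scale={width}:{height}"
    if subs:
        vf += f",subtitles={subs}"  # burn an SRT/ASS file into the video
    return [
        "ffmpeg", "-y",
        "-i", src,
        # Output-side trim: slower than input seeking, but keeps the
        # subtitle filter aligned with the source timestamps.
        "-ss", str(start), "-to", str(end),
        "-vf", vf,
        "-c:a", "aac",
        out,
    ]

cmd = build_clip_cmd("talk.mp4", 61.5, 92.0, "clip01.mp4", subs="clip01.srt")
print(" ".join(cmd))
```

The command list can be handed to `subprocess.run` once ffmpeg is on the PATH.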
## Why It Matters

Creators often publish long videos but still need short clips for discovery platforms. Finding the best moments, trimming clips, writing subtitles, and exporting vertical videos can take hours.

ElevenClip.AI automates the first pass:

1. Upload a video or paste a YouTube URL.
2. Transcribe the content with Whisper.
3. Use Qwen to identify high-engagement highlight moments.
4. Render short-form clips with subtitles.
5. Let the creator trim, edit subtitles, approve, regenerate, and download final clips.
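Steps 2-4 above can be sketched as a small selection routine. This is a hedged sketch, not the repo's code: the Whisper and Qwen calls are stubbed out behind a hypothetical `score_segment` callable, since real inference runs on the backend.

```python
def select_highlights(segments, score_segment, top_k=3, min_len=8.0):
    """Pick the top-k transcript segments as highlight candidates.

    segments: Whisper-style timestamped output, a list of dicts with
    'start', 'end', and 'text'. score_segment: callable returning a float,
    standing in for the Qwen2.5 engagement scorer.
    """
    scored = [
        (score_segment(seg["text"]), seg)
        for seg in segments
        if seg["end"] - seg["start"] >= min_len  # skip clips too short to publish
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [seg for _, seg in scored[:top_k]]

# Toy scorer: favour segments that sound like actionable advice.
def toy_score(text):
    return sum(text.lower().count(word) for word in ("how", "tip", "secret"))

segments = [
    {"start": 0.0, "end": 12.0, "text": "Welcome back to the channel."},
    {"start": 12.0, "end": 30.0, "text": "Here is the secret tip on how to edit faster."},
    {"start": 30.0, "end": 34.0, "text": "Short aside."},
]
best = select_highlights(segments, toy_score, top_k=1)
print(best[0]["start"], best[0]["end"])  # → 12.0 30.0
```

The selected (start, end) windows are what the rendering step then cuts and subtitles.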
## AMD + ROCm Plan

The production backend is designed for:

- AMD Developer Cloud
- AMD Instinct MI300X
- ROCm 6.x
- PyTorch with ROCm support
- vLLM ROCm backend for Qwen inference
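As a rough sketch of the serving plan, the Qwen endpoint could be launched with a `vllm serve` command like the one composed below. The model ID and flag values are illustrative assumptions, not the project's final config; on a ROCm build of vLLM the MI300X backend is selected by the build itself, not by a flag.

```python
def vllm_serve_cmd(model, port=8000, max_model_len=8192, tp=1):
    """Compose a `vllm serve` argv list for an OpenAI-compatible endpoint.

    The flags below are generic vLLM options (--port, --max-model-len,
    --tensor-parallel-size); nothing here is AMD-specific.
    """
    return [
        "vllm", "serve", model,
        "--port", str(port),
        "--max-model-len", str(max_model_len),
        "--tensor-parallel-size", str(tp),
    ]

print(" ".join(vllm_serve_cmd("Qwen/Qwen2.5-7B-Instruct")))
```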
The current local demo mode validates the upload, clipping, subtitle rendering, and human editing workflow. Real Whisper and Qwen inference will be enabled on AMD Developer Cloud once MI300X credits are active.

## Links

- GitHub: https://github.com/JakgritB/ElevenClip.AI
- Hackathon: https://lablab.ai/ai-hackathons/amd-developer

## Status

- Local MVP: working
- Hugging Face Space: landing page
- AMD Cloud credits: requested
- Real MI300X benchmark: pending
hf-space/UPLOAD.md ADDED
@@ -0,0 +1,36 @@
# How to Update the Hugging Face Space

Space URL:

```text
https://huggingface.co/spaces/lablab-ai-amd-developer-hackathon/ElevenClip-AI
```

## Manual Upload

1. Open the Space.
2. Click `Files`.
3. Open `index.html`.
4. Click the edit pencil.
5. Replace the default file with the contents of `hf-space/index.html`.
6. Commit the change.
7. Add or edit `README.md`.
8. Replace it with the contents of `hf-space/README.md`.
9. Commit the change.
10. Go back to the `App` tab.

The Space should now show the ElevenClip.AI landing page instead of the default static Space message.

## Git Option

If you have Hugging Face Git credentials configured:

```bash
git clone https://huggingface.co/spaces/lablab-ai-amd-developer-hackathon/ElevenClip-AI
cp hf-space/index.html ElevenClip-AI/index.html
cp hf-space/README.md ElevenClip-AI/README.md
cd ElevenClip-AI
git add index.html README.md
git commit -m "Add ElevenClip.AI landing page"
git push
```
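Besides the manual and Git options, the same two files can be pushed programmatically with `huggingface_hub`. This is a hedged sketch: `HfApi.upload_file` is the library's real API, but the helper below is hypothetical, and you need `huggingface_hub` installed plus a token with write access (e.g. via `huggingface-cli login`).

```python
REPO_ID = "lablab-ai-amd-developer-hackathon/ElevenClip-AI"

def upload_plan(local_dir="hf-space"):
    """Map the local Space assets to their destination paths in the Space repo."""
    return [(f"{local_dir}/{name}", name) for name in ("index.html", "README.md")]

def push_to_space():
    """Upload both files; requires huggingface_hub and a write token."""
    from huggingface_hub import HfApi  # deferred so the sketch runs without it

    api = HfApi()
    for local, remote in upload_plan():
        api.upload_file(
            path_or_fileobj=local,
            path_in_repo=remote,
            repo_id=REPO_ID,
            repo_type="space",  # Spaces are a distinct repo type from models
        )
```

Calling `push_to_space()` commits each file to the Space, after which the `App` tab rebuilds automatically.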
hf-space/index.html ADDED
@@ -0,0 +1,393 @@
<!doctype html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0" />
  <title>ElevenClip.AI</title>
  <style>
    :root {
      color-scheme: dark;
      font-family: Inter, ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif;
      background: #07111f;
      color: #f8fafc;
    }

    * {
      box-sizing: border-box;
    }

    body {
      margin: 0;
      background:
        linear-gradient(120deg, rgba(20, 184, 166, 0.2), transparent 34%),
        linear-gradient(150deg, rgba(245, 158, 11, 0.14), transparent 48%),
        #07111f;
    }

    a {
      color: inherit;
    }

    .shell {
      max-width: 1120px;
      margin: 0 auto;
      padding: 40px 20px 56px;
    }

    .nav {
      display: flex;
      align-items: center;
      justify-content: space-between;
      gap: 20px;
      margin-bottom: 72px;
    }

    .brand {
      display: flex;
      align-items: center;
      gap: 12px;
      font-weight: 800;
    }

    .mark {
      display: grid;
      width: 42px;
      height: 42px;
      place-items: center;
      border-radius: 10px;
      background: #f8fafc;
      color: #07111f;
      font-size: 22px;
    }

    .nav-links {
      display: flex;
      flex-wrap: wrap;
      gap: 10px;
    }

    .button {
      display: inline-flex;
      min-height: 42px;
      align-items: center;
      justify-content: center;
      border-radius: 8px;
      border: 1px solid rgba(248, 250, 252, 0.18);
      padding: 0 14px;
      text-decoration: none;
      font-weight: 750;
    }

    .button.primary {
      border-color: #14b8a6;
      background: #14b8a6;
      color: #042f2e;
    }

    .hero {
      display: grid;
      grid-template-columns: minmax(0, 1.05fr) minmax(320px, 0.95fr);
      gap: 36px;
      align-items: center;
    }

    .eyebrow {
      margin: 0 0 14px;
      color: #fbbf24;
      font-size: 0.82rem;
      font-weight: 850;
      text-transform: uppercase;
    }

    h1 {
      margin: 0;
      max-width: 820px;
      font-size: clamp(2.35rem, 6vw, 5.7rem);
      line-height: 0.95;
      letter-spacing: 0;
    }

    .lead {
      max-width: 670px;
      margin: 24px 0 0;
      color: #cbd5e1;
      font-size: 1.16rem;
      line-height: 1.7;
    }

    .actions {
      display: flex;
      flex-wrap: wrap;
      gap: 12px;
      margin-top: 28px;
    }

    .demo {
      overflow: hidden;
      border: 1px solid rgba(148, 163, 184, 0.24);
      border-radius: 10px;
      background: rgba(15, 23, 42, 0.76);
      box-shadow: 0 24px 70px rgba(0, 0, 0, 0.26);
    }

    .demo-top {
      display: flex;
      gap: 8px;
      border-bottom: 1px solid rgba(148, 163, 184, 0.18);
      padding: 14px;
    }

    .dot {
      width: 10px;
      height: 10px;
      border-radius: 999px;
      background: #14b8a6;
    }

    .dot:nth-child(2) {
      background: #fbbf24;
    }

    .dot:nth-child(3) {
      background: #38bdf8;
    }

    .demo-body {
      display: grid;
      gap: 14px;
      padding: 18px;
    }

    .progress {
      height: 10px;
      overflow: hidden;
      border-radius: 999px;
      background: #334155;
    }

    .progress span {
      display: block;
      width: 78%;
      height: 100%;
      border-radius: 999px;
      background: linear-gradient(90deg, #14b8a6, #f59e0b);
    }

    .clip-grid {
      display: grid;
      grid-template-columns: repeat(2, 1fr);
      gap: 12px;
    }

    .clip {
      min-height: 184px;
      border-radius: 8px;
      border: 1px solid rgba(148, 163, 184, 0.2);
      background:
        linear-gradient(180deg, transparent 48%, rgba(0, 0, 0, 0.58)),
        linear-gradient(135deg, #1e293b, #0f766e 52%, #f59e0b);
      padding: 12px;
      display: flex;
      align-items: flex-end;
    }

    .caption {
      color: white;
      font-size: 0.84rem;
      font-weight: 800;
      text-shadow: 0 2px 8px rgba(0, 0, 0, 0.7);
    }

    .section {
      margin-top: 76px;
    }

    .section h2 {
      margin: 0 0 18px;
      font-size: 1.5rem;
    }

    .cards {
      display: grid;
      grid-template-columns: repeat(4, minmax(0, 1fr));
      gap: 14px;
    }

    .card {
      border: 1px solid rgba(148, 163, 184, 0.22);
      border-radius: 10px;
      background: rgba(15, 23, 42, 0.72);
      padding: 18px;
    }

    .card strong {
      display: block;
      margin-bottom: 8px;
    }

    .card p {
      margin: 0;
      color: #cbd5e1;
      line-height: 1.55;
    }

    .status {
      display: grid;
      grid-template-columns: repeat(2, minmax(0, 1fr));
      gap: 14px;
    }

    .status-row {
      display: flex;
      justify-content: space-between;
      gap: 14px;
      border-bottom: 1px solid rgba(148, 163, 184, 0.18);
      padding: 12px 0;
    }

    .status-row span {
      color: #cbd5e1;
    }

    footer {
      margin-top: 76px;
      color: #94a3b8;
      font-size: 0.92rem;
    }

    @media (max-width: 860px) {
      .hero,
      .cards,
      .status {
        grid-template-columns: 1fr;
      }

      .nav {
        align-items: flex-start;
        flex-direction: column;
        margin-bottom: 44px;
      }

      .clip-grid {
        grid-template-columns: 1fr;
      }
    }
  </style>
</head>
<body>
  <main class="shell">
    <nav class="nav">
      <div class="brand">
        <div class="mark">✂</div>
        <div>
          <div>ElevenClip.AI</div>
          <small>AMD ROCm multimodal clipping studio</small>
        </div>
      </div>
      <div class="nav-links">
        <a class="button" href="https://github.com/JakgritB/ElevenClip.AI">GitHub</a>
        <a class="button primary" href="https://lablab.ai/ai-hackathons/amd-developer">Hackathon</a>
      </div>
    </nav>

    <section class="hero">
      <div>
        <p class="eyebrow">AMD Developer Hackathon · Track 3</p>
        <h1>Turn long videos into short-form clips.</h1>
        <p class="lead">
          ElevenClip.AI uses Whisper, Qwen, Hugging Face, and AMD ROCm on MI300X to
          find highlight moments, render subtitles, and give creators a human-AI editor
          for TikTok, Shorts, and Reels.
        </p>
        <div class="actions">
          <a class="button primary" href="https://github.com/JakgritB/ElevenClip.AI">View source</a>
          <a class="button" href="#status">Project status</a>
        </div>
      </div>

      <div class="demo" aria-label="ElevenClip demo mockup">
        <div class="demo-top">
          <span class="dot"></span>
          <span class="dot"></span>
          <span class="dot"></span>
        </div>
        <div class="demo-body">
          <strong>Pipeline</strong>
          <div class="progress"><span></span></div>
          <div class="clip-grid">
            <div class="clip"><div class="caption">"The moment viewers stop scrolling"</div></div>
            <div class="clip"><div class="caption">"A practical takeaway in 60 seconds"</div></div>
          </div>
        </div>
      </div>
    </section>

    <section class="section">
      <h2>How It Works</h2>
      <div class="cards">
        <article class="card">
          <strong>1. Ingest</strong>
          <p>Paste a YouTube URL or upload a video file for processing.</p>
        </article>
        <article class="card">
          <strong>2. Transcribe</strong>
          <p>Whisper Large V3 creates timestamped multilingual transcripts.</p>
        </article>
        <article class="card">
          <strong>3. Score</strong>
          <p>Qwen2.5 ranks highlights using creator profile and engagement signals.</p>
        </article>
        <article class="card">
          <strong>4. Edit</strong>
          <p>Creators trim, edit subtitles, approve, regenerate, and download clips.</p>
        </article>
      </div>
    </section>

    <section class="section">
      <h2>Hackathon Fit</h2>
      <div class="cards">
        <article class="card">
          <strong>AMD Cloud</strong>
          <p>Backend target is AMD Developer Cloud with Instinct MI300X.</p>
        </article>
        <article class="card">
          <strong>ROCm</strong>
          <p>Designed for PyTorch ROCm, vLLM ROCm backend, and Optimum-AMD.</p>
        </article>
        <article class="card">
          <strong>Hugging Face</strong>
          <p>Uses HF model hub for Whisper, Qwen2.5, and Qwen2-VL.</p>
        </article>
        <article class="card">
          <strong>Multimodal</strong>
          <p>Combines audio, text, video frames, subtitles, and rendered clips.</p>
        </article>
      </div>
    </section>

    <section id="status" class="section">
      <h2>Project Status</h2>
      <div class="status">
        <div class="card">
          <div class="status-row"><span>Local MVP</span><strong>Working</strong></div>
          <div class="status-row"><span>Upload to clips</span><strong>Working</strong></div>
          <div class="status-row"><span>Subtitle rendering</span><strong>Working</strong></div>
          <div class="status-row"><span>Human editor</span><strong>Working</strong></div>
        </div>
        <div class="card">
          <div class="status-row"><span>AMD Cloud credits</span><strong>Requested</strong></div>
          <div class="status-row"><span>Real Whisper inference</span><strong>Pending</strong></div>
          <div class="status-row"><span>Real Qwen inference</span><strong>Pending</strong></div>
          <div class="status-row"><span>MI300X benchmark</span><strong>Pending</strong></div>
        </div>
      </div>
    </section>

    <footer>
      Built for the AMD Developer Hackathon on lablab.ai. Source code is available on GitHub.
    </footer>
  </main>
</body>
</html>