
# Study Pipeline

## Overview

A one-page web app that runs a perceptual discrimination study. Participants see images one at a time and judge whether colorization artifacts are present or absent. No login, no score reveal: pure signal-detection data collection.


## Participant Flow

```
Welcome / Colorblindness Screen
  └─ 3 Ishihara plates (demo → red-green → blue-yellow)
  └─ Expertise dropdown (novice / hobbyist / professional / researcher)
        │
        ▼
Tutorial (3 steps)
  Step 1 — artifact reference: what artifacts look like (static explainer image)
  Step 2 — practice real: participant sees a ground-truth image, makes a judgment, gets feedback
  Step 3 — practice fake: participant sees a colorized image, makes a judgment, gets feedback
        │
        ▼
50-Trial Loop
  ├─ Image displayed (fixed-size container, objectFit: contain)
  ├─ Participant responds: ABSENT (←) or PRESENT (→)
  │     • Swipe right / → key / green button = PRESENT (artifacts detected)
  │     • Swipe left  / ← key / red button   = ABSENT  (no artifacts)
  ├─ Response + timing logged to backend (fire-and-forget)
  └─ Repeat until trial 50
        │
        ▼
Done Screen
  └─ Thank-you message + share link
```

## Image Sampling (per session)

- 10 ground-truth (real) images sampled randomly from the 15 available
- 8 colorized (fake) images per method × 5 methods = 40
- Total: 50 trials, shuffled
- Balance enforced: within each method, 1 image is drawn from each of the 6 (variant × dataset) groups (2 variants × 3 datasets = 6), plus 2 extra drawn randomly from the method's remaining pool
- Ground truth varies session-to-session: the same 15 base scenes, but a different 10 selected each time

**Methods:** `bigcolor`, `ddcolor`, `disco`, `unicolor`, `mixed`
**Variants:** `ortho`, `standard`
**Datasets:** `coco`, `imagenet`, `instance`
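The sampling rule above can be sketched in plain Python. This is an illustrative sketch, not the code in `routes/session.py`: the manifest layout and helper names (`sample_session`, the `(variant, dataset)` group keys) are assumptions.

```python
import random

METHODS = ["bigcolor", "ddcolor", "disco", "unicolor", "mixed"]
VARIANTS = ["ortho", "standard"]
DATASETS = ["coco", "imagenet", "instance"]

def sample_session(manifest, rng=random):
    """Draw 50 trial images: 10 real + 8 fake per method, group-balanced."""
    # 10 of the 15 ground-truth scenes, a fresh draw each session
    trials = rng.sample(manifest["real"], 10)
    for method in METHODS:
        # one image from each of the 6 (variant x dataset) groups
        picked = [rng.choice(manifest["fake"][method][(v, d)])
                  for v in VARIANTS for d in DATASETS]
        # 2 extra drawn from the method's remaining pool
        pool = [img for group in manifest["fake"][method].values()
                for img in group if img not in picked]
        picked += rng.sample(pool, 2)
        trials += picked
    rng.shuffle(trials)
    return trials
```

Passing an explicit `random.Random(seed)` as `rng` makes a session reproducible, which is handy for debugging the balance constraint.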


## Response Encoding

| User action | Button | Color | `response` value stored | Meaning |
|---|---|---|---|---|
| Swipe right / → | PRESENT | Green | `fake` | Participant detected artifacts |
| Swipe left / ← | ABSENT | Red | `real` | Participant saw no artifacts |

The `label` field encodes ground truth: `fake` (colorized) or `real` (ground-truth photo).
`correct = 1` when `response == label`.
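A minimal sketch of the scoring rule, using the field values from the table above (the function names are illustrative, not taken from the codebase):

```python
def score(response: str, label: str) -> int:
    """1 if the participant's judgment matches ground truth, else 0.

    response: "fake" (swiped PRESENT) or "real" (swiped ABSENT)
    label:    "fake" (colorized)      or "real" (ground-truth photo)
    """
    return int(response == label)

def outcome(response: str, label: str) -> str:
    """Classic signal-detection outcome for one trial."""
    if label == "fake":
        return "hit" if response == "fake" else "miss"
    return "false alarm" if response == "fake" else "correct rejection"
```

Framing each trial this way gives the four signal-detection cells (hit / miss / false alarm / correct rejection) directly from the stored `response` and `label` columns.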


## Tech Stack

| Layer | Tech |
|---|---|
| Frontend | React 18 + Vite + MUI v5 (dark theme) |
| Backend | FastAPI + aiosqlite |
| Database | SQLite (local: `./responses.db`; Fly.io: `/app/data/responses.db`) |
| Deployment | Fly.io (single Docker container, persistent volume) |
| Session state | `localStorage` (resume on reload, UUID per participant) |
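The backend talks to SQLite through aiosqlite; for brevity, the schema setup might look like the following, shown here with the stdlib `sqlite3` instead of the async driver. The column names are assumptions for illustration, not copied from `db.py`.

```python
import os
import sqlite3

# Fly.io mounts the persistent volume at /app/data; fall back to a local file.
DB_PATH = "/app/data/responses.db" if os.path.isdir("/app/data") else "./responses.db"

SCHEMA = """
CREATE TABLE IF NOT EXISTS responses (
    session_id  TEXT NOT NULL,      -- UUID minted per participant
    trial_index INTEGER NOT NULL,   -- 0..49
    image       TEXT NOT NULL,      -- image path from manifest.json
    method      TEXT,               -- NULL for ground-truth images
    response    TEXT NOT NULL,      -- 'fake' or 'real'
    label       TEXT NOT NULL,      -- ground truth: 'fake' or 'real'
    correct     INTEGER NOT NULL,   -- 1 if response == label
    rt_ms       INTEGER             -- response time in milliseconds
);
"""

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Open the database and make sure the responses table exists."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

In the real app the same statements run through `aiosqlite.connect(...)` with `await`, but the SQL is identical either way.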

## Key Files

```
colorization-webapp/
├── backend/
│   ├── main.py            — FastAPI app, static mounts
│   ├── db.py              — SQLite init, connection
│   └── routes/
│       ├── session.py     — /api/session/start, /respond, /complete + sampling logic
│       └── results.py     — /api/results/csv, /summary (key-protected)
├── frontend/src/
│   ├── pages/
│   │   ├── Welcome.jsx    — colorblindness plates + expertise
│   │   ├── Tutorial.jsx   — 3-step practice
│   │   ├── Trial.jsx      — main 50-trial loop
│   │   └── Done.jsx       — thank-you + share
│   └── components/
│       ├── SwipeCard.jsx  — touch/drag swipe handler
│       └── ProgressBar.jsx
├── image_samples/         — 165 images (served at /images/...)
├── tutorial/              — tutorial assets + Ishihara plates
├── manifest.json          — image metadata index
├── Dockerfile
└── fly.toml
```

## Admin Endpoints

Both require `?key=colorturingtest2025`.

| Endpoint | Description |
|---|---|
| `GET /api/results/csv?key=colorturingtest2025` | Download the full response CSV |
| `GET /api/results/summary?key=colorturingtest2025` | JSON summary: per-method detection rates, overall accuracy |
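The summary the second endpoint returns can be computed from stored rows roughly like this. This is a sketch, not the code in `routes/results.py`: the row-dict shape and the `summarize` name are assumptions.

```python
from collections import defaultdict

def summarize(rows):
    """Aggregate trial rows into per-method detection rates + overall accuracy.

    rows: iterable of dicts with 'method', 'label', 'response', 'correct'.
    Detection rate for a method = fraction of its colorized images that
    participants flagged as PRESENT (response == 'fake').
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    n_correct = n = 0
    for r in rows:
        n += 1
        n_correct += r["correct"]
        if r["label"] == "fake":
            totals[r["method"]] += 1
            hits[r["method"]] += r["response"] == "fake"
    return {
        "overall_accuracy": n_correct / n if n else None,
        "detection_rate": {m: hits[m] / totals[m] for m in totals},
    }
```

Note that overall accuracy mixes real and fake trials, while detection rate is conditioned on fake trials only, so the two can diverge when participants are biased toward one response.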