# Running on Modal + Image Rendering Guide

## 1. Running Training on Modal

### Setup

```bash
pip install modal
modal setup  # Authenticate with your Modal token
```

### Create a Modal Secret for HuggingFace

```bash
modal secret create huggingface-token HF_TOKEN=your_hf_token_here
```

### Deploy & Run

The repo includes `modal_train.py`. Simply:

```bash
cd parametric-floorplan-generator
modal run modal_train.py
```

This will:

1. Spin up a CPU container to generate the 5,000-example synthetic dataset (saved to a Modal Volume)
2. Spin up an **A10G GPU** container to fine-tune Qwen2.5-1.5B-Instruct with LoRA
3. Push the trained model to HuggingFace Hub

### Customize GPU

Edit `modal_train.py` and change the GPU:

```python
@app.function(gpu="A100-40GB", ...)  # or "T4", "H100"
```

### Check Progress

```bash
modal app logs floorplan-trainer
```

---

## 2. Generating Floorplan Images

The model outputs **JSON** with room polygons. To convert it to visual plans:

### Option A: SVG (Vector, best for CAD/printing)

```bash
# Generate a floorplan first
python generate.py \
  --plot_length 15 --plot_width 12 \
  --setback_front 1.5 --setback_rear 1.0 \
  --setback_left 1.0 --setback_right 1.0 \
  --road_side N --num_bedrooms 3 --toilets 3 \
  --parking --has_pooja --has_balcony \
  --num_floors 2 --city Delhi > myhouse.json

# Render to SVG
python render_floorplan.py --input myhouse.json --output myhouse.svg
```

The SVG includes:

- **Plot boundary** (thick black line)
- **Buildable boundary** (dashed gray)
- **Rooms** color-coded by type (living=blue, bedroom=orange, kitchen=purple, etc.)
- **Door openings** (green lines)
- **Windows** (blue dashed lines)
- **Room labels** with names and areas
- **Dimension annotations**
- **North arrow**
- **Legend**

### Option B: PNG (Raster, best for web/presentations)

```bash
pip install cairosvg
python render_floorplan.py --input myhouse.json --output myhouse.png
```

### Option C: Interactive Web Viewer (Gradio)

Deploy as a HuggingFace Space:

```python
import json

import gradio as gr

from render_floorplan import render_floorplan_svg

def generate_and_render(params_json):
    model_output = run_model(params_json)  # your model inference call
    floorplan = json.loads(model_output)
    svg = render_floorplan_svg(floorplan, width=1200)
    return svg

gr.Interface(
    fn=generate_and_render,
    inputs=gr.JSON(label="Project Parameters"),
    outputs=gr.HTML(label="Floorplan SVG"),
    title="Parametric Floorplan Generator",
).launch()
```

### Option D: CAD Export (DXF)

For professional CAD output, extend `render_floorplan.py` to write DXF using `ezdxf`:

```bash
pip install ezdxf
```

Then iterate over the JSON polygons and write each one as a DXF polyline.

---

## 3. Complete Pipeline on Modal

You can also run inference + rendering as a Modal endpoint:

```python
import modal

app = modal.App("floorplan-api")

image = modal.Image.debian_slim().pip_install(
    "transformers", "torch", "accelerate", "cairosvg"
)

@app.function(gpu="T4", image=image)
@modal.web_endpoint(method="POST")
def generate(params: dict):
    # 1. Run model inference
    # 2. Render SVG
    # 3. Convert to PNG
    # 4. Return image
    pass
```

This gives you an HTTP API that takes your `ProjectCreate` JSON and returns a PNG floorplan.

---

## 4. Summary of Commands

```bash
# Clone repo
git clone https://huggingface.co/Karthik8nitt/parametric-floorplan-generator
cd parametric-floorplan-generator

# Install locally
pip install transformers trl torch datasets peft accelerate trackio
pip install cairosvg  # for PNG rendering

# Train on Modal
modal run modal_train.py

# Generate floorplan
python generate.py --plot_length 15 --plot_width 12 ... > plan.json

# Render
python render_floorplan.py --input plan.json --output plan.svg
python render_floorplan.py --input plan.json --output plan.png
```
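---

At its core, the rendering step above maps each room polygon in the model's JSON into an SVG `<polygon>` element with a name/area label. Here is a minimal, self-contained sketch of that idea — note the JSON shape (`rooms` with `name`, `type`, `polygon` fields) and all helper names here are simplified assumptions for illustration, not the actual `render_floorplan.py` API:

```python
# Hypothetical color map; the real render_floorplan.py defines its own palette.
ROOM_COLORS = {"living": "#aaccee", "bedroom": "#ffcc88", "kitchen": "#ccaaee"}

def polygon_area(points):
    """Shoelace formula: area of a simple polygon in plot units (m^2)."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def floorplan_to_svg(plan, scale=40):
    """Render each room as a colored SVG polygon with a name + area label."""
    w = plan["plot_length"] * scale
    h = plan["plot_width"] * scale
    parts = [f'<svg xmlns="http://www.w3.org/2000/svg" width="{w}" height="{h}">']
    for room in plan["rooms"]:
        pts = " ".join(f"{x * scale},{y * scale}" for x, y in room["polygon"])
        fill = ROOM_COLORS.get(room["type"], "#dddddd")
        parts.append(f'<polygon points="{pts}" fill="{fill}" stroke="black"/>')
        # Label at the polygon's vertex centroid (good enough for convex rooms).
        cx = sum(x for x, _ in room["polygon"]) / len(room["polygon"]) * scale
        cy = sum(y for _, y in room["polygon"]) / len(room["polygon"]) * scale
        label = f'{room["name"]} ({polygon_area(room["polygon"]):.1f} m2)'
        parts.append(f'<text x="{cx}" y="{cy}" text-anchor="middle">{label}</text>')
    parts.append("</svg>")
    return "\n".join(parts)

plan = {
    "plot_length": 15, "plot_width": 12,
    "rooms": [{"name": "Living", "type": "living",
               "polygon": [[1, 1], [7, 1], [7, 6], [1, 6]]}],
}
svg = floorplan_to_svg(plan)  # label reads "Living (30.0 m2)"
```

The same loop generalizes to the other SVG layers (doors, windows, dimension lines) by emitting `<line>` and `<text>` elements, and to DXF export by replacing the `<polygon>` string with an `ezdxf` polyline call.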