shapes-mhr

MHR (Momentum Human Rig) articulated body model fits exported by the src/sam3d_grounding/run.py pipeline in sam3d-for-shape-corr.

Directory Layout

```
shapes-mhr/
└── <dataset>/               # one subfolder per dataset (e.g., faust_r)
    ├── mhr/                 # MHR outputs — one pair of files per input mesh
    │   ├── <name>.obj             # Aligned MHR mesh
    │   └── <name>_mhr_params.npy  # MHR parameters & alignment transforms
    └── off/                 # Original input meshes (copied from source dataset)
        └── <name>.off
```
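Given this layout, the `.obj` / `.npy` / `.off` triples in one dataset subfolder can be enumerated with a short helper (a sketch; `iter_mhr_pairs` is an illustrative name, not part of the pipeline):

```python
from pathlib import Path

def iter_mhr_pairs(dataset_dir):
    """Yield (obj_path, params_path, off_path) triples for one dataset subfolder.

    Assumes the directory layout shown above; `dataset_dir` is e.g.
    shapes-mhr/faust_r.
    """
    dataset_dir = Path(dataset_dir)
    for obj_path in sorted((dataset_dir / "mhr").glob("*.obj")):
        name = obj_path.stem
        params_path = obj_path.with_name(f"{name}_mhr_params.npy")
        off_path = dataset_dir / "off" / f"{name}.off"
        if params_path.exists() and off_path.exists():
            yield obj_path, params_path, off_path
```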

File Formats

mhr/<name>.obj — Aligned MHR Mesh

A triangulated Wavefront OBJ mesh of the fitted MHR body model, aligned to the corresponding input shape.

| Property | Value |
|---|---|
| Vertices | 18 439 (fixed — shared topology across all shapes) |
| Faces | 36 874 (fixed — shared topology across all shapes) |
| Coordinate unit | metres (same scale as the source dataset) |
| Axis convention | Y-up, same as source dataset axis order |

Because all shapes in a dataset share the same MHR topology, vertex index i of mesh A corresponds to the same anatomical location as vertex index i of mesh B — enabling direct vertex-to-vertex correspondence across shapes.
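For example, with a minimal OBJ vertex reader (stdlib + NumPy; `load_obj_vertices` is an illustrative helper, not part of the pipeline), per-vertex displacements between two shapes fall out directly from the shared indexing:

```python
import numpy as np

def load_obj_vertices(path):
    """Read the (V, 3) vertex array from a Wavefront OBJ file."""
    verts = []
    with open(path) as f:
        for line in f:
            if line.startswith("v "):  # vertex lines only (skips vn/vt/f)
                verts.append([float(x) for x in line.split()[1:4]])
    return np.asarray(verts, dtype=np.float32)

# Shared topology: vertex i on mesh A matches vertex i on mesh B, so
# per-vertex displacements are just a subtraction:
# va = load_obj_vertices("A.obj"); vb = load_obj_vertices("B.obj")
# offsets = vb - va  # (18439, 3)
```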

mhr/<name>_mhr_params.npy — Parameters

A NumPy .npy file containing a pickled Python dict (load with allow_pickle=True) with the following keys:

| Key | Shape | dtype | Description |
|---|---|---|---|
| identity_coeffs | (B, 45) | float32 | MHR shape identity coefficients (body, head, hands) |
| model_parameters | (B, 204) | float32 | MHR joint angles and scalings |
| face_expr_coeffs | (B, 72) | float32 | MHR facial expression blendshape coefficients |
| R | (B, 3, 3) | float32 | Per-view rotation matrices (MHR → input mesh space) |
| t | (B, 3) | float32 | Per-view translation vectors |
| s | (B,) | float32 | Per-view scale scalars |

B is the number of camera views used in the final surface optimisation stage (default: 3 best views selected by Chamfer distance).

Important — params vs. mesh: The exported .obj is produced by generating a mesh from each of the B parameter sets independently, then averaging their vertex coordinates (not the parameters). The B rows in identity_coeffs / model_parameters are therefore B distinct parameter solutions — each a plausible fit optimised from a different camera viewpoint. When using these params for training or inference, treat each row as an independent candidate solution rather than as a batch that should be averaged or pooled together.
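As a sketch of using the rows independently, each per-view similarity transform can be applied to a canonical MHR vertex array on its own. The composition order s · R · v + t below is an assumption, not taken from the pipeline — verify it against src/sam3d_grounding/run.py if exact alignment matters:

```python
import numpy as np

def mhr_to_input_space(verts, R, t, s):
    """Map canonical MHR vertices (V, 3) into input-mesh space for one view.

    Assumes the similarity transform composes as s * R @ v + t (hypothetical
    order — check against the pipeline before relying on it).
    """
    return s * verts @ R.T + t

# One candidate per row — transform each independently; do not average the
# parameter rows themselves:
# aligned = [mhr_to_input_space(verts, R[i], t[i], s[i]) for i in range(len(s))]
```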

Loading example:

```python
import numpy as np

params = np.load("tr_reg_000_mhr_params.npy", allow_pickle=True).item()
identity_coeffs  = params["identity_coeffs"]   # (B, 45)
model_parameters = params["model_parameters"]  # (B, 204)
R, t, s          = params["R"], params["t"], params["s"]
```

Generation Pipeline

Each MHR mesh is produced by a 6-step pipeline:

  1. Multi-view rendering — input mesh rendered from multiple camera angles (azimuth × elevation grid).
  2. SAM 3D Body inference — keypoints and depth extracted per view.
  3. Keypoint triangulation — 3D keypoints triangulated from multi-view 2D detections.
  4. Keypoint optimisation — MHR model_parameters optimised to match triangulated keypoints.
  5. Surface optimisation — identity_coeffs and model_parameters jointly refined using Chamfer distance against the (optionally trimmed) input mesh, with optional bone-length and keypoint regularisation terms.
  6. Multi-view merging — per-view MHR vertices averaged (weighted by Chamfer distance) to produce the final mesh.
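Step 6 can be sketched as a weighted vertex average. The inverse-Chamfer weighting below is an illustrative choice standing in for "weighted by Chamfer distance" — the pipeline's exact weighting scheme may differ:

```python
import numpy as np

def merge_views(vertex_sets, chamfer_dists):
    """Merge per-view MHR vertex sets (B, V, 3) into one (V, 3) mesh.

    Views with lower Chamfer distance get higher weight; inverse-distance
    weighting is a sketch, not necessarily the pipeline's exact scheme.
    """
    vertex_sets = np.asarray(vertex_sets, dtype=np.float64)
    w = 1.0 / (np.asarray(chamfer_dists, dtype=np.float64) + 1e-8)
    w /= w.sum()  # normalise weights to sum to 1
    return np.einsum("b,bvc->vc", w, vertex_sets)
```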

Datasets

| Subfolder | Source | # Shapes | Input format |
|---|---|---|---|
| faust_r | FAUST remeshed | 100 | .off |

Correspondence

Because MHR meshes share a fixed topology (18 439 vertices, 36 874 faces), a dense vertex-to-vertex correspondence between any two shapes in the same dataset is given directly by index — vertex i in one MHR mesh maps to vertex i in any other.

This can be exploited for:

  • Spectral (functional map) correspondence methods
  • Texture transfer across shapes
  • Training data generation for shape correspondence networks
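One way to exploit this in practice is to lift the index correspondence back to the original .off meshes via nearest neighbours: orig_A → nearest MHR_A vertex → same index on MHR_B → orig_B. A brute-force NumPy sketch (chunked so the distance matrix stays small against the 18 439 MHR vertices; `nearest_mhr_index` is a hypothetical helper):

```python
import numpy as np

def nearest_mhr_index(query_verts, mhr_verts, chunk=1024):
    """For each query vertex (N, 3), index of the nearest MHR vertex (V, 3).

    Brute-force nearest neighbour, processed in chunks to bound memory;
    a KD-tree would be faster for repeated queries.
    """
    out = np.empty(len(query_verts), dtype=np.int64)
    for i in range(0, len(query_verts), chunk):
        q = query_verts[i:i + chunk]
        # (chunk, V) pairwise squared distances
        d2 = ((q[:, None, :] - mhr_verts[None, :, :]) ** 2).sum(-1)
        out[i:i + chunk] = d2.argmin(axis=1)
    return out
```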