# PerceptPick — pre-prepared assets
|
|
This bundle holds the URDF / VHACD / mesh assets for the YCB-V dataset
across nine mesh sources (oracle CAD plus eight reconstruction methods),
together with FoundationPose and MegaPose pose-estimator CSVs.
|
|
Drop it into a `perceptpick` clone to skip Stage A (`01_prepare_assets.py`)
and the FoundationPose / MegaPose pipelines:
|
|
```bash
git clone <perceptpick>
cd perceptpick

# 1. download the BOP YCBV test split (scenes 48-59 + models)
#    see the README's "Get the YCB-Video dataset" section.

# 2. unpack this bundle next to the repo
unzip perceptpick_assets.zip

# 3. wire the bundle into the expected path
mkdir -p assets
mv perceptpick_assets/ycbv assets/ycbv
```
|
|
After that, jump straight to Stage B / C — no Stage A re-prep needed.
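A quick sanity check after step 3 (a sketch; it only assumes the subdirectory names listed under Layout below):

```shell
# Confirm the GT folder carries the four expected subdirectories
for sub in meshes vhacd urdf pose_estimates; do
  test -d "assets/ycbv/GT/$sub" && echo "ok: $sub" || echo "missing: $sub"
done
```

Any `missing:` line means the bundle was unpacked to the wrong place or the `mv` step was skipped.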
|
|
## Layout
|
|
```
ycbv/
├── GT/                          # oracle CAD (BOP YCBV models)
│   ├── meshes/obj_NNNNNN.{obj,mtl,png}
│   ├── vhacd/obj_NNNNNN_vhacd.obj
│   ├── urdf/obj_NNNNNN.urdf
│   └── pose_estimates/
│       ├── FoundationPose.csv   # FoundationPose on GT meshes
│       └── MegaPose.csv         # MegaPose on GT meshes
├── BakedSDF/                    # one of 8 reconstruction methods
│   ├── meshes/, vhacd/, urdf/
│   └── pose_estimates/
│       ├── FoundationPose.csv   # FoundationPose on BakedSDF
│       └── MegaPose.csv         # MegaPose on BakedSDF
├── MonoSDF/, Nerfacto/, Neuralangelo/
└── NGP/, RealCAP/, UniSurf/, VolSDF/   # same layout as BakedSDF
```
|
|
Each method folder is fully self-contained: the meshes the simulator
loads, the URDFs and VHACDs the physics layer needs, and the pose CSVs
that were generated using *that* mesh as the pose-estimator's reference
model. The CSVs are tiny; the meshes / VHACDs make up almost all of the
disk footprint.
|
|
## URDF paths
|
|
URDFs reference the sibling collision mesh with a relative path:
`<mesh filename="../vhacd/obj_NNNNNN_vhacd.obj"/>`. No absolute paths,
no system-specific roots — the bundle is portable.
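For reference, a minimal URDF of this shape might look like the following (a sketch: only the collision `filename` pattern is taken from the bundle; the visual-mesh path and link name are illustrative assumptions):

```xml
<?xml version="1.0"?>
<robot name="obj_NNNNNN">
  <link name="base_link">
    <visual>
      <geometry>
        <!-- assumed: visual mesh lives in the sibling meshes/ folder -->
        <mesh filename="../meshes/obj_NNNNNN.obj"/>
      </geometry>
    </visual>
    <collision>
      <geometry>
        <!-- this relative path is the pattern the bundle uses -->
        <mesh filename="../vhacd/obj_NNNNNN_vhacd.obj"/>
      </geometry>
    </collision>
  </link>
</robot>
```

Because both paths are relative to the `urdf/` folder, the whole method directory can be moved or copied without editing any file.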
|
|
## Running the benchmark
|
|
```bash
# Stage B — sample antipodal grasps + simulate, per (object, gripper) on the GT meshes
pixi run python scripts/02_grasp_sweep.py --dataset ycbv --mesh-source GT --n-grasps 5000

# Stage C, Condition 1 — Oracle / Oracle (ideal baseline)
pixi run python scripts/04_evaluate.py --dataset ycbv \
    --gt-mesh GT --est-mesh GT \
    --pose-csv FoundationPose.csv --gripper auto --workers 4 --resume --headless

# Stage C, Condition 3 — End-to-end realistic (BakedSDF mesh + BakedSDF-conditioned pose)
pixi run python scripts/04_evaluate.py --dataset ycbv \
    --gt-mesh BakedSDF --est-mesh BakedSDF \
    --pose-csv FoundationPose.csv --gripper auto --workers 4 --resume --headless
```
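To sweep the end-to-end condition over every method folder, a loop like this works (a sketch that reuses exactly the flags shown above and substitutes the mesh-source name; it is written as a dry run, so remove the leading `echo` to actually execute the commands):

```shell
# Print one Stage C command per mesh source; drop `echo` to run them
for src in GT BakedSDF MonoSDF Nerfacto Neuralangelo NGP RealCAP UniSurf VolSDF; do
  echo pixi run python scripts/04_evaluate.py --dataset ycbv \
    --gt-mesh "$src" --est-mesh "$src" \
    --pose-csv FoundationPose.csv --gripper auto --workers 4 --resume --headless
done
```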
|
|
If you'd rather regenerate the assets from scratch (e.g. to verify VHACD
parameters), ignore this bundle and run
`scripts/01_prepare_assets.py --dataset ycbv --all-mesh-sources`.
|
|