# NKSR Wrapper – Neural Kernel Surface Reconstruction

A clean, high-level Python wrapper around Neural Kernel Surface Reconstruction (NKSR) by Huang et al. (CVPR 2023, Highlight).

Drop a `.ply` or `.pcd` point cloud in → get a watertight, high-quality triangle mesh out.
## 📦 What is NKSR?
Neural Kernel Surface Reconstruction is a deep-learning method that turns a raw, sparse, and potentially noisy point cloud into a smooth, watertight 3D mesh. Unlike classical methods (Poisson, alpha shapes), it learns a continuous implicit surface from data; and unlike vanilla neural fields (NeRF, DeepSDF), it scales to millions of points and generalises across objects, rooms, and outdoor scenes without per-scene training.
### The three key innovations
| Innovation | Why it matters |
|---|---|
| Compactly-supported kernel functions | The implicit field is built from local kernel basis functions that have finite support. This makes the linear system sparse, so it can be solved with fast sparse PCG solvers instead of dense matrix inversion. Result: room-scale reconstruction in seconds. |
| Gradient-fitting solve | Instead of only fitting point positions (SDF ≈ 0), NKSR also fits surface normals as gradients of the field. This makes the reconstruction dramatically more robust to noise and outliers. |
| Minimal training, maximum generalisation | The model is trained once on a mixture of synthetic and real data (the "kitchen-sink" config) and then works out-of-the-box on new scans without any fine-tuning. |
## 🚀 Quick start

### 1. Install dependencies
NKSR itself contains custom CUDA kernels, so you need a working PyTorch + CUDA environment.
```bash
# 1. Clone the original NKSR repo and install it
#    (see https://github.com/nv-tlabs/NKSR for the latest instructions)
git clone https://github.com/nv-tlabs/NKSR.git
cd NKSR
pip install -r requirements.txt
pip install --no-build-isolation package/

# 2. Install this wrapper
pip install -e .
```
### 2. One-liner reconstruction
```python
from nksr_wrapper import NKSRMeshReconstructor, load_point_cloud, save_mesh

points, normals = load_point_cloud("scan.ply")
recon = NKSRMeshReconstructor(device="cuda:0")
mesh = recon.reconstruct(points, normals, detail_level=1.0)
save_mesh("mesh.ply", mesh.vertices, mesh.faces)
```
Or use the CLI:
```bash
python scripts/reconstruct.py scan.ply mesh.ply --detail 1.0 --mise-iter 1
```
## 🔬 How it works (the full pipeline)
If you want to understand what is happening under the hood, here is the step-by-step pipeline that NKSR executes every time you call `reconstruct()`.
### Step 0 – Input
You provide:
- `xyz` – (N, 3) point positions
- `normal` – (N, 3) oriented normals (optional but strongly recommended)
- `sensor` – (N, 3) sensor/camera positions (optional; used for normal orientation when normals are missing)
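For concreteness, here is a minimal NumPy sketch of inputs with the shapes listed above. The arrays are synthetic stand-ins, not real scan data:

```python
import numpy as np

# Hypothetical example arrays matching the (N, 3) shapes NKSR expects.
N = 1000
rng = np.random.default_rng(0)

xyz = rng.uniform(-1.0, 1.0, size=(N, 3))                # point positions
normal = rng.normal(size=(N, 3))                         # per-point normals
normal /= np.linalg.norm(normal, axis=1, keepdims=True)  # unit length
sensor = np.tile([0.0, 0.0, 2.0], (N, 1))                # one sensor pose per point

for name, arr in [("xyz", xyz), ("normal", normal), ("sensor", sensor)]:
    assert arr.shape == (N, 3), f"{name} must be (N, 3)"
```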
### Step 1 – Voxelisation (Sparse Feature Hierarchy)
The input points are splatted into a sparse voxel grid at multiple resolutions (a quad-/octree-like structure called the Sparse Feature Hierarchy, SVH). Instead of a dense 3D array, only occupied voxels are stored. This is what lets NKSR handle millions of points without exploding memory.
Key parameter: `voxel_size` (default ≈ 0.1 in the pretrained config). One `voxel_size` unit = one spatial unit in your point-cloud coordinate system.
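Conceptually, the base level of the hierarchy is just the set of occupied voxel coordinates. A minimal NumPy sketch of that idea (not the actual SVH implementation, which is a multi-resolution GPU structure):

```python
import numpy as np

def occupied_voxels(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Map each point to its integer voxel coordinate and keep unique voxels."""
    coords = np.floor(points / voxel_size).astype(np.int64)
    return np.unique(coords, axis=0)

pts = np.array([[0.02, 0.03, 0.01],
                [0.04, 0.01, 0.02],   # lands in the same voxel as point 0
                [0.17, 0.01, 0.02]])  # a different voxel along x
vox = occupied_voxels(pts, voxel_size=0.1)
# Only occupied voxels are stored -> 2 entries instead of a dense grid.
```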
### Step 2 – Feature encoding (PointNet → Sparse 3D U-Net)
- **PointEncoder** – A small PointNet-style ResNet processes the raw points inside each voxel and produces a 32-dim feature vector per voxel.
- **SparseStructureNet** – A sparse 3D convolutional U-Net with skip connections processes these voxel features across multiple scales. It also predicts an adaptive structure: if a region is empty, the network stops subdividing early, saving computation.
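The encoder itself is a trained network, but the core permutation-invariant "pool per voxel" idea behind PointNet-style encoders can be sketched in a few lines (illustrative only; the real PointEncoder applies learned MLPs before and after pooling):

```python
import numpy as np

def pointnet_pool(voxel_ids: np.ndarray, point_feats: np.ndarray) -> np.ndarray:
    """Max-pool per-point features into one feature vector per voxel
    (the symmetric-function idea used by PointNet-style encoders)."""
    n_voxels = voxel_ids.max() + 1
    pooled = np.full((n_voxels, point_feats.shape[1]), -np.inf)
    np.maximum.at(pooled, voxel_ids, point_feats)  # per-voxel elementwise max
    return pooled

feats = np.array([[1.0, 0.0], [3.0, 2.0], [0.0, 5.0]])
ids = np.array([0, 0, 1])           # first two points share voxel 0
pooled = pointnet_pool(ids, feats)  # -> [[3., 2.], [0., 5.]]
```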
### Step 3 – Geometry field (Kernel Field)
This is the heart of NKSR.
The network outputs kernel basis parameters at each voxel. At any 3D query point x, the implicit function is evaluated as a weighted sum of compact kernel functions centred on nearby voxels:
f(x) = Σ_i w_i · φ_i(x)
where φ_i is a compact kernel (e.g. Wendland or similar) and w_i are per-voxel weights, solved for in the next step. Because the kernels have finite support, the sum only involves neighbours within a small radius → a sparse linear system.
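A toy NumPy sketch of evaluating such a field, using the classic Wendland C2 kernel as a stand-in for NKSR's learned, data-dependent kernel:

```python
import numpy as np

def wendland_c2(q: np.ndarray) -> np.ndarray:
    """Compactly supported Wendland C2 kernel: identically zero for q >= 1."""
    q = np.clip(q, 0.0, 1.0)
    return (1.0 - q) ** 4 * (4.0 * q + 1.0)

def eval_field(x, centers, weights, radius):
    """f(x) = sum_i w_i * phi(|x - c_i| / radius).
    Compact support means only centers within `radius` contribute."""
    q = np.linalg.norm(centers - x, axis=1) / radius
    return float(np.sum(weights * wendland_c2(q)))

centers = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
w = np.array([1.0, 5.0])
val = eval_field(np.zeros(3), centers, w, radius=1.0)
# The far center (distance 10 > radius) contributes exactly zero -> val == 1.0
```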
### Step 4 – Sparse linear solve (PCG)
NKSR now solves for the weights w by fitting two constraints:
- **Position constraint:** f(x_j) ≈ 0 on every input point (the surface is the zero level-set).
- **Normal constraint:** ∇f(x_j) ≈ n_j on voxel centres (the gradient of the field matches the surface normal).
These constraints are assembled into a large but sparse linear system and solved with a preconditioned conjugate-gradient (PCG) solver. The normal constraint is the secret sauce: it anchors the gradient of the field, making the reconstruction much less sensitive to noise than methods that only fit positions.
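The solver itself is textbook preconditioned conjugate gradients. A minimal NumPy sketch with a Jacobi (diagonal) preconditioner, run on a tiny toy system standing in for the much larger kernel-fitting system NKSR assembles on the GPU:

```python
import numpy as np

def pcg(A_mul, b, M_inv, tol=1e-8, max_iter=200):
    """Jacobi-preconditioned conjugate gradients. A is only touched
    through the matvec A_mul, which is cheap when A is sparse."""
    x = np.zeros_like(b)
    r = b - A_mul(x)
    z = M_inv * r          # apply diagonal preconditioner
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A_mul(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy symmetric positive-definite system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
w = pcg(lambda v: A @ v, b, M_inv=1.0 / np.diag(A))
```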
### Step 5 – Mask / trimming field (optional)
A secondary field (either a learned unsigned distance field, UDF, or a simple layer field) identifies regions that are outside the true surface. This trims away spurious floaters and fills small holes, producing a clean watertight boundary.
### Step 6 – Mesh extraction (Dual Marching Cubes + MISE)
Finally, the zero level-set of the implicit field is turned into triangles:
- **Dual Marching Cubes (DMC)** is run on the dual graph of the sparse voxel hierarchy. DMC produces nicer topology than standard Marching Cubes (fewer skinny triangles, better sharp-feature preservation).
- **MISE (Multi-resolution IsoSurface Extraction)** adaptively subdivides cells that straddle the zero crossing. Each `mise_iter` doubles the effective resolution in those cells, giving you a crisp mesh without wasting polygons on empty space.
Result: `mesh.v` (V×3 vertices) and `mesh.f` (F×3 face indices).
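The MISE refinement idea can be sketched in 1D for clarity (the real algorithm operates on 3D cells of the sparse hierarchy): only intervals whose endpoints straddle the zero crossing survive each pass, and each pass halves them.

```python
import numpy as np

def mise_cells(f, lo, hi, base_res=4, iters=2):
    """MISE-style refinement sketch, 1D for clarity: keep only intervals
    whose endpoints bracket a zero crossing, then halve those intervals.
    Each iteration doubles resolution only where the surface actually is."""
    edges = np.linspace(lo, hi, base_res + 1)
    cells = list(zip(edges[:-1], edges[1:]))
    for _ in range(iters):
        refined = []
        for a, b in cells:
            if f(a) * f(b) <= 0:          # zero crossing inside -> subdivide
                m = 0.5 * (a + b)
                refined += [(a, m), (m, b)]
        cells = refined
    return cells

# f(x) = x - 0.3 has a single zero crossing at x = 0.3.
active = mise_cells(lambda x: x - 0.3, 0.0, 1.0, base_res=4, iters=2)
```

After two iterations only the cells hugging the crossing remain, each a quarter of the base cell width, which is exactly how MISE concentrates resolution near the surface.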
## 🧰 API Reference

### `NKSRMeshReconstructor`
```python
class NKSRMeshReconstructor(
    device="cuda:0",
    config="ks",
    chunk_tmp_device="cpu",
)
```
- `device` – PyTorch device (CUDA strongly recommended).
- `config` – Pretrained model name:
  - `"ks"` – Kitchen-sink (recommended default). Trained on a mixture of synthetic and real scans; generalises to objects, indoor rooms, and outdoor scenes.
  - `"snet"` – ShapeNet objects with normals.
  - `"snet-wonormal"` – ShapeNet objects without normals.
- `chunk_tmp_device` – Where to stash finished chunks when reconstructing huge scenes. `"cpu"` offloads to system RAM.
### `.reconstruct(...)`
```python
mesh = recon.reconstruct(
    points,                  # (N, 3) required
    normals=None,            # (N, 3) optional, strongly recommended
    sensor_positions=None,   # (N, 3) optional, helps orient normals
    colors=None,             # (N, 3) optional, for colored mesh output
    # Quality / resolution
    detail_level=1.0,        # 0.0 = smooth, 1.0 = max detail
    voxel_size=None,         # override resolution explicitly
    mise_iter=1,             # 0 = base, 1 = 2x in subdivided cells, 2 = 4x
    # Large-scene settings
    chunk_size=-1.0,         # >0 enables out-of-core chunking
    overlap_ratio=0.05,
    # Solver tuning
    solver_max_iter=2000,
    solver_tol=1e-5,
    approx_kernel_grad=False,
    # Normal estimation fallback
    estimate_normals_if_missing=True,
    normal_knn=64,
    normal_drop_threshold_deg=85.0,
)
```
Returns a `MeshResult` dataclass with:

- `.vertices` – (V, 3) float array
- `.faces` – (F, 3) int array
- `.vertex_colors` – (V, 3) float array, if `colors` was provided
- `.save(path)` – convenience method to write PLY/OBJ/GLB via Trimesh
## 📁 Repository layout
```text
nksr-wrapper/
├── nksr_wrapper/
│   ├── __init__.py               # public API
│   ├── reconstructor.py          # NKSRMeshReconstructor + MeshResult
│   └── io.py                     # load_point_cloud, save_mesh
├── scripts/
│   └── reconstruct.py            # CLI entry point
├── examples/
│   ├── quickstart.py             # minimal script
│   └── chunked_reconstruction.py # large-scene example
├── setup.py
├── requirements.txt
└── README.md
```
## 🖥️ CLI Usage
```bash
# Basic reconstruction
python scripts/reconstruct.py scan.ply mesh.ply --detail 1.0

# Large scene (chunked)
python scripts/reconstruct.py huge_scan.ply mesh.ply --chunk-size 50.0

# No normals in file -> estimate on-the-fly
python scripts/reconstruct.py scan.ply mesh.ply --estimate-normals

# With per-point colors -> colored mesh
python scripts/reconstruct.py scan.ply mesh.ply --colors colors.npy --mise-iter 2
```
## 🎯 Tips & Troubleshooting
| Problem | Solution |
|---|---|
| Mesh is too noisy / has spikes | Lower `detail_level` (try 0.3) or increase `voxel_size` |
| Mesh is too smooth / missing fine detail | Raise `detail_level` (try 1.0) or set `mise_iter=2` |
| Out-of-memory on large scans | Use `chunk_size=50.0` and `chunk_tmp_device="cpu"` |
| Mesh is inside-out | Normals are unoriented. Provide `sensor_positions` or pre-orient normals with Open3D |
| Reconstruction is very slow | You are probably on CPU. NKSR requires CUDA for the custom sparse kernels. |
| PLY file has no normals | Use `--estimate-normals` or pass `sensor_positions` to the reconstructor |
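The normal-estimation fallback in the last two rows boils down to PCA over local neighbourhoods plus sensor-based orientation. A pure-NumPy sketch of that idea (the wrapper's actual implementation may differ; Open3D's `estimate_normals` is the production-grade alternative):

```python
import numpy as np

def estimate_normals(points: np.ndarray, sensor: np.ndarray, k: int = 8) -> np.ndarray:
    """PCA normal per point from its k nearest neighbours, flipped so each
    normal faces the sensor position (resolves the sign ambiguity)."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    knn = np.argsort(d2, axis=1)[:, :k]      # brute-force k-NN (toy scale)
    normals = np.empty_like(points)
    for i, idx in enumerate(knn):
        nbrs = points[idx] - points[idx].mean(0)
        # Normal = right-singular vector of the smallest singular value.
        _, _, vt = np.linalg.svd(nbrs, full_matrices=False)
        n = vt[-1]
        if n @ (sensor - points[i]) < 0:     # orient toward the sensor
            n = -n
        normals[i] = n
    return normals

# Points on the z = 0 plane with the sensor above -> normals should be +z.
g = np.stack(np.meshgrid(np.arange(5.0), np.arange(5.0)), -1).reshape(-1, 2)
pts = np.c_[g, np.zeros(len(g))]
nrm = estimate_normals(pts, sensor=np.array([2.0, 2.0, 10.0]))
```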
## 📚 Citation
If you use NKSR in your research, please cite the original paper:
```bibtex
@inproceedings{huang2023nksr,
  title={Neural Kernel Surface Reconstruction},
  author={Huang, Jiahui and Gojcic, Zan and Atzmon, Matan and
          Litany, Or and Fidler, Sanja and Williams, Francis},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2023}
}
```
Original code: https://github.com/nv-tlabs/NKSR
Pretrained weights: https://huggingface.co/heiwang1997/nksr-checkpoints
## 📄 License
This wrapper is released under the MIT License. NKSR itself is under its own license (see the original repository).
Built with ❤️ on top of NVIDIA nv-tlabs' NKSR.