# anfera236/HHDC-2m
The cubes were generated with the HHDC-Creator pipeline (`hhdc/cube_generator.py`). The same repository provides a physics-based forward model (`hhdc/forward_model.py`) that emulates the Concurrent Artificially-intelligent Spectrometry and Adaptive Lidar System (CASALS), applying Gaussian beam aggregation, distance-based photon loss, and mixed Poisson + Gaussian noise to downsample and perturb the cube.

Load the data with `load_dataset("anfera236/HHDC", split=...)`. The available splits are `train`, `validation`, and `test` (see `dataset_info` for exact sizes).

Each sample in this Hugging Face dataset contains:
- `cube` — float32 array of shape `[128, 48, 48]` (`[bins, H, W]`), derived from NEON discrete-return LiDAR using the HHDC-Creator pipeline.
- `filename` — string (e.g. `0000/000001.npz`).

Additional fields produced by the HHDC-Creator pipeline (e.g. `x_centers`, `y_centers`, `bin_edges`, `footprint_counts`, `metadata`) are not stored in this HF dataset. They can be regenerated from NEON AOP LiDAR using the code in the HHDC-Creator repository.
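The `.npz` filenames suggest each cube originated as a NumPy archive. Below is a minimal sketch of round-tripping a cube through that format; the archive key name `cube` is an assumption for illustration, not something the dataset card confirms:

```python
import numpy as np

# Synthetic stand-in for one HHDC cube: [bins, H, W] = [128, 48, 48]
cube = np.random.default_rng(0).random((128, 48, 48)).astype(np.float32)

# Write a compressed archive, as the source filenames (e.g. 0000/000001.npz) imply
np.savez_compressed("sample_cube.npz", cube=cube)

# np.load returns a lazy NpzFile keyed by array name
with np.load("sample_cube.npz") as npz:
    restored = npz["cube"]

assert restored.shape == (128, 48, 48)
assert restored.dtype == np.float32
```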
With the default cube configuration (e.g. `cube_config_sample.json`: `cube_length = 96 m`, `footprint_separation = 2 m`):

- Clean target (`cube`): `[128, 48, 48]`.
- Low-resolution, noisy measurements are generated on the fly using the physics-based forward model (`LidarForwardImagingModel` in HHDC-Creator). For example, with `output_res_m=(3.0, 6.0)` the measurement has shape `[128, 32, 16]`.

Users are expected to use `cube` from this dataset as the clean target and to generate the degraded measurements themselves with the forward model. If you want to replicate our exact results, you can use the reference cube provided at `SampleCube/gt2.npz`.
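To give a feel for the on-the-fly degradation, here is a crude stand-in, not the actual `LidarForwardImagingModel`: area-style spatial downsampling followed by Poisson (shot) noise and additive Gaussian noise. The output grid and noise parameters are illustrative only:

```python
import torch
import torch.nn.functional as F

def toy_degrade(cube: torch.Tensor, out_hw=(32, 16),
                photon_scale=20.0, sigma=0.5) -> torch.Tensor:
    """Simplified stand-in for the CASALS forward model.

    cube: clean HHDC, [bins, H, W]; returns a noisy low-res cube [bins, *out_hw].
    """
    # Spatial aggregation: average-pool each height bin onto the coarser grid
    low = F.interpolate(cube.unsqueeze(0), size=out_hw, mode="area").squeeze(0)
    # Photon (shot) noise: Poisson on a scaled, non-negative intensity
    rate = low.clamp(min=0.0) * photon_scale / max(low.max().item(), 1e-6)
    shot = torch.poisson(rate)
    # Additive Gaussian detector noise
    return shot + sigma * torch.randn_like(shot)

clean = torch.rand(128, 48, 48)   # stand-in for a dataset cube
noisy = toy_degrade(clean)        # -> shape [128, 32, 16]
assert noisy.shape == (128, 32, 16)
```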
```python
from datasets import load_dataset
import torch

from hhdc.forward_model import LidarForwardImagingModel  # or your actual import path (check the scripts folder in this repo)

# Load the dataset and return cubes as torch tensors
ds = load_dataset("anfera236/HHDC", split="train")
ds.set_format(type="torch", columns=["cube"])

# Instantiate the LiDAR forward model (use your actual parameters)
forward_model = LidarForwardImagingModel(
    output_res_m=(3.0, 6.0),
    footprint_diameter_m=10.0,
    b=0.1,            # set to zero for no background photons
    eta=0.5,          # set to zero for no Gaussian noise
    ref_altitude=500.0,
    ref_photon_count=20.0,
)

sample = ds[0]

# High-res "clean" HHDC: [bins, H, W]
clean = sample["cube"]

# Low-res noisy measurement generated by the forward model: [bins, H_low, W_low]
noisy = forward_model(clean)

# Example: train a denoising/super-resolution model (my_model: noisy -> clean)
pred = my_model(noisy.unsqueeze(0))        # [1, bins, H, W] ideally
loss = loss_fn(pred, clean.unsqueeze(0))   # shapes must match
loss.backward()
```
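Canopy height can also be read directly off a cube by locating, per footprint, the highest bin with appreciable return energy. The toy version below assumes bin index 0 is the ground, height increasing with bin index, and a 96 m cube over 128 bins (0.75 m per bin); the reference implementation (`create_chm` in HHDC-Creator) may use a different orientation or normalization:

```python
import torch

def toy_chm(cube: torch.Tensor, cube_length_m=96.0, threshold=0.0) -> torch.Tensor:
    """Toy canopy height model from an HHDC cube [bins, H, W].

    Assumes bin 0 is the ground and height grows with bin index; the real
    orientation may differ (see hhdc.canopy_plots.create_chm).
    """
    bins = cube.shape[0]
    bin_height = cube_length_m / bins                   # 0.75 m with defaults
    occupied = cube > threshold                         # [bins, H, W] mask
    # Highest occupied bin index per footprint (deterministic via index weighting)
    top_bin = (occupied.float() * torch.arange(bins).view(-1, 1, 1)).amax(dim=0)
    chm = (top_bin + 1.0) * bin_height
    # Footprints with no returns at all get height 0
    chm[~occupied.any(dim=0)] = 0.0
    return chm                                          # [H, W], meters

cube = torch.zeros(128, 48, 48)
cube[0:20, :, :] = 5.0             # returns up to bin 19 -> 20 * 0.75 m
print(toy_chm(cube)[0, 0].item())  # -> 15.0
```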
Canopy height models can be derived from the cubes (see `hhdc.canopy_plots.create_chm` in the HHDC-Creator repo).

If you use this dataset, please cite:

```bibtex
@article{ramirez2024hyperheight,
  title={Hyperheight lidar compressive sampling and machine learning reconstruction of forested landscapes},
  author={Ramirez-Jaime, Andres and Pena-Pena, Karelia and Arce, Gonzalo R and Harding, David and Stephen, Mark and MacKinnon, James},
  journal={IEEE Transactions on Geoscience and Remote Sensing},
  volume={62},
  pages={1--16},
  year={2024},
  publisher={IEEE}
}
```
```bibtex
@article{ramirez2025super,
  title={Super-Resolved 3D Satellite Lidar Imaging of Earth Via Generative Diffusion Models},
  author={Ramirez-Jaime, Andres and Porras-Diaz, Nestor and Arce, Gonzalo R and Stephen, Mark},
  journal={IEEE Transactions on Geoscience and Remote Sensing},
  year={2025},
  publisher={IEEE}
}
```
```bibtex
@inproceedings{ramirez2025denoising,
  title={Denoising and Super-Resolution of Satellite Lidars Using Diffusion Generative Models},
  author={Ramirez-Jaime, Andres and Porras-Diaz, Nestor and Arce, Gonzalo R and Stephen, Mark},
  booktitle={2025 IEEE Statistical Signal Processing Workshop (SSP)},
  pages={1--5},
  year={2025},
  organization={IEEE}
}
```