
NucMM-M Dataset (Mouse)

3D neuronal nuclei instance segmentation in a mouse visual cortex micro-CT volume (the mouse volume of the NucMM benchmark).

Dataset Structure

├── Image/
│   ├── train/
│   │   ├── img_000_200_200.h5
│   │   ├── img_000_604_576.h5
│   │   ├── img_508_200_576.h5
│   │   └── img_508_604_200.h5
│   └── val/
│       ├── img_000_200_576.h5
│       ├── img_000_604_200.h5
│       ├── img_508_200_200.h5
│       └── img_508_604_576.h5
└── Label/
    ├── train/
    │   └── seg_*.h5
    └── val/
        └── seg_*.h5

Label Information

  • Original: Instance segmentation (each nucleus has a unique ID)
  • Classes: Background (0) + Nuclei instances (1, 2, 3, ...)

Related Datasets

  • NucMM-Z: the companion zebrafish electron microscopy volume released in the same NucMM paper

Usage

from huggingface_hub import hf_hub_download
import h5py

# Download a volume
local_path = hf_hub_download(
    repo_id="Angelou0516/NucMM-M",
    filename="Image/train/img_000_200_200.h5",
    repo_type="dataset"
)

# Load with h5py
with h5py.File(local_path, 'r') as f:
    volume = f[list(f.keys())[0]][:]
    print(volume.shape)  # (200, 200, 200)
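
The labels can be read the same way. A reusable sketch of the loading step above, assuming each `.h5` file stores a single dataset (the key name is taken from the file rather than hard-coded):

```python
import h5py
import numpy as np

def load_h5_volume(path: str) -> np.ndarray:
    # Read the first (and typically only) dataset stored in the HDF5 file.
    with h5py.File(path, "r") as f:
        return f[list(f.keys())[0]][:]
```

Apply it to a downloaded image file and its corresponding `Label/` file to get an (image, instance-label) pair of matching shape.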

Citation

If you use this dataset, please cite the NucMM paper:

@inproceedings{lin2021nucmm,
  title={NucMM Dataset: 3D Neuronal Nuclei Instance Segmentation at Sub-Cubic Millimeter Scale},
  author={Lin, Zudi and Wei, Donglai and others},
  booktitle={International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI)},
  year={2021}
}