
🚜 AgriGS-SLAM Dataset: Orchard Mapping Across Seasons [IEEE RA-L 2026]

IEEE RA-L · GitHub · ArXiv · Website

⚠️ PREVIEW DEMO – This is a preview demonstration of the AgriGS-SLAM dataset. The complete dataset will be released soon. For updates and project details, visit the project website.

This dataset accompanies the AgriGS-SLAM paper. For the source code and methodology, please visit the main repository.

Citation - IEEE Robotics and Automation Letters (RA-L 2026)

@article{usuelli2026agrigsslam,
    author={Usuelli, Mirko and Rapado-Rincon, David and Kootstra, Gert and Matteucci, Matteo},
    journal={IEEE Robotics and Automation Letters}, 
    title={AgriGS-SLAM: Orchard Mapping Across Seasons via Multi-View Gaussian Splatting SLAM}, 
    year={2026},
    volume={11},
    number={6},
    pages={7102-7109},
    keywords={Central Processing Unit;Feedback;Circuits;Electronic circuits;Location awareness;Protocols;Mobile communication;Pixel;Electronic mail;Communication systems;Agricultural automation;SLAM;sensor fusion;field robots;RGB-D perception},
    doi={10.1109/LRA.2026.3685453}}

🍎 Overview

The AgriGS-SLAM Dataset provides a comprehensive benchmark for visual–LiDAR SLAM research in agricultural orchard environments. Collected using a mobile field platform equipped with multi-view cameras, LiDAR sensors, and GPS/odometry, this dataset captures the unique challenges of orchard scenes including repetitive row geometry, seasonal appearance changes, and wind-driven foliage motion.

🌳 Dataset Description

This dataset was collected in apple and pear orchards across multiple growing seasons:

  • Dormancy phase (winter, minimal foliage)
  • Flowering phase (spring, blooms)
  • Harvesting phase (summer, fruit)

The standardized trajectory protocol enables evaluation of both training-view synthesis and novel-view geometry for robust assessment of SLAM and 3D reconstruction performance.

📊 Data Structure

├── train/                          # Training split
│   ├── groundtruth.csv             # SLAM trajectory ground truth
│   ├── groundtruth_cam_*.csv       # Per-camera pose ground truth
│   ├── groundtruth_lidar.csv       # LiDAR pose ground truth
│   ├── fixposition/
│   │   └── odometry/
│   │       ├── gps/                # GPS/RTK measurements
│   │       ├── poses/              # LiDAR-based odometry poses
│   │       └── twists/             # LiDAR-based odometry velocities
│   ├── ouster/
│   │   └── points/                 # LiDAR point clouds (3D)
│   └── zed_multi/
│       ├── cam_1/                  # Front-left camera
│       │   ├── depth/              # Ultra depth maps
│       │   ├── depth_anything/     # Zero-shot monocular depth estimates
│       │   ├── point_cloud/        # RGB-D point clouds
│       │   └── rgb/                # RGB images
│       ├── cam_2/                  # Front-center camera
│       │   └── [same as cam_1]
│       └── cam_3/                  # Front-right camera
│           └── [same as cam_1]
│
└── val/                            # Validation split (same structure as train/)
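
As a quick way to work with this layout, the sketch below pairs RGB and depth frames for one camera. It assumes that files in rgb/ and depth/ share a common filename stem (e.g. the capture timestamp); that naming convention is an assumption to verify against the released data.

```python
from pathlib import Path

def pair_frames(rgb_names, depth_names):
    """Pair RGB and depth files that share a filename stem.

    Assumes rgb/ and depth/ use matching stems (e.g. a capture
    timestamp) -- an assumed convention, not stated in the card.
    """
    rgb = {Path(n).stem: n for n in rgb_names}
    depth = {Path(n).stem: n for n in depth_names}
    # Keep only stems present in both folders, in sorted (temporal) order.
    return [(rgb[s], depth[s]) for s in sorted(rgb.keys() & depth.keys())]

# Example with hypothetical filenames:
pairs = pair_frames(["0001.png", "0002.png"], ["0002.png", "0003.png"])
```

In practice `rgb_names` and `depth_names` would come from listing `train/zed_multi/cam_1/rgb/` and the matching `depth/` directory.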

🔧 Sensor Suite

| Sensor | Model | Description | Modality |
|--------|-------|-------------|----------|
| Multi-View Cameras | ZED X | 3× RGB cameras with synchronized stereo pairs | Visual (RGB) |
| Depth Sensors | ZED X Ultra | Ultra depth sensing for stereo depth estimation | Visual (Depth) |
| Monocular Depth | Depth Anything v2 | Zero-shot monocular depth for all images | Visual (Depth) |
| LiDAR | Ouster OS0 | 32-channel mechanical LiDAR | Range (3D Points) |
| GPS/RTK | Fixposition | Real-time kinematic positioning | GNSS |
| Odometry | Fixposition | Dead-reckoning fusion | Proprioceptive |

📋 File Formats

Images

  • Format: PNG
  • Resolution: Camera-dependent (ZED X: 1920×1200 per view)
  • Location: zed_multi/cam_*/rgb/

Depth Data

  • Ultra Stereo Depth: 16-bit PNG depth maps (ZED X Ultra)
    • Location: zed_multi/cam_*/depth/
  • Monocular Depth (Depth Anything): Floating-point estimates
    • Location: zed_multi/cam_*/depth_anything/
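
A minimal sketch for converting the 16-bit depth PNGs to metric depth. The millimeter scale (0.001) and the use of 0 for invalid pixels are common conventions for 16-bit depth maps, not values confirmed by this card; check both against the released calibration data.

```python
import numpy as np

def depth_png_to_meters(depth_u16, scale=0.001, invalid=0):
    """Convert a 16-bit depth map to float32 meters.

    scale=0.001 assumes millimeter encoding and invalid=0 marks
    missing measurements; both are assumptions to verify.
    """
    depth_m = depth_u16.astype(np.float32) * scale
    depth_m[depth_u16 == invalid] = np.nan  # flag holes explicitly
    return depth_m

# 1000 raw units -> 1.0 m under the assumed millimeter encoding.
d = depth_png_to_meters(np.array([[1000, 0]], dtype=np.uint16))
```

The raw array would typically be read with a library that preserves 16-bit values, e.g. OpenCV with `cv2.IMREAD_UNCHANGED`.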

Point Clouds

  • Format: PCD (Point Cloud Data format)
  • Type: RGB-D fusion data
  • Location: zed_multi/cam_*/point_cloud/ and ouster/points/

Trajectories & Poses

  • Format: CSV
  • Columns (typical): [timestamp, tx, ty, tz, qx, qy, qz, qw] or velocity equivalents
  • Locations:
    • Ground truth: groundtruth.csv, groundtruth_cam_*.csv, groundtruth_lidar.csv
    • Odometry: fixposition/odometry/poses/, fixposition/odometry/twists/
    • LiDAR-based odometry derived from Ouster OS0
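
Given the [timestamp, tx, ty, tz, qx, qy, qz, qw] convention above, each row's orientation can be turned into a rotation matrix with the standard unit-quaternion formula (note the x, y, z, w ordering):

```python
import numpy as np

def quat_to_matrix(qx, qy, qz, qw):
    """Unit quaternion (x, y, z, w order, as in the CSV columns)
    to a 3x3 rotation matrix."""
    # Normalize defensively in case of rounding in the CSV.
    n = np.sqrt(qx * qx + qy * qy + qz * qz + qw * qw)
    qx, qy, qz, qw = qx / n, qy / n, qz / n, qw / n
    return np.array([
        [1 - 2 * (qy * qy + qz * qz), 2 * (qx * qy - qz * qw),     2 * (qx * qz + qy * qw)],
        [2 * (qx * qy + qz * qw),     1 - 2 * (qx * qx + qz * qz), 2 * (qy * qz - qx * qw)],
        [2 * (qx * qz - qy * qw),     2 * (qy * qz + qx * qw),     1 - 2 * (qx * qx + qy * qy)],
    ])
```

`scipy.spatial.transform.Rotation.from_quat` uses the same (x, y, z, w) ordering and is the usual choice when SciPy is available.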

GPS/RTK Measurements

  • Format: CSV
  • Location: fixposition/odometry/gps/

💾 Data Statistics

  • Training Sequences: Multiple trajectories in train/
  • Validation Sequences: Multiple trajectories in val/
  • Temporal Resolution:
    • Cameras: ~5 fps (ZED X synchronized capture)
    • LiDAR: 10 Hz (Ouster OS0 standard)
    • GPS/Odometry: 10 Hz (Fixposition VIO-RTK-GNSS)
  • Seasonal Coverage: Dormancy, Flowering, Harvesting phases
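
Because the cameras (~5 fps) and LiDAR (10 Hz) run at different rates, cross-sensor evaluation usually needs timestamp association. A nearest-neighbor sketch, with a rejection threshold set here to half a LiDAR period (an assumed value, to be tuned per sequence):

```python
import bisect

def nearest_sync(cam_ts, lidar_ts, max_dt=0.05):
    """Match each camera timestamp (s) to the nearest LiDAR timestamp.

    lidar_ts must be sorted; max_dt=0.05 s (half a 10 Hz period)
    is an assumed rejection threshold, not a dataset constant.
    """
    matches = []
    for t in cam_ts:
        i = bisect.bisect_left(lidar_ts, t)
        # The nearest neighbor is one of the two bracketing samples.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(lidar_ts)]
        j = min(candidates, key=lambda k: abs(lidar_ts[k] - t))
        if abs(lidar_ts[j] - t) <= max_dt:
            matches.append((t, lidar_ts[j]))
    return matches
```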

📚 Related Work

For context on SLAM and 3D Gaussian Splatting in outdoor environments, see the main AgriGS-SLAM repository.

👨‍🌾 Authors

  • Mirko Usuelli¹* and Matteo Matteucci¹
    Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, 20133 Milano, Italy
    {mirko.usuelli, matteo.matteucci}@polimi.it

  • David Rapado-Rincon² and Gert Kootstra²
    Agricultural Biosystems Engineering, Wageningen University & Research, 6708 PB Wageningen, The Netherlands
    {david.rapadorincon, gert.kootstra}@wur.nl

*Corresponding author

πŸ™ Acknowledgments

The authors thank the Fruit Research Center (FRC) in Randwijk for access to the orchards.

Mirko Usuelli's work was carried out within the Agritech National Research Center and funded by the EU NextGenerationEU programme (PNRR – M4C2, Inv. 1.4 – D.D. 1032 17/06/2022, CN00000022). This work reflects only the authors' views and opinions; neither the European Union nor the European Commission can be held responsible for them.

Contributions from Matteo Matteucci, Gert Kootstra, and David Rapado-Rincon were co-funded by the EU Digital Europe Programme (AgrifoodTEF, GA Nº 101100622).

πŸ“ License

This dataset is licensed under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.

You are free to:

  • Share and adapt the dataset
  • Use it for commercial and non-commercial purposes

Provided you:

  • Give appropriate credit to the authors
  • Link to the license
  • Indicate if changes were made

🔗 Resources & Links

🌐 Follow Us

AIRLab Website · LinkedIn · Instagram

πŸ›οΈ Funding & Institutions

Agritech Center Next Generation EU AgrifoodTEF Politecnico di Milano Wageningen University

