# AgriGS-SLAM Dataset: Orchard Mapping Across Seasons [IEEE RA-L 2026]

> ⚠️ PREVIEW DEMO – This is a preview demonstration of the AgriGS-SLAM dataset. The complete dataset will be released soon. For updates and project details, visit the project website.
This dataset accompanies the AgriGS-SLAM paper. For the source code and methodology, please visit the main repository.
## Citation – IEEE Robotics and Automation Letters (RA-L 2026)

    @article{usuelli2026agrigsslam,
      author={Usuelli, Mirko and Rapado-Rincon, David and Kootstra, Gert and Matteucci, Matteo},
      journal={IEEE Robotics and Automation Letters},
      title={AgriGS-SLAM: Orchard Mapping Across Seasons via Multi-View Gaussian Splatting SLAM},
      year={2026},
      volume={11},
      number={6},
      pages={7102-7109},
      keywords={Central Processing Unit;Feedback;Circuits;Electronic circuits;Location awareness;Protocols;Mobile communication;Pixel;Electronic mail;Communication systems;Agricultural automation;SLAM;sensor fusion;field robots;RGB-D perception},
      doi={10.1109/LRA.2026.3685453}
    }
## Overview

The AgriGS-SLAM Dataset provides a comprehensive benchmark for visual–LiDAR SLAM research in agricultural orchard environments. Collected using a mobile field platform equipped with multi-view cameras, LiDAR sensors, and GPS/odometry, this dataset captures the unique challenges of orchard scenes, including repetitive row geometry, seasonal appearance changes, and wind-driven foliage motion.
## Dataset Description

This dataset was collected in apple and pear orchards across multiple growing seasons:
- Dormancy phase (winter, minimal foliage)
- Flowering phase (spring, blooms)
- Harvesting phase (summer, fruit)
The standardized trajectory protocol enables evaluation of both training-view synthesis and novel-view geometry for robust assessment of SLAM and 3D reconstruction performance.
## Data Structure

    ├── train/                           # Training split
    │   ├── groundtruth.csv              # SLAM trajectory ground truth
    │   ├── groundtruth_cam_*.csv        # Per-camera pose ground truth
    │   ├── groundtruth_lidar.csv        # LiDAR pose ground truth
    │   ├── fixposition/
    │   │   └── odometry/
    │   │       ├── gps/                 # GPS/RTK measurements
    │   │       ├── poses/               # LiDAR-based odometry poses
    │   │       └── twists/              # LiDAR-based odometry velocities
    │   ├── ouster/
    │   │   └── points/                  # LiDAR point clouds (3D)
    │   └── zed_multi/
    │       ├── cam_1/                   # Front-left camera
    │       │   ├── depth/               # Ultra depth maps
    │       │   ├── depth_anything/      # Zero-shot monocular depth estimates
    │       │   ├── point_cloud/         # RGB-D point clouds
    │       │   └── rgb/                 # RGB images
    │       ├── cam_2/                   # Front-center camera
    │       │   └── [same as cam_1]
    │       └── cam_3/                   # Front-right camera
    │           └── [same as cam_1]
    │
    └── val/                             # Validation split (same structure as train/)
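Once a split is unpacked, a camera folder can be traversed with nothing beyond the Python standard library. The sketch below pairs each RGB frame with its Ultra depth map by matching filename stems; the exact naming scheme (e.g. timestamp-based stems) is an assumption until the full release, so treat `pair_frames` as illustrative.

```python
from pathlib import Path


def pair_frames(cam_dir: str) -> list[tuple[Path, Path]]:
    """Return (rgb, depth) path pairs whose filename stems match."""
    cam = Path(cam_dir)
    # Index depth maps by stem so each RGB frame is a dictionary lookup.
    depth_by_stem = {p.stem: p for p in (cam / "depth").glob("*.png")}
    pairs = []
    for rgb in sorted((cam / "rgb").glob("*.png")):
        if rgb.stem in depth_by_stem:
            pairs.append((rgb, depth_by_stem[rgb.stem]))
    return pairs


pairs = pair_frames("train/zed_multi/cam_1")  # empty if the split is not downloaded yet
print(f"{len(pairs)} aligned RGB-D frames")
```

The same pattern extends to `depth_anything/` and `point_cloud/` by swapping the subfolder name.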
## Sensor Suite

| Sensor | Model | Description | Modality |
|---|---|---|---|
| Multi-View Cameras | ZED X | 3× RGB cameras with synchronized stereo pairs | Visual (RGB) |
| Depth Sensors | ZED X Ultra | Ultra depth sensing for stereo depth estimation | Visual (Depth) |
| Monocular Depth | Depth Anything v2 | Zero-shot monocular depth for all images | Visual (Depth) |
| LiDAR | Ouster OS0 | 32-channel mechanical LiDAR | Range (3D Points) |
| GPS/RTK | FixPosition | Real-time kinematic positioning | GNSS |
| Odometry | FixPosition | Dead reckoning fusion | Proprioceptive |
## File Formats

### Images
- Format: PNG
- Resolution: Camera-dependent (ZED X: 1920×1200 per view)
- Location: `zed_multi/cam_*/rgb/`

### Depth Data
- Ultra Stereo Depth: 16-bit PNG depth maps (ZED X Ultra)
  - Location: `zed_multi/cam_*/depth/`
- Monocular Depth (Depth Anything): floating-point estimates
  - Location: `zed_multi/cam_*/depth_anything/`
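A 16-bit depth PNG stores integer codes, not meters. A minimal conversion sketch is shown below; the millimeter scale and the zero sentinel for invalid pixels follow the common ZED SDK convention but are assumptions to verify against the full release.

```python
import numpy as np

DEPTH_SCALE = 0.001  # assumed: one 16-bit unit = 1 mm, so this maps to meters
INVALID = 0          # assumed sentinel value for pixels with no depth


def depth_to_meters(depth_u16: np.ndarray) -> np.ndarray:
    """Convert a raw 16-bit depth map to float32 meters, NaN where invalid."""
    depth_m = depth_u16.astype(np.float32) * DEPTH_SCALE
    depth_m[depth_u16 == INVALID] = np.nan  # mask out missing measurements
    return depth_m


raw = np.array([[0, 1500], [3000, 65535]], dtype=np.uint16)
print(depth_to_meters(raw))
```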
### Point Clouds
- Format: PCD (Point Cloud Data format)
- Type: RGB-D fusion data
- Locations: `zed_multi/cam_*/point_cloud/` and `ouster/points/`
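PCD files carry a small plain-text header (`FIELDS`, `WIDTH`, `DATA`, ...) followed by the points. As a format illustration, here is a minimal reader for the `DATA ascii` case only; the released files may well be binary PCD, for which a full reader (e.g. Open3D) is the practical choice.

```python
def read_ascii_pcd(path: str):
    """Parse an ASCII PCD file into (field names, list of point tuples)."""
    fields, points = [], []
    in_data = False
    with open(path) as f:
        for line in f:
            if in_data:
                if line.strip():  # skip blank lines in the data section
                    points.append(tuple(float(v) for v in line.split()))
            elif line.startswith("FIELDS"):
                fields = line.split()[1:]  # e.g. ["x", "y", "z"]
            elif line.startswith("DATA"):
                if line.split()[1] != "ascii":
                    raise ValueError("only DATA ascii is handled in this sketch")
                in_data = True
    return fields, points
```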
### Trajectories & Poses
- Format: CSV
- Columns (typical): `[timestamp, tx, ty, tz, qx, qy, qz, qw]`, or velocity equivalents for twists
- Locations:
  - Ground truth: `groundtruth.csv`, `groundtruth_cam_*.csv`, `groundtruth_lidar.csv`
  - Odometry: `fixposition/odometry/poses/`, `fixposition/odometry/twists/` (LiDAR-based odometry derived from the Ouster OS0)
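For evaluation it is usually convenient to turn each pose row into a 4×4 homogeneous transform. A sketch under the column order listed above (Hamilton quaternion convention assumed; confirm against the released CSV headers):

```python
import numpy as np


def pose_row_to_matrix(row):
    """[timestamp, tx, ty, tz, qx, qy, qz, qw] -> 4x4 homogeneous transform."""
    _, tx, ty, tz, qx, qy, qz, qw = row
    # Normalize to guard against rounding in the stored quaternion.
    n = np.sqrt(qx * qx + qy * qy + qz * qz + qw * qw)
    qx, qy, qz, qw = qx / n, qy / n, qz / n, qw / n
    # Standard quaternion-to-rotation-matrix expansion.
    R = np.array([
        [1 - 2 * (qy * qy + qz * qz), 2 * (qx * qy - qz * qw),     2 * (qx * qz + qy * qw)],
        [2 * (qx * qy + qz * qw),     1 - 2 * (qx * qx + qz * qz), 2 * (qy * qz - qx * qw)],
        [2 * (qx * qz - qy * qw),     2 * (qy * qz + qx * qw),     1 - 2 * (qx * qx + qy * qy)],
    ])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = [tx, ty, tz]
    return T
```

With the transforms in hand, relative poses for trajectory-error metrics are just `np.linalg.inv(T1) @ T2`.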
### GPS/RTK Measurements
- Format: CSV
- Location: `fixposition/odometry/gps/`
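To compare GNSS fixes with SLAM trajectories, latitude/longitude must be projected into a local metric frame. Over an orchard-sized area a flat-earth (equirectangular) approximation around the first fix is adequate; the sketch below is that approximation, not a full geodetic conversion.

```python
import math

EARTH_RADIUS = 6378137.0  # WGS84 equatorial radius in meters


def gps_to_local(lat: float, lon: float, lat0: float, lon0: float):
    """Project (lat, lon) in degrees to local east/north meters around (lat0, lon0)."""
    x = math.radians(lon - lon0) * EARTH_RADIUS * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS
    return x, y
```

For sub-centimeter consistency with RTK data, a proper ENU conversion (e.g. via `pymap3d`) would replace this approximation.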
## Data Statistics
- Training Sequences: Multiple trajectories in train/
- Validation Sequences: Multiple trajectories in val/
- Temporal Resolution:
- Cameras: ~5 fps (ZED X synchronized capture)
- LiDAR: 10 Hz (Ouster OS0 standard)
- GPS/Odometry: 10 Hz (Fixposition VIO-RTK-GNSS)
- Seasonal Coverage: Dormancy, Flowering, Harvesting phases
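Because the cameras run at ~5 fps while LiDAR and GPS/odometry run at 10 Hz, cross-sensor use of the data requires timestamp association. One common approach, sketched here, is nearest-neighbor matching with a tolerance gate; the timestamp unit and the 50 ms tolerance are illustrative assumptions.

```python
import bisect


def match_nearest(cam_ts, lidar_ts, tol=0.05):
    """Map each camera timestamp to the nearest LiDAR timestamp within tol seconds."""
    lidar_ts = sorted(lidar_ts)
    matches = {}
    for t in cam_ts:
        i = bisect.bisect_left(lidar_ts, t)
        # Only the neighbors straddling t can be nearest in a sorted list.
        candidates = lidar_ts[max(0, i - 1):i + 1]
        best = min(candidates, key=lambda s: abs(s - t), default=None)
        if best is not None and abs(best - t) <= tol:
            matches[t] = best
    return matches


cam = [0.0, 0.2, 0.4]               # ~5 fps camera stream
lidar = [0.0, 0.1, 0.2, 0.3, 0.4]   # 10 Hz LiDAR stream
print(match_nearest(cam, lidar))
```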
## Related Work
For context on SLAM and 3D Gaussian Splatting in outdoor environments, see the main AgriGS-SLAM repository.
## Authors

Mirko Usuelli¹* and Matteo Matteucci¹
Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, 20133 Milano, Italy
{mirko.usuelli, matteo.matteucci}@polimi.it

David Rapado-Rincon² and Gert Kootstra²
Agricultural Biosystems Engineering, Wageningen University & Research, 6708 PB Wageningen, The Netherlands
{david.rapadorincon, gert.kootstra}@wur.nl

*Corresponding author
## Acknowledgments

The authors thank the Fruit Research Center (FRC) in Randwijk for access to the orchards.

Mirko Usuelli's work was carried out within the Agritech National Research Center and funded by the EU Next-GenerationEU (PNRR – M4C2, Inv. 1.4 – D.D. 1032 17/06/2022, CN00000022). This work reflects only the authors' views; the EU and Commission are not responsible.

Contributions from Matteo Matteucci, Gert Kootstra, and David Rapado-Rincon were co-funded by the EU Digital Europe Programme (AgrifoodTEF, GA Nº 101100622).
## License
This dataset is licensed under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.
You are free to:
- Share and adapt the dataset
- Use it for commercial and non-commercial purposes
Provided you:
- Give appropriate credit to the authors
- Link to the license
- Indicate if changes were made