---
license: mit
tags:
- Tactile Sensing
- Vibro-Acoustic Sensing
- Contact Localization
- Deep Learning
pretty_name: Vibro-Sense Dataset
size_categories:
- n>1T
---
# Vibro-Sense: Robust Vibration-based Impulse Response Localization and Trajectory Tracking for Robotic Hands
### The paper page can be found [here](https://wzaielamri.github.io/publication/vibrosense).
### The GitHub repository can be found [here](https://github.com/wzaielamri/vibrosense).


This repository provides the full dataset for the Vibro-Sense project, enabling robust, affordable, and scalable contact perception for robotic hands using vibro-acoustic sensing. Our approach leverages seven low-cost piezoelectric microphones and an Audio Spectrogram Transformer to decode vibrational signatures generated during physical interaction. The system achieves high-accuracy whole-body touch localization, robust trajectory tracking, and material-aware perception, and is resilient to robot motion.

**Note:** The dataset is generated and stored using the [webdataset](https://github.com/webdataset/webdataset) format for efficient large-scale data handling and streaming.
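In the webdataset format, each shard is a plain POSIX tar file, and the files belonging to one sample share a common key prefix and differ only in extension. As a minimal sketch of that layout (file names, keys, and the `.json` field here are illustrative, not the actual shard contents of this dataset), using only the Python standard library:

```python
import io
import json
import tarfile

# Write a tiny webdataset-style shard: each sample is a group of files
# sharing the same key prefix, e.g. "sample000.json", "sample000.wav", ...
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    for key in ("sample000", "sample001"):
        meta = json.dumps({"label": key}).encode()
        info = tarfile.TarInfo(name=f"{key}.json")
        info.size = len(meta)
        tar.addfile(info, io.BytesIO(meta))

# Read the shard back and regroup members into samples by key prefix,
# which is essentially what the webdataset reader does when streaming.
buf.seek(0)
samples = {}
with tarfile.open(fileobj=buf, mode="r") as tar:
    for member in tar.getmembers():
        key, ext = member.name.rsplit(".", 1)
        samples.setdefault(key, {})[ext] = tar.extractfile(member).read()

print(sorted(samples))  # → ['sample000', 'sample001']
```

In practice, the [webdataset](https://github.com/webdataset/webdataset) library handles this grouping, shard expansion, and streaming for you; the sketch only shows the on-disk convention.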

## Dataset Structure

The dataset is organized into several main directories:

- `localisation/`: Contains data for touch localization and material detection tasks, including filtered and processed signals for different materials (soft-plastic (cap), metal, plastic, wood).
- `quickdraw/`: Contains data for the trajectory-following task, with both fixed-hand and moving-hand conditions and different materials.
- `quickdraw_testset/`: Test set for trajectory following, organized similarly to `quickdraw/`.
- `objects/`: Contains object classification data collected with three microphones.

## Tasks

This dataset supports four main tasks:

1. **Touch Localization**
  - Predict the contact location on the robotic hand using vibrational signals.
  - Data: `localisation/filtered/` and `localisation/processed_*` directories.

2. **Trajectory Following (Quickdraw)**
  - Track and reconstruct the trajectory of contact points during dynamic interactions.
  - Data: `quickdraw/` and `quickdraw_testset/` directories.

3. **Object Classification**
  - Classify the object being touched based on the vibrational response.
  - Data: `objects/dataset/`.

4. **Material Detection (using Localisation Dataset)**
  - Identify the material of the contact surface (e.g., metal, plastic, wood, soft-plastic (cap)) using the localisation dataset.
  - Data: `localisation/filtered/` and `localisation/processed_*` directories.

## Getting Started

All datasets are provided in processed form, ready for training and evaluation. Please refer to the [paper](https://wzaielamri.github.io/publication/vibrosense) and [code repository](https://github.com/wzaielamri/vibrosense) for details on model training and evaluation.

## Citation

```bibtex
@InProceedings{ZaiElAmri2026VibroSense,
  author    = {Zai El Amri, Wadhah and {Navarro-Guerrero}, Nicol{\'a}s},
  title     = {Vibro-Sense: Robust Vibration-based Impulse Response Localization and Trajectory Tracking for Robotic Hands},
  booktitle = {ArXiv Preprint arXiv:2601.20555, Submitted to Autonomous Robots Springer Journal},
  year      = {2026},
}
```