VadExylos committed · verified
Commit bd56113 · 1 Parent(s): a8b8df3

Update README.md

Files changed (1): README.md (+111, -3)
README.md CHANGED
@@ -1,3 +1,111 @@
- ---
- license: mit
- ---
---
language:
- en
license: cc-by-nc-4.0
pretty_name: Exylos Pick & Place Sample
task_categories:
- robotics
size_categories:
- n<1K
tags:
- lerobot
- robot-learning
- imitation-learning
- manipulation
- pick-and-place
- multi-view
- phase-annotations
- failure-recovery
- panda
---
# Exylos Pick & Place Sample

## Dataset Summary

Exylos Pick & Place Sample is a compact multi-view robotics dataset in LeRobot-style structure for a single manipulation task: pick up an object and place it into a container.

This public release is intended as an inspection-friendly sample of the dataset format, modalities, and annotation schema rather than a large-scale benchmark.
### Highlights

- 50 episodes
- 1 task
- 5 synchronized RGB camera views per episode
- 30 FPS H.264 videos
- 9D robot state
- 9D action vectors
- episode-level outcome metadata
- phase-level annotations
- success, failure, and recovery-rich trajectories
- LeRobot-style organization for downstream policy learning and evaluation
## Task

The sample covers one manipulation task:

**Pick up an object from the workspace and place it into a container.**
## What is included

Each episode combines:

- robot state trajectories
- action trajectories
- synchronized multi-view RGB videos
- episode-level success and failure metadata
- phase-level annotations
- derived quality metrics
### Camera views

The dataset includes five RGB video streams:

- `observation.images.wrist_cam`
- `observation.images.front_cam`
- `observation.images.left_cam`
- `observation.images.top_cam`
- `observation.images.right_cam`
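Each camera key corresponds to one video subdirectory under `videos/`. A minimal sketch of that mapping, assuming the key prefix shown above; the `camera_dir` helper is illustrative and not part of any dataset tooling:

```python
# The five camera feature keys listed above.
CAMERA_KEYS = [
    "observation.images.wrist_cam",
    "observation.images.front_cam",
    "observation.images.left_cam",
    "observation.images.top_cam",
    "observation.images.right_cam",
]

def camera_dir(key: str) -> str:
    """Map a camera feature key to its video subdirectory name,
    e.g. 'observation.images.wrist_cam' -> 'wrist_cam'.
    Hypothetical helper based on the videos/ layout of this repo."""
    prefix = "observation.images."
    if not key.startswith(prefix):
        raise ValueError(f"not a camera key: {key}")
    return key[len(prefix):]
```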
### Core trajectory fields

Main fields include:

- `observation.state`
- `action`
- `timestamp`
- `frame_index`
- `episode_index`
- `task_index`
- `next.done`
- `next.success`
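The field list above can double as a sanity check when reading an episode table. A minimal sketch with pandas; the `load_episode` helper and `EPISODE_COLUMNS` constant are illustrative assumptions, including the assumption that each parquet file stores one row per frame:

```python
import pandas as pd

# Core trajectory fields from the list above.
EPISODE_COLUMNS = [
    "observation.state", "action", "timestamp", "frame_index",
    "episode_index", "task_index", "next.done", "next.success",
]

def load_episode(parquet_path: str) -> pd.DataFrame:
    """Load one episode parquet and verify the expected fields are present.
    Hypothetical helper; assumes one row per frame, ordered by frame_index."""
    df = pd.read_parquet(parquet_path)
    missing = [c for c in EPISODE_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return df.sort_values("frame_index").reset_index(drop=True)
```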
## Repository Structure

```text
README.md
LICENSE
info.json
annotations.json
tasks.jsonl
episodes.jsonl
episodes_stats.jsonl
data/
  chunk-000/
    episode_000000.parquet
    episode_000001.parquet
    ...
videos/
  chunk-000/
    wrist_cam/
      episode_000000.mp4
      episode_000001.mp4
      ...
    front_cam/
      ...
    left_cam/
      ...
    top_cam/
      ...
    right_cam/
      ...
```
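Given the layout above, per-episode file paths can be derived from the episode index alone. A minimal sketch under the assumption that the zero-padded naming shown in the tree (`chunk-000`, `episode_000000`) holds throughout; the `episode_paths` helper is hypothetical:

```python
from pathlib import Path

def episode_paths(root, episode_index, cameras, chunk=0):
    """Build the expected parquet and per-camera video paths for one episode,
    following the data/ and videos/ layout shown above (hypothetical helper)."""
    chunk_dir = f"chunk-{chunk:03d}"          # e.g. "chunk-000"
    stem = f"episode_{episode_index:06d}"     # e.g. "episode_000003"
    root = Path(root)
    parquet = root / "data" / chunk_dir / f"{stem}.parquet"
    videos = {
        cam: root / "videos" / chunk_dir / cam / f"{stem}.mp4"
        for cam in cameras
    }
    return parquet, videos
```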