arxiv:2006.10214
MediaPipe Hands: On-device Real-time Hand Tracking
Published on Jun 18, 2020
Abstract
We present a real-time on-device hand tracking pipeline that predicts a hand skeleton from a single RGB camera for AR/VR applications. The pipeline consists of two models: 1) a palm detector and 2) a hand landmark model. It is implemented via MediaPipe, a framework for building cross-platform ML solutions. The proposed model and pipeline architecture demonstrate real-time inference speed on mobile GPUs and high prediction quality. MediaPipe Hands is open sourced at https://mediapipe.dev.
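The two-model handoff described above can be sketched in a few lines. The stub functions and dummy values below are illustrative assumptions, not the actual MediaPipe API: the real pipeline uses a palm detector and a 21-keypoint hand landmark regressor, and runs the (more expensive) detector only when tracking is lost.

```python
# Hypothetical stand-ins for the paper's two models.
def palm_detector(frame):
    """Return an oriented palm region of interest, or None if no hand."""
    return {"cx": 0.5, "cy": 0.5, "size": 0.3}  # dummy detection

def hand_landmark_model(frame, roi):
    """Return 21 (x, y) hand landmarks and a hand-presence confidence."""
    landmarks = [(roi["cx"], roi["cy"])] * 21  # dummy keypoints
    return landmarks, 0.9

def track_hands(frames, presence_threshold=0.5):
    """Two-stage pipeline sketch: invoke the palm detector only when
    no valid region is being tracked; otherwise feed the landmark model
    the region carried over from the previous frame."""
    roi = None
    results = []
    for frame in frames:
        if roi is None:
            roi = palm_detector(frame)
        if roi is None:
            results.append(None)  # no hand found this frame
            continue
        landmarks, confidence = hand_landmark_model(frame, roi)
        if confidence < presence_threshold:
            roi = None  # tracking lost; re-run the detector next frame
            results.append(None)
        else:
            # The real pipeline derives the next frame's region from the
            # current landmarks; this stub simply reuses the same region.
            results.append(landmarks)
    return results

out = track_hands(["frame0", "frame1", "frame2"])
print(len(out), len(out[0]))  # 3 frames, 21 landmarks each
```

Running the detector sparsely like this is what makes the pipeline real-time on mobile GPUs: most frames only pay for the landmark model.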