Paper ID | 3D-4.8
Paper Title | REAL-TIME 3D HAND-OBJECT POSE ESTIMATION FOR MOBILE DEVICES
Authors | Yue Yin, Chris McCarthy, Dana Rezazadegan, Swinburne University of Technology, Australia
Session | 3D-4: 3D Image and Video Processing
Location | Area J
Session Time | Tuesday, 21 September, 13:30 - 15:00
Presentation Time | Tuesday, 21 September, 13:30 - 15:00
Presentation | Poster
Topic | Three-Dimensional Image and Video Processing: Image and video processing for augmented and virtual reality
Abstract | Interest in 3D hand pose estimation is rapidly growing, offering the potential for real-time hand gesture recognition in a range of interactive VR/AR applications and beyond. Most current 3D hand pose estimation models rely on dedicated depth-sensing cameras and/or specialised hardware support to handle both the high computation and memory requirements. However, these requirements hinder the practical application of such models on mobile devices or in other embedded computing contexts. To address this, we propose a lightweight model for hand and object pose estimation specifically targeting mobile applications. Using RGB images only, we show how our approach achieves real-time performance, comparable accuracy, and an 81% model size reduction compared with state-of-the-art methods, thereby supporting the feasibility of the model for deployment on mobile platforms.
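For context, the sketch below illustrates how an RGB-only, lightweight pose-estimation model of this kind might be run on-device once exported to TensorFlow Lite. The model file name (hand_object_pose.tflite), the 224x224 input resolution, and the output layout (21 hand keypoints in 3D) are illustrative assumptions, not details released with this paper.

```python
# Minimal on-device inference sketch for a lightweight RGB hand-object pose model.
# NOTE: the model file, input size, and output layout below are hypothetical
# placeholders chosen for illustration, not artefacts from this paper.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="hand_object_pose.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def estimate_pose(rgb_frame: np.ndarray) -> np.ndarray:
    """Run one RGB frame through the interpreter and return raw 3D keypoints."""
    # Assume the model expects a normalised float32 batch of shape (1, 224, 224, 3).
    frame = rgb_frame.astype(np.float32) / 255.0
    interpreter.set_tensor(input_details[0]["index"], frame[np.newaxis, ...])
    interpreter.invoke()
    # Assume the first output holds the hand keypoints, e.g. shape (1, 21, 3).
    return interpreter.get_tensor(output_details[0]["index"])[0]
```

Running a model this way keeps the entire pipeline on the device's CPU (or an optional delegate such as the GPU or NNAPI), which is the deployment setting the paper's real-time and model-size claims target.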