
Unsupervised Detection of Dynamic Hand Gestures from Leap Motion Data

Abstract: The effective and reliable detection and classification of dynamic hand gestures is a key element in building Natural User Interfaces, i.e. systems that allow users to interact through free body movements instead of traditional mechanical tools. However, methods that temporally segment and classify dynamic gestures usually rely on a large amount of labeled data, including annotations of both the class and the temporal segmentation of each gesture. In this paper, we propose an unsupervised approach to train a Transformer-based architecture that learns to detect dynamic hand gestures in a continuous temporal sequence. The input data consist of the 3D positions of the hand joints, along with their speed and acceleration, collected through a Leap Motion device. Experimental results show promising accuracy on both the detection and the classification tasks while requiring only limited computational power, confirming that the proposed method is suitable for real-world applications.
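The abstract describes the input representation as per-joint 3D positions augmented with their speed and acceleration. The sketch below (not the authors' code) illustrates one plausible way to derive such features from a Leap Motion joint-position sequence using finite differences; the joint count, array layout, frame rate, and differencing scheme are assumptions for illustration only.

```python
# Minimal sketch (assumed implementation, not the authors' code): building
# per-frame input features -- 3D joint positions plus speed and acceleration --
# from a Leap Motion joint-position sequence.
import numpy as np

def build_features(positions: np.ndarray, fps: float = 60.0) -> np.ndarray:
    """positions: (T, J, 3) array of 3D joint positions over T frames.
    Returns a (T, J * 9) feature matrix: position, velocity, acceleration per joint."""
    dt = 1.0 / fps
    # First-order finite difference for velocity; repeat the first frame
    # so the feature sequence keeps the original length T.
    velocity = np.diff(positions, axis=0, prepend=positions[:1]) / dt
    # Difference of velocities for acceleration.
    acceleration = np.diff(velocity, axis=0, prepend=velocity[:1]) / dt
    features = np.concatenate([positions, velocity, acceleration], axis=-1)  # (T, J, 9)
    return features.reshape(features.shape[0], -1)  # flatten joints per frame

if __name__ == "__main__":
    # Example: 100 frames of 21 hypothetical hand joints.
    seq = np.random.randn(100, 21, 3).astype(np.float32)
    feats = build_features(seq)
    print(feats.shape)  # (100, 189)
```

Such a per-frame feature matrix could then be fed, frame by frame, to a Transformer-based temporal model of the kind the paper proposes.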


Citation:

D'Eusanio, A.; Pini, S.; Borghi, G.; Simoni, A.; Vezzani, R. "Unsupervised Detection of Dynamic Hand Gestures from Leap Motion Data", Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13231, pp. 414-424, 2022. DOI: 10.1007/978-3-031-06427-2_35
