VRGestures: Controller and Hand Gesture Datasets for Virtual Reality

Authors
G. T. Papadopoulos
A. Doumanoglou
D. Zarpalas
Year
2023
Venue
CGI2023, Shanghai, China

Abstract

Gesture recognition has attracted increasing attention over the years and has been adopted in many applications, both in the real world and in virtual environments. New-generation Virtual Reality (VR) headsets such as the Meta Quest 2 support hand tracking very efficiently and challenge the research community to pursue further breakthroughs in hand gesture recognition. VR controllers have also quietly improved, becoming wireless and more practical to use. However, when it comes to VR gesture datasets, and especially controller gesture datasets, available data are limited. Point-and-click interaction is widely accepted, and this, combined with the shortage of available datasets, has left gestures relatively neglected. To address this gap we provide two datasets, one with controller gestures and one with hand gestures, including recordings made with a single controller or hand as well as with both hands simultaneously. We created two VR applications that record, for controllers and hands, the position, the orientation, and the timestamp of every sample. We then trained off-the-shelf time series classifiers to evaluate the data, report metrics, and compare different subsets of the datasets against each other. Hand gesture recognition is far more complicated than controller gesture recognition, as it involves almost three times as much input; this difference is analyzed and discussed with findings and metrics. The datasets are available online: https://doi.org/10.5281/zenodo.8027807
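
To illustrate the kind of pipeline the abstract describes (recorded position/orientation trajectories fed to an off-the-shelf time-series classifier), the following is a minimal sketch. The directory layout, CSV column names, the fixed-length resampling, and the choice of sktime's RocketClassifier are illustrative assumptions, not the authors' exact setup or dataset format.

```python
# Minimal sketch (not the authors' code): fit an off-the-shelf time-series
# classifier on gesture recordings. File layout and column names are assumed.
import numpy as np
import pandas as pd
from pathlib import Path
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sktime.classification.kernel_based import RocketClassifier

N_STEPS = 64  # resample every gesture to a fixed number of frames

def load_gesture(csv_path: Path) -> np.ndarray:
    """Read one recording (position xyz, orientation xyzw) and resample it
    to N_STEPS frames so all gestures share the same length; timestamps are
    implicitly discarded by the uniform resampling."""
    df = pd.read_csv(csv_path)
    channels = df[["pos_x", "pos_y", "pos_z",
                   "rot_x", "rot_y", "rot_z", "rot_w"]].to_numpy().T
    t = np.linspace(0.0, 1.0, channels.shape[1])
    t_new = np.linspace(0.0, 1.0, N_STEPS)
    return np.stack([np.interp(t_new, t, ch) for ch in channels])  # (7, N_STEPS)

# Assumed layout: data/<gesture_label>/<recording>.csv
root = Path("data")
X, y = [], []
for label_dir in sorted(p for p in root.iterdir() if p.is_dir()):
    for csv_path in label_dir.glob("*.csv"):
        X.append(load_gesture(csv_path))
        y.append(label_dir.name)
X, y = np.stack(X), np.array(y)  # X: (n_gestures, 7, N_STEPS)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
clf = RocketClassifier(num_kernels=2000)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Hand-gesture recordings would simply contribute more channels per frame (one pose per tracked joint instead of a single controller pose), which is the source of the roughly threefold increase in input mentioned above.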