Skeleton Tracking from Kinect/WIMUs – v1

A novel multi-sensor fusion method for building a human skeleton is presented. We propose to fuse the joint position information obtained from the Kinect sensor with more precise estimates of body segment orientations provided by a small number (nine) of wearable inertial sensors. The use of inertial sensors helps to address many of the well-known limitations of the Kinect sensor. The precise calculation of joint angles potentially allows movement errors to be quantified in technique training, facilitating the use of the low-cost Kinect sensor for accurate biomechanical purposes; for example, the improved human skeleton could be used in visual feedback-guided motor learning. We compare our system to the gold-standard Vicon optical motion capture system and show that the fused skeleton achieves a very high level of accuracy.
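To illustrate the general idea of this kind of fusion, the sketch below (not the authors' implementation; all function and parameter names are hypothetical) anchors a kinematic chain at a Kinect-estimated proximal joint and then propagates each body segment along the orientation reported by its inertial sensor. It assumes segment lengths are known from a calibration pose and that each segment's rest-pose axis is +y in the global frame.

```python
# Minimal sketch of Kinect/IMU skeleton fusion (hypothetical names and conventions).
import numpy as np
from scipy.spatial.transform import Rotation as R

def fuse_limb(kinect_anchor, imu_quaternions, segment_lengths):
    """Return fused joint positions for one kinematic chain.

    kinect_anchor   : (3,) proximal joint position taken from the Kinect skeleton
    imu_quaternions : one (x, y, z, w) unit quaternion per segment, giving the
                      segment's orientation in the global frame (assumed here)
    segment_lengths : one length per segment, e.g. from a calibration pose
    """
    joints = [np.asarray(kinect_anchor, dtype=float)]
    for quat, length in zip(imu_quaternions, segment_lengths):
        # Rotate the assumed rest-pose segment axis (+y) into the global frame,
        # then step from the current joint to the next joint along the segment.
        direction = R.from_quat(quat).apply([0.0, 1.0, 0.0])
        joints.append(joints[-1] + length * direction)
    return np.stack(joints)

# Usage example: shoulder anchored by Kinect; upper arm and forearm oriented by IMUs.
shoulder = [0.1, 1.4, 0.3]
quats = [[0, 0, 0, 1],            # identity orientation for the upper arm
         [0, 0, 0.383, 0.924]]    # roughly 45 degrees about z for the forearm
print(fuse_limb(shoulder, quats, segment_lengths=[0.30, 0.25]))
```

In this sketch only the anchor joint depends on the Kinect, so orientation noise in the Kinect skeleton does not accumulate along the chain, which is one way the more reliable IMU orientations can compensate for the sensor's known limitations.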