Gesture Tracking Using MEMS Inertial Sensor and Low-Resolution Imaging Sensor

D. Sridhar Raja, R. Abinethri and B. Kalaiselvi

Abstract:

In this paper, we present an algorithm for hand gesture tracking and recognition based on the integration of a custom-built microelectromechanical systems (MEMS) inertial measurement unit and a low-resolution imaging (i.e., vision) sensor. We discuss 2-D gesture recognition and tracking results here, but the algorithm can be extended to 3-D motion tracking and gesture recognition in the future. Essentially, this paper shows that inertial data sampled at 100 Hz and vision data at 5 frames/s can be fused by an extended Kalman filter and used for accurate human hand gesture recognition and tracking. Since an inertial sensor is better at tracking rapid movements, while a vision sensor is more stable and accurate for tracking slow movements, a novel adaptive algorithm has been developed to adjust the measurement noise covariance according to the measured accelerations and angular rotation rates. The experimental results verify that the proposed method is capable of reducing the velocity error and position drift of the MEMS inertial sensor when aided by the vision sensor. To compensate for the time delay caused by the visual data processing cycles, a moving average filter is applied to remove high-frequency noise and propagate the inertial signals. The reconstructed trajectories of the first 10 Arabic numerals are further recognized using dynamic time warping with a discrete cosine transform for feature extraction, resulting in an accuracy of 92.3% and individual numeral recognition within 100 ms.
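To illustrate the fusion scheme described above, the following is a minimal sketch (not the authors' implementation) of how 100 Hz inertial predictions can be corrected by 5 frames/s vision position fixes in a Kalman-style filter, with the measurement noise covariance inflated adaptively from the measured acceleration and angular-rate magnitudes. The state model, noise levels, and gains below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

DT = 0.01                       # inertial sampling period (100 Hz)
VISION_EVERY = 20               # one vision fix per 20 inertial samples (5 frames/s)

F = np.array([[1, 0, DT, 0],    # constant-velocity transition for [x, y, vx, vy]
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1.0]])
B = np.array([[0.5 * DT**2, 0],
              [0, 0.5 * DT**2],
              [DT, 0],
              [0, DT]])         # measured acceleration enters as a control input
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0.0]])  # vision observes position only
Q = 1e-4 * np.eye(4)            # assumed process noise
R0 = 1e-3 * np.eye(2)           # assumed baseline vision measurement noise


def adaptive_R(accel, omega, k_a=0.05, k_w=0.02):
    """Inflate the vision measurement covariance during fast motion, so the
    (more stable) vision fix dominates only when motion is slow.
    Gains k_a and k_w are illustrative, not from the paper."""
    scale = 1.0 + k_a * np.linalg.norm(accel) + k_w * abs(omega)
    return R0 * scale


def fuse(accels, omegas, vision_fixes):
    """accels: (N, 2) accelerations, omegas: (N,) angular rates,
    vision_fixes: (N // VISION_EVERY, 2) position measurements."""
    x = np.zeros(4)
    P = np.eye(4)
    track = []
    for k, (a, w) in enumerate(zip(accels, omegas)):
        # Predict with the inertial measurement (every 10 ms).
        x = F @ x + B @ a
        P = F @ P @ F.T + Q
        # Correct with a vision position fix when one is available.
        if k % VISION_EVERY == 0 and k // VISION_EVERY < len(vision_fixes):
            z = vision_fixes[k // VISION_EVERY]
            R = adaptive_R(a, w)
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (z - H @ x)
            P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)
```

The reconstructed position track returned by `fuse` could then be passed to a trajectory recognizer; the paper reports using dynamic time warping over discrete-cosine-transform features for that step.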

Keywords:

Imaging Sensor, Inertial Sensor, Gesture Tracking.

Paper Details
Month: 7
Year: 2018
Volume: 22
Issue: 4
Pages: 148-156