Human Joint Angle Estimation with Inertial Sensors and Validation with a Robot Arm

Published In

IEEE Transactions on Biomedical Engineering

Document Type

Citation

Publication Date

7-2015

Subjects

Inertial navigation, Elbow, Microelectromechanical systems, Biomechanics

Abstract

Traditionally, human movement has been captured primarily by motion capture systems. These systems are costly, require fixed cameras in a controlled environment, and suffer from occlusion. Recently, the availability of low-cost wearable inertial sensors containing accelerometers, gyroscopes, and magnetometers has provided an alternative means to overcome the limitations of motion capture systems. Wearable inertial sensors can be used anywhere, cannot be occluded, and are low cost. Several groups have described algorithms for tracking human joint angles. We previously described a novel approach based on a kinematic arm model and the Unscented Kalman Filter (UKF). Our proposed method used a minimal sensor configuration with one sensor on each segment. This paper reports significant improvements in both the algorithm and the assessment. The new model incorporates gyroscope and accelerometer random drift models, imposes physical constraints on the range of motion for each joint, and uses zero-velocity updates to mitigate the effect of sensor drift. A high-precision industrial robot arm quantifies the performance of the tracker during slow, normal, and fast movements over continuous 15-min recordings. The agreement between the estimated angles from our algorithm and the high-precision robot arm reference was excellent. On average, the tracker attained an RMS angle error of about 3° for all six angles. The UKF performed slightly better than the more common Extended Kalman Filter.
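
The abstract's core ingredients, a gyroscope-driven process model, random-drift (bias) states for the inertial sensors, physical range-of-motion constraints, and zero-velocity updates during still periods, can be sketched for a single joint. The Python example below is a hypothetical illustration only, not the authors' implementation: it uses the filterpy library's Unscented Kalman Filter, tracks one flexion angle plus a gyroscope bias instead of the paper's full six-angle kinematic arm model, and all noise parameters, joint limits, and the stillness threshold are assumed values for the example.

```python
# Minimal single-joint UKF sketch (illustrative assumptions throughout).
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

DT = 0.01                        # 100 Hz IMU sample period (assumed)
ANGLE_MIN, ANGLE_MAX = 0.0, 2.6  # elbow flexion limits in rad (assumed)

def fx(x, dt, gyro=0.0):
    """Process model: integrate bias-corrected gyro rate; bias is a random walk."""
    theta, bias = x
    theta = theta + (gyro - bias) * dt
    theta = np.clip(theta, ANGLE_MIN, ANGLE_MAX)  # range-of-motion constraint
    return np.array([theta, bias])

def hx_accel(x):
    """Accelerometer measurement: inclination angle, valid when quasi-static."""
    return np.array([x[0]])

def hx_zupt(x):
    """Zero-velocity pseudo-measurement: measured gyro rate equals the bias."""
    return np.array([x[1]])

points = MerweScaledSigmaPoints(n=2, alpha=1.0, beta=2.0, kappa=1.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=DT,
                            fx=fx, hx=hx_accel, points=points)
ukf.x = np.array([0.0, 0.0])     # initial joint angle and gyro bias
ukf.P = np.diag([0.1, 0.01])
ukf.Q = np.diag([1e-5, 1e-8])    # gyro noise + bias random-walk drift (assumed)
R_ACCEL = np.array([[0.02]])     # accel-derived angle noise, rad^2 (assumed)
R_ZUPT = np.array([[1e-4]])

def step(gyro_rate, accel_angle):
    """One filter cycle: gyro drives the prediction, accelerometer corrects it."""
    ukf.predict(gyro=gyro_rate)
    ukf.update(np.array([accel_angle]), R=R_ACCEL)
    if abs(gyro_rate) < 0.1:     # still-period detector -> zero-velocity update
        ukf.update(np.array([gyro_rate]), R=R_ZUPT, hx=hx_zupt)
    return ukf.x[0]              # current joint-angle estimate (rad)

# Example: hold still, flex slowly from 0 to 90 degrees, then hold again,
# tracked with a noisy gyro that has a constant 0.05 rad/s bias.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true = np.concatenate([np.zeros(100),
                           np.linspace(0.0, np.pi / 2, 300),
                           np.full(100, np.pi / 2)])
    rate = np.gradient(true, DT)
    for k in range(len(true)):
        gyro = rate[k] + 0.05 + rng.normal(0, 0.02)  # bias + noise
        accel = true[k] + rng.normal(0, 0.1)         # noisy inclination
        est = step(gyro, accel)
    print(f"final angle error: {abs(est - true[-1]):.3f} rad")
```

The same structure generalizes to the paper's setting by replacing the scalar angle with the joint-angle vector of a kinematic arm chain and forming the measurement model from each segment-mounted sensor; the sketch only shows how the drift states, joint-limit clipping, and zero-velocity correction fit into one UKF cycle.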

Rights

Copyright 2016 IEEE

DOI

10.1109/TBME.2015.2403368

Persistent Identifier

http://archives.pdx.edu/ds/psu/16661
