Efficiency of deep neural networks for joint angle modeling in digital gait assessment

authored by
Javier Conte Alcaraz, Sanam Moghaddamnia, Jürgen Peissig
Abstract

Reliability and user compliance of the applied sensor system are two key issues in digital healthcare and biomedical informatics. For gait assessment applications, accurate joint angle measurements are important. Inertial measurement units (IMUs) have been used in a variety of applications and can also provide significant information on gait kinematics. However, the nonlinear mechanism of human locomotion results in only moderate estimation accuracy of the gait kinematics and thus of the joint angles. To develop “digital twins” as digital counterparts of the lower limb joint angles, three-dimensional gait kinematic data were collected. This work investigates the estimation accuracy of different neural networks in modeling lower body joint angles in the sagittal plane using the kinematic records of a single IMU attached to the foot. The evaluation results based on the root mean square error (RMSE) show that long short-term memory (LSTM) networks deliver superior performance in nonlinear modeling of the lower limb joint angles compared to other machine learning (ML) approaches. Accordingly, deep learning based on the LSTM architecture is a promising approach to modeling gait kinematics from a single IMU; it can reduce the number of physical IMUs that must be attached to the subject and thus improve the practical applicability of the sensor system.
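For illustration, the minimal sketch below shows one way an LSTM-based regression from single-IMU kinematics to sagittal-plane joint angles, evaluated with RMSE, could be set up. It is not the authors' implementation; the window length, IMU channel count, layer sizes, and training settings are assumptions made only for the example.

# Minimal sketch (not the paper's code): sequence-to-sequence LSTM mapping
# windows of single-IMU kinematics (assumed 3-axis accelerometer + 3-axis
# gyroscope) to lower limb joint angles (assumed hip, knee, ankle).
import numpy as np
import tensorflow as tf

WINDOW = 100     # assumed samples per gait window
N_FEATURES = 6   # assumed IMU channels (acc x/y/z, gyro x/y/z)
N_ANGLES = 3     # assumed sagittal-plane angles: hip, knee, ankle

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
    tf.keras.layers.LSTM(64, return_sequences=True),          # temporal modeling
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(N_ANGLES)),  # angle per time step
])
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError()])

# Dummy arrays with the assumed shapes; real use would substitute the
# recorded IMU windows (x) and reference joint-angle trajectories (y).
x = np.random.randn(32, WINDOW, N_FEATURES).astype("float32")
y = np.random.randn(32, WINDOW, N_ANGLES).astype("float32")
model.fit(x, y, epochs=2, batch_size=8, verbose=0)
print(model.evaluate(x, y, verbose=0))  # [MSE, RMSE]

The choice of a sequence-to-sequence layout (one angle estimate per time step) matches the idea of producing continuous joint angle trajectories from the IMU stream; a windowed many-to-one variant would be an equally plausible reading of the abstract.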

Organisation(s)
Institute of Communications Technology
External Organisation(s)
Turk-Alman Universitesi
Type
Article
Journal
EURASIP Journal on Advances in Signal Processing
Volume
2021
No. of pages
20
ISSN
1687-6180
Publication date
08.02.2021
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Signal Processing, Information Systems, Hardware and Architecture, Electrical and Electronic Engineering
Sustainable Development Goals
SDG 3 - Good Health and Well-being
Electronic version(s)
https://doi.org/10.1186/s13634-020-00715-1 (Access: Open)