26 July 2025
This document is related to:
info:eu-repo/semantics/altIdentifier/doi/10.54941/ahfeXXXX
http://creativecommons.org/licenses/by/, info:eu-repo/semantics/OpenAccess
Hasnaa Belabzioui et al., « Impact of Introducing Sparse Inertial Measurement Units in Computer Vision-Based Motion Capture Systems for Ergonomic Postural Assessment », HAL SHS (Sciences de l’Homme et de la Société), ID: 10.54941/ahfeXXXX
In ergonomics, worker movement on site is an important factor, among others, in assessing the risk of musculoskeletal disorders. Several commercial markerless motion capture systems are available for this purpose, mostly based on monocular or multi-RGB cameras (THEIA system) or RGB-D cameras (MS Kinect system). Hybrid systems combining computer vision and Inertial Measurement Units (IMUs) have also been introduced, such as the KIMEA (1 RGB-D + 4 IMUs) and KIMEA Cloud (1 RGB + 4 IMUs) solutions. Although previous works analysed the accuracy of some of these systems, the relevance of coupling computer vision and IMUs has not been studied. Hence, we tested the performance of these systems in evaluating bimanual handling tasks with partial occlusions of the body in the images. The THEIA system exhibited an average error of 11.1° across all joints, with larger Root Mean Square (RMS) errors at the wrists and shoulders (>14°). KIMEA Cloud with IMUs obtained a similar global RMS error (10.3° to 10.9° depending on the viewpoint), but with better results for the wrists (3.9° to 4.3°). The impact of coupling RGB-D images and IMU data is even larger: the RMS error of the Kinect decreased from 17.2° to 8.9° when the IMU information was added (KIMEA system). The difference is even more pronounced for the wrists: 28.3° to 38.5° for the Kinect versus 3.8° to 4.0° for KIMEA. These results confirm the advantage of introducing a few IMU sensors, especially for the wrists, which are poorly tracked in the images.
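To make the error metric reported above concrete, here is a minimal sketch of computing per-joint RMS angular error between an estimated joint-angle time series and a reference series (e.g., from optical motion capture). The array names, shapes, and the 180° wrap-around handling are illustrative assumptions, not the authors' exact protocol.

```python
import numpy as np

def rms_joint_error(estimated_deg, reference_deg):
    """Per-joint RMS angular error in degrees.

    estimated_deg, reference_deg: arrays of shape (n_frames, n_joints)
    containing joint angles in degrees (layout assumed for illustration).
    """
    # Wrap angular differences into [-180, 180) so that, e.g., 359° vs 1°
    # counts as a 2° error rather than 358° (assumed convention).
    diff = (np.asarray(estimated_deg) - np.asarray(reference_deg) + 180.0) % 360.0 - 180.0
    # RMS over frames, one value per joint.
    return np.sqrt(np.mean(diff ** 2, axis=0))

# Hypothetical usage: 1000 frames, 3 joints (e.g., shoulder, elbow, wrist).
rng = np.random.default_rng(0)
reference = rng.uniform(-90, 90, size=(1000, 3))
estimated = reference + rng.normal(0, 5, size=(1000, 3))  # simulated 5° noise
print(rms_joint_error(estimated, reference))  # one RMS error per joint
```

Averaging these per-joint values over all joints would yield a global RMS figure comparable to the 8.9° to 17.2° range reported for the systems above.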