Design and Calibration Method of Low-Cost Full-Body Tracking System Based on Multimodal Fusion

Abstract

This paper introduces a low-cost full-body human motion tracking system that fuses RGB video with inertial measurement units (IMUs). The system addresses key limitations of conventional methods, including high equipment cost, restricted mobility, and sensor drift during fast movement. A lightweight fusion model combines video and IMU signals using attention and joint constraints, and a calibration procedure aligns the IMUs and camera with minimal user effort. Ten participants performed standard movements such as walking, squatting, and arm lifting in a controlled indoor setup. The proposed system achieved a mean root mean square (RMS) joint position error of 18.0 mm, a 31.6% reduction relative to IMU-only tracking (p < 0.01). The system remained stable under moderate occlusion, and the average variation across repeated trials was less than 2.5 mm. These results indicate that accurate, repeatable motion tracking is achievable without expensive hardware, with applications in rehabilitation, sports analysis, and virtual environments.
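The attention-based fusion described above could, in its simplest per-joint form, weight each modality's position estimate by a confidence-driven softmax. The sketch below is purely illustrative and assumed, not the paper's actual model: all function and parameter names (`fuse_joint_estimates`, `video_conf`, `imu_conf`) are hypothetical.

```python
import numpy as np

def fuse_joint_estimates(video_xyz, imu_xyz, video_conf, imu_conf):
    """Fuse per-joint 3D position estimates from video and IMU streams.

    A softmax over per-modality confidence scores yields attention
    weights per joint, so the more reliable modality dominates (e.g.
    video under clear view, IMU under occlusion). Names and the
    confidence sources are illustrative assumptions.

    video_xyz, imu_xyz : (J, 3) arrays of joint positions in mm
    video_conf, imu_conf : (J,) per-joint confidence scores
    """
    scores = np.stack([video_conf, imu_conf], axis=0)          # (2, J)
    exp = np.exp(scores - scores.max(axis=0))                  # stable softmax
    weights = exp / exp.sum(axis=0)                            # (2, J), sums to 1 per joint
    fused = weights[0, :, None] * video_xyz + weights[1, :, None] * imu_xyz
    return fused

# Example: two joints; video is confident for joint 0, IMU for joint 1.
video = np.array([[100.0, 0.0, 0.0], [0.0, 50.0, 0.0]])
imu   = np.array([[110.0, 0.0, 0.0], [0.0, 60.0, 0.0]])
fused = fuse_joint_estimates(video, imu,
                             video_conf=np.array([3.0, 0.0]),
                             imu_conf=np.array([0.0, 3.0]))
# Joint 0 stays near the video estimate; joint 1 near the IMU estimate.
```

A real system would additionally enforce the joint (skeletal) constraints mentioned in the abstract, e.g. by projecting the fused pose onto fixed bone lengths.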
