Real-Time AI Body Tracking Estimations for Digital Twins for an Immersive Environment

Abstract

This paper introduces a trackerless full-body motion tracking system that combines an improved version of Google's MediaPipe Pose with Unity for real-time avatar animation and interaction. The system eliminates the need for external hardware, such as VR controllers or wearable trackers, enabling cost-effective and accessible motion tracking using only RGB cameras. Performance was evaluated under controlled conditions across simulated low-end, mid-range, and high-end hardware configurations. The high-end setup achieved the best results, maintaining a frame rate of 55–60 FPS, latency within 30–50 ms, and a mean squared error (MSE) of 0.008. The mid-range setup provided reliable performance for general applications, while the simulated low-end configuration revealed challenges in latency and accuracy, underscoring the system's hardware dependency. By integrating AI-driven pose estimation with Unity's physics engine, the system enables natural, real-time interactions within virtual environments. This approach removes traditional hardware barriers, improving immersion and scalability for applications such as telepresence, virtual training, and gaming. The research highlights the potential for extending motion tracking technology to more inclusive and versatile applications, with future work focusing on optimization for entry-level hardware and scalability in multi-user environments.
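The abstract describes feeding MediaPipe Pose landmarks into Unity to drive an avatar. The paper does not specify its bridging code, but one common pattern is to convert MediaPipe's normalized image coordinates (x right, y down, z toward the camera) into Unity's left-handed, y-up convention and stream the result as JSON to a Unity-side listener. The sketch below is a hypothetical illustration of that conversion step; the function names, the `scale` parameter, and the message format are assumptions, not the authors' implementation.

```python
import json

def mediapipe_to_unity(lm, scale=1.0):
    """Map one normalized MediaPipe landmark (x, y, z, visibility) into
    Unity's left-handed, y-up coordinate convention (assumed mapping):
    flip y so up is positive, negate z so depth points into the scene."""
    x, y, z, vis = lm
    return {
        "x": x * scale,
        "y": (1.0 - y) * scale,  # MediaPipe y grows downward; Unity y grows upward
        "z": -z * scale,         # MediaPipe z is toward the camera; Unity z is forward
        "v": vis,                # visibility score, useful for filtering occluded joints
    }

def pack_frame(landmarks):
    """Serialize one frame of pose landmarks (MediaPipe Pose emits 33) as a
    JSON array that a Unity-side UDP/WebSocket listener could deserialize
    onto an avatar rig each frame."""
    return json.dumps([mediapipe_to_unity(lm) for lm in landmarks])
```

In a live pipeline, `landmarks` would come from `results.pose_landmarks.landmark` after calling MediaPipe Pose's `process()` on each RGB frame; per-frame serialization like this keeps the Python and Unity processes decoupled, at the cost of a small amount of the 30–50 ms latency budget the paper reports.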
