Enhancing Real-time Heading Estimation for Pedestrian Navigation Via Deep Learning and Smartphone Embedded Sensors
Abstract
Accurate smartphone-based pedestrian navigation depends significantly on precise heading estimation. However, heading estimation remains a challenging problem in most pedestrian navigation applications because of the bias of low-cost smartphone sensors, thermal drift during long-term operation, and unexpected changes in the carrying mode of handheld devices. Many existing methods based on pervasive resources suffer severe errors under these conditions, while approaches relying on auxiliary resources may hinder a ubiquitous and seamless indoor-outdoor navigation experience. This research aims to enhance heading estimation by leveraging pervasive measurements such as LVGOs and straight-line features recognized autonomously from camera images. The proposed method mitigates accumulated gyroscope drift using the absolute heading angle estimated by LVGOs. However, these absolute angles are highly prone to erroneous estimation when navigating near areas with high electric and magnetic activity, owing to stable geomagnetic anomalies. Encouraged by the pervasiveness of straight-line features in indoor and outdoor environments, we developed deep learning-based visual tracking of these features to enhance heading estimation from the fusion of gyroscope and magnetic field measurements. A convolutional neural network based on the U-Net architecture was developed to recognize these features quickly and accurately; they were then leveraged as a heading constraint to overcome long-term gyroscope drift and short-term compass heading bias. The proposed method balanced recognition latency and precision, which enabled smooth real-time performance. The achieved results improved heading estimation and could provide significant help, especially for visually impaired people, who commonly follow tactile paving. This encourages future tests and assessments with visually impaired participants so that the proposed method can be reliably included in their applications.
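The abstract does not give implementation details of the gyroscope/compass fusion it describes. As an illustration only, the core idea of correcting integrated gyroscope heading with an absolute (e.g. magnetic or line-feature-derived) heading can be sketched as a wrap-aware complementary filter; the function name, signature, and blending constant below are hypothetical, not taken from the paper:

```python
import math

def fuse_heading(prev_heading, gyro_rate_z, abs_heading, dt, alpha=0.98):
    """Complementary-filter sketch (hypothetical, not the paper's method).

    prev_heading : previous fused heading, radians
    gyro_rate_z  : yaw rate from the gyroscope, rad/s
    abs_heading  : absolute heading observation (compass or visual line), radians
    dt           : sample interval, seconds
    alpha        : trust in the gyro path; (1 - alpha) pulls toward abs_heading
    """
    # Dead-reckoned heading: integrate the gyro rate (drifts over time).
    gyro_heading = prev_heading + gyro_rate_z * dt
    # Shortest signed angular difference, robust to the +/- pi wrap.
    diff = math.atan2(math.sin(abs_heading - gyro_heading),
                      math.cos(abs_heading - gyro_heading))
    # Blend: mostly gyro short-term, nudged by the absolute reference.
    fused = gyro_heading + (1.0 - alpha) * diff
    # Re-wrap the result into [-pi, pi].
    return math.atan2(math.sin(fused), math.cos(fused))
```

With a stationary gyro and a small compass offset, repeated calls pull the fused heading toward the absolute observation while remaining immune to step changes from momentary magnetic disturbances, which is the qualitative behavior the abstract attributes to its gyroscope/magnetic-field fusion.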