Enhancing Pedestrian Trajectory Prediction through Multi-feature Fusion Graph Convolutional Networks
Abstract
This paper introduces a multi-feature fusion graph convolutional network (MFFGCN) for accurate pedestrian trajectory prediction. Unlike previous methods that rely primarily on distance-based interactions, MFFGCN considers a broader range of influencing factors, including movement direction, relative distance, and movement velocity. By integrating these factors, the proposed model captures the complex social interactions among pedestrians more comprehensively. The model comprises several modules: a pedestrian movement direction interaction module, distance-based and velocity-based interaction graph modules, a graph convolutional network (GCN) for feature extraction, and a temporal convolutional network (TCN) for trajectory prediction. Experimental results on the ETH and UCY datasets demonstrate that MFFGCN outperforms state-of-the-art methods, achieving lower average displacement error (ADE) and final displacement error (FDE). The proposed model not only improves the accuracy of pedestrian trajectory prediction but also highlights the importance of considering multiple factors when modeling pedestrian interactions. Our source code is available at: https://github.com/XiaojingChen327/MFFGCN-model/tree/main.
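To make the described pipeline concrete, the following is a minimal, hypothetical PyTorch sketch, not the authors' released code (which is at the GitHub link above). It assumes per-time-step direction, distance, and velocity adjacency matrices have already been built, fuses them by simple averaging (the paper's actual fusion rule may differ), applies one graph convolution per observed frame, and uses a 1-D temporal convolution to map 8 observed steps to 12 predicted offsets; names such as MFFGCNSketch and fuse_graphs are illustrative only.

```python
import torch
import torch.nn as nn


class MFFGCNSketch(nn.Module):
    """Illustrative sketch: fuse direction-, distance-, and velocity-based
    interaction graphs, run a graph convolution per time step, then a
    temporal convolution over the observed horizon to predict offsets."""

    def __init__(self, in_feats=2, hid_feats=16, obs_len=8, pred_len=12):
        super().__init__()
        self.embed = nn.Linear(in_feats, hid_feats)         # node feature embedding
        self.gcn_weight = nn.Linear(hid_feats, hid_feats)   # graph-conv weight
        self.tcn = nn.Conv1d(obs_len, pred_len, kernel_size=3, padding=1)
        self.out = nn.Linear(hid_feats, 2)                  # (x, y) offset head

    def fuse_graphs(self, a_dir, a_dist, a_vel):
        # Placeholder fusion: average the three interaction graphs and
        # row-normalize (the paper's fusion rule is not specified here).
        a = (a_dir + a_dist + a_vel) / 3.0
        deg = a.sum(-1, keepdim=True).clamp(min=1e-6)
        return a / deg

    def forward(self, traj, a_dir, a_dist, a_vel):
        # traj: (T_obs, N, 2) observed positions; adjacencies: (T_obs, N, N)
        a = self.fuse_graphs(a_dir, a_dist, a_vel)
        h = torch.relu(self.embed(traj))                    # (T_obs, N, H)
        h = torch.relu(self.gcn_weight(torch.bmm(a, h)))    # spatial aggregation per step
        h = self.tcn(h.permute(1, 0, 2))                    # (N, T_pred, H) temporal conv
        return self.out(h)                                  # (N, T_pred, 2) predicted offsets


# Usage with random placeholder inputs: 8 observed frames, 5 pedestrians.
obs = torch.randn(8, 5, 2)
a = torch.rand(3, 8, 5, 5)   # stand-ins for direction / distance / velocity graphs
model = MFFGCNSketch()
print(model(obs, a[0], a[1], a[2]).shape)  # torch.Size([5, 12, 2])
```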