A Parallel Algorithm for Background Subtraction: Modeling Lognormal Pixel Intensity Distributions on GPUs
Abstract
In this research, we address the problem of background subtraction with a parallel algorithm implemented and tested on GPUs, and we compare our findings with the state of the art. Our results demonstrate that the algorithm processes a high number of frames per second while maintaining output quality equivalent to that of leading algorithms. Background subtraction has applications in many areas, ranging from automotive and tool inspection to object detection, recognition, and tracking. Whereas current algorithms commonly rely on Gaussian mixture models, our algorithm is trained on a small set of sample frames and models each pixel with a lognormal distribution. During the testing phase, we compare incoming pixel intensities against the lognormal model learned during training to infer a probabilistic score that classifies each pixel as either foreground or background.
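To illustrate the per-pixel classification step described above, the following is a minimal CUDA sketch. It assumes a per-pixel mean and standard deviation of the log-intensities have already been estimated from the training frames; the kernel name, the threshold value, and the decision rule (low likelihood under the background model implies foreground) are illustrative assumptions, not the paper's actual implementation.

```cuda
// Hypothetical sketch of per-pixel lognormal background classification.
// mu/sigma are per-pixel statistics of log-intensity from training frames.
#include <cuda_runtime.h>
#include <math.h>
#include <stdio.h>

__global__ void classifyKernel(const float* frame,   // current frame intensities
                               const float* mu,      // per-pixel mean of log-intensity
                               const float* sigma,   // per-pixel std of log-intensity
                               unsigned char* mask,  // output: 1 = foreground, 0 = background
                               int n, float thresh)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float x = fmaxf(frame[i], 1e-6f);   // guard against log(0)
    float s = fmaxf(sigma[i], 1e-6f);
    // Lognormal density: f(x) = exp(-(ln x - mu)^2 / (2 s^2)) / (x * s * sqrt(2*pi))
    float z = (logf(x) - mu[i]) / s;
    float p = expf(-0.5f * z * z) / (x * s * 2.5066283f);
    // Pixels unlikely under the background model are labeled foreground.
    mask[i] = (p < thresh) ? 1 : 0;
}

int main(void)
{
    const int n = 4;
    float h_frame[n] = {0.50f, 0.52f, 0.90f, 0.10f};   // toy intensities in (0, 1]
    float h_mu[n], h_sigma[n];
    for (int i = 0; i < n; ++i) { h_mu[i] = logf(0.5f); h_sigma[i] = 0.05f; }

    float *d_frame, *d_mu, *d_sigma;
    unsigned char *d_mask, h_mask[n];
    cudaMalloc(&d_frame, n * sizeof(float));
    cudaMalloc(&d_mu,    n * sizeof(float));
    cudaMalloc(&d_sigma, n * sizeof(float));
    cudaMalloc(&d_mask,  n);
    cudaMemcpy(d_frame, h_frame, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_mu,    h_mu,    n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_sigma, h_sigma, n * sizeof(float), cudaMemcpyHostToDevice);

    // One thread per pixel; thresh = 0.1 is an illustrative cutoff.
    classifyKernel<<<(n + 255) / 256, 256>>>(d_frame, d_mu, d_sigma, d_mask, n, 0.1f);
    cudaMemcpy(h_mask, d_mask, n, cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i)
        printf("pixel %d: %s\n", i, h_mask[i] ? "foreground" : "background");

    cudaFree(d_frame); cudaFree(d_mu); cudaFree(d_sigma); cudaFree(d_mask);
    return 0;
}
```

In this toy run, pixels near the trained mean (0.50, 0.52) score as background, while outliers (0.90, 0.10) fall in the tails of the lognormal density and are flagged as foreground.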