Mitigating Contamination Effects on Gamma Distribution Parameter Estimation Using Wavelet Shrinkage Techniques
Abstract
This paper uses the Maximum Likelihood Estimation method to investigate the impact of data contamination on the accuracy of parameter estimation for the Gamma distribution. A denoising approach based on wavelet shrinkage is proposed to mitigate the effects of contamination. Several wavelet functions were employed in combination with different threshold estimation techniques, namely Universal, Minimax, and Stein's Unbiased Risk Estimate, applying the soft thresholding rule. The study involved simulating data sets generated from the Gamma distribution and analyzing real-life data assumed to follow the same distribution. A specialized program was developed in MATLAB to conduct these simulations and to implement both the classical Maximum Likelihood Estimation method and the proposed wavelet-based denoising techniques. The performance of the parameter estimates was compared using the Mean Squared Error criterion. The findings demonstrate that data contamination significantly degrades the accuracy of parameter estimates obtained through the classical Maximum Likelihood Estimation method. In contrast, the proposed wavelet shrinkage method effectively reduced the influence of contamination and improved the accuracy of parameter estimation for the Gamma distribution. The study highlights the practical value of integrating wavelet-based denoising into statistical estimation, particularly when working with contaminated datasets.
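The pipeline the abstract describes (contaminate Gamma data, shrink wavelet coefficients with a soft-thresholding rule, then re-estimate the parameters by maximum likelihood) can be sketched as follows. This is not the paper's MATLAB program: it is a minimal illustration in Python using a hand-rolled Haar transform, the Universal threshold only, and a standard closed-form approximation to the Gamma MLE. The contamination model (10% of observations perturbed by Gaussian noise), the sample size, and the true parameter values are all assumptions for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail coefficients
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar level."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft(w, lam):
    """Soft-thresholding rule: shrink coefficients toward zero by lam."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def denoise(x, levels=4):
    """Multilevel Haar shrinkage with the Universal threshold
    lam = sigma * sqrt(2 ln n), sigma estimated by the MAD of the
    finest-level detail coefficients (a common default, assumed here)."""
    n = x.size
    approx, details = x, []
    for _ in range(levels):
        approx, d = haar_dwt(approx)
        details.append(d)
    sigma = np.median(np.abs(details[0])) / 0.6745
    lam = sigma * np.sqrt(2.0 * np.log(n))
    for d in reversed(details):
        approx = haar_idwt(approx, soft(d, lam))
    return approx

def gamma_mle(x):
    """Approximate MLE for Gamma(shape k, scale theta) via the
    closed-form solution of the log-likelihood equations."""
    x = np.clip(x, 1e-9, None)  # guard: shrinkage can produce values <= 0
    s = np.log(x.mean()) - np.log(x).mean()
    k = (3.0 - s + np.sqrt((s - 3.0) ** 2 + 24.0 * s)) / (12.0 * s)
    return k, x.mean() / k

n, true_k, true_theta = 1024, 2.0, 3.0
clean = rng.gamma(true_k, true_theta, size=n)
# Contamination model (assumed): 10% of points get additive N(0, 10^2) noise.
contaminated = clean + (rng.random(n) < 0.1) * rng.normal(0.0, 10.0, size=n)

print("clean        :", gamma_mle(clean))
print("contaminated :", gamma_mle(contaminated))
print("denoised     :", gamma_mle(denoise(contaminated)))
```

Swapping `soft` for a hard rule, or the Universal threshold for Minimax or SURE, changes only the `denoise` helper; the MLE step is untouched, which is what lets the paper compare threshold choices by the Mean Squared Error of the resulting estimates.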