Kullback-Leibler Divergence in Feature Selection: A Methodology for Improved Detection of Heart Valve Disorders
Abstract
Dimensionality reduction is crucial for effectively managing high-dimensional datasets, particularly in healthcare. Feature selection identifies the most relevant attributes, reducing computational overhead and ensuring robust performance, especially in resource-constrained environments. This study introduces a Kullback-Leibler Divergence (KLD)-based feature selection method for heart sound analysis aimed at diagnosing valvular heart diseases. Mel Frequency Cepstral Coefficients (MFCC) and Mel spectrograms were extracted from the dataset as input features. KLD was applied to identify the most informative features, which were subsequently validated using various classifiers. This approach achieved an accuracy of 99% in diagnosing five distinct heart sound classes, outperforming the same classifiers trained on the full feature set. The method prioritized critical features, leading to improved performance across all evaluated classifiers. This work also targets heart sound classification on an embedded platform, enabling efficient analysis and accurate diagnosis of cardiovascular conditions. These findings highlight the potential of KLD-based feature selection for the real-time detection of heart valve disorders. By reducing computational complexity while preserving classification accuracy, this approach supports the development of efficient, cost-effective edge-based tools, ultimately improving diagnostic precision and healthcare resource efficiency.
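The core idea can be illustrated with a minimal sketch: score each extracted feature (e.g., one MFCC coefficient) by the KL divergence between its class-conditional distributions, estimated from histograms, and keep the highest-scoring features. This is an illustrative assumption about the pipeline, not the authors' exact implementation; the function names `kld` and `rank_features_by_kld` and the histogram binning choice are hypothetical.

```python
import numpy as np

def kld(p, q, eps=1e-10):
    """D(p || q) between two discrete distributions given as (unnormalized) counts.
    A small epsilon avoids log(0) and division by zero."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def rank_features_by_kld(X, y, n_bins=20):
    """Rank feature columns of X by the mean pairwise KLD between
    class-conditional histograms; larger divergence suggests the feature
    better separates the classes."""
    classes = np.unique(y)
    scores = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        # Shared bin edges so per-class histograms are comparable.
        edges = np.histogram_bin_edges(X[:, j], bins=n_bins)
        hists = [np.histogram(X[y == c, j], bins=edges)[0] for c in classes]
        pair_divs = [kld(hists[a], hists[b])
                     for a in range(len(classes))
                     for b in range(len(classes)) if a != b]
        scores[j] = np.mean(pair_divs)
    # Feature indices sorted from most to least informative.
    return np.argsort(scores)[::-1]
```

In a full pipeline, the top-ranked feature indices would be used to subset the MFCC/Mel-spectrogram feature matrix before training the downstream classifiers, which is what reduces the computational load on an embedded device.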