Blood group prediction using fingerprints
Abstract
The accurate and rapid identification of blood groups is a fundamental prerequisite for medical interventions, including emergency blood transfusions, organ transplantation, and the management of maternal-fetal incompatibilities. Traditional blood-typing methods, based primarily on serological hemagglutination tests, are invasive: they require venipuncture or finger-pricking, which carries risks of infection, needle-stick injury, and patient anxiety. They also depend on chemical reagents and trained medical personnel, limiting their accessibility in resource-constrained environments, remote locations, and mass-casualty scenarios.

To address these challenges, this project proposes a novel, non-invasive, computer-aided diagnostic system that predicts blood groups by analyzing dermatoglyphic patterns (fingerprints) with deep learning and ensemble machine learning techniques. The research leverages the biological correlation between epidermal ridge patterns and blood antigens, both of which are determined during the intrauterine stage of fetal development.

To enable robust model training, a large-scale dataset of 13,932 fingerprint images covering all eight major blood groups (A+, A-, B+, B-, AB+, AB-, O+, O-) was assembled and passed through a rigorous preprocessing pipeline. Images were uniformly resized to 128×128 pixels. The enhancement framework applies a preliminary Gaussian blur to remove high-frequency sensor artifacts, Contrast Limited Adaptive Histogram Equalization (CLAHE) to improve local ridge clarity, and an unsharp-masking step in which a secondary Gaussian blur is blended with the enhanced image to sharpen local texture details. The core methodology integrates two distinct feature-extraction paradigms: hand-crafted texture analysis and automated spatial feature learning.
For texture analysis, Histogram of Oriented Gradients (HOG) descriptors and Gabor filters capture ridge orientation and frequency information, which is then classified by optimized Random Forest models. In parallel, deep spatial features are learned automatically by a lightweight custom Convolutional Neural Network (CNN) trained on the enhanced grayscale images, and by a pre-trained MobileNetV2 architecture applying transfer learning to RGB representations.

To overcome the limitations of the individual classifiers and maximize predictive performance, a stacking ensemble was engineered. This meta-model fuses the probability outputs of the four base classifiers (Random Forest on HOG, Random Forest on Gabor, custom CNN, and MobileNetV2), and a Logistic Regression meta-learner aggregates these diverse predictions to determine the final blood-group class.

Experimental evaluation shows that the proposed stacked ensemble achieves a classification accuracy of 95.59%, significantly outperforming the standalone base models: HOG + Random Forest (94.58%), custom CNN (79.69%), Gabor + Random Forest (76.64%), and MobileNetV2 (55.83%). The ensemble compensates for individual predictive weaknesses and minimizes misclassification, exhibiting strong precision and recall across all eight blood groups.

To demonstrate real-world clinical applicability, the end-to-end pipeline was deployed via a user-friendly web interface built with Streamlit, enabling real-time fingerprint upload, automated image enhancement, and instant multi-model blood-group prediction. This study validates the potential of dermatoglyphics combined with computational intelligence as a reliable, non-invasive supplementary biometric marker for medical diagnostics.
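As a sketch of the hand-crafted branch, a HOG descriptor can be extracted with scikit-image and fed to a Random Forest. The orientation, cell, and block parameters below are assumptions; the abstract does not report the values used:

```python
import numpy as np
from skimage.feature import hog

def hog_features(img: np.ndarray) -> np.ndarray:
    """Extract a HOG descriptor capturing ridge orientation and gradient
    structure from a 128x128 grayscale fingerprint image.

    Parameters are illustrative, not taken from the paper.
    """
    return hog(
        img,
        orientations=9,
        pixels_per_cell=(16, 16),
        cells_per_block=(2, 2),
        feature_vector=True,
    )

# The descriptors would then train a Random Forest, e.g.:
#   from sklearn.ensemble import RandomForestClassifier
#   rf = RandomForestClassifier(n_estimators=300).fit(X_hog, y)
```

For a 128×128 image with these settings, the image divides into 8×8 cells, giving 7×7 overlapping 2×2-cell blocks and a descriptor of 7·7·2·2·9 = 1764 dimensions.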
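The stacking step can be illustrated in scikit-learn terms: each of the four base models contributes an (n_samples × 8) class-probability matrix, the matrices are concatenated into meta-features, and a Logistic Regression meta-learner is fit on them. The synthetic probability generator below is purely a stand-in for the real base models:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_samples, n_classes = 200, 8  # 8 blood groups
y = rng.integers(0, n_classes, n_samples)

def fake_base_probs(labels: np.ndarray, noise: float) -> np.ndarray:
    """Synthetic stand-in for one base model's predict_proba output:
    a one-hot of the true label corrupted by uniform noise, renormalized."""
    p = np.eye(n_classes)[labels] + noise * rng.random((len(labels), n_classes))
    return p / p.sum(axis=1, keepdims=True)

# Four base models of varying quality (RF-HOG, RF-Gabor, CNN, MobileNetV2)
base_probs = [fake_base_probs(y, s) for s in (0.5, 1.0, 2.0, 3.0)]

# Meta-features: concatenated probability outputs, shape (n_samples, 4 * 8)
meta_X = np.hstack(base_probs)

# Logistic Regression meta-learner aggregates the base predictions
meta_learner = LogisticRegression(max_iter=1000).fit(meta_X, y)
accuracy = meta_learner.score(meta_X, y)
```

In practice the meta-learner would be trained on held-out (out-of-fold) base-model probabilities rather than the same data the base models saw, to avoid leakage; the sketch compresses that detail.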