Compliance-Aware and Explainable GA-Optimized Neural Network for Cost Estimation in Safety-Critical Medical Software
Abstract
Accurate cost estimation is vital for medical software projects due to their safety-critical nature and strict compliance requirements. Traditional algorithmic models fail to capture nonlinear cost-driver interactions, while black-box machine learning approaches lack interpretability, limiting adoption in regulated contexts. To address this, we propose GA-BP-XAI, an explainable backpropagation neural network framework with genetic algorithm–based hyperparameter optimization. The framework integrates SHAP- and LIME-based interpretability to deliver transparent, auditable predictions. Using a dataset of 1,200 anonymized medical software projects, GA-BP-XAI reduces MAE by 11.6% and improves R² from 0.902 to 0.927 compared with standard BP, outperforming strong baselines such as Linear Regression, Random Forest, and XGBoost. Explainability results highlight domain-relevant drivers including FunctionPoints, ComplianceLevel, and IntegrationComplexity, consistent with expert knowledge. These results demonstrate that GA-BP-XAI achieves both state-of-the-art predictive accuracy and regulatory-aligned transparency, supporting trustworthy decision-making in compliance-driven, high-stakes environments.
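To make the GA-based hyperparameter optimization concrete, the sketch below shows the general pattern of evolving a small population of hyperparameter settings (here, hidden-layer size and learning rate) under selection, crossover, and mutation. This is an illustrative toy, not the authors' implementation: the search space, the truncation-selection scheme, and the surrogate `fitness` function (a stand-in for training a BP network and measuring validation error) are all assumptions made for demonstration.

```python
import random

# Hypothetical search space for two BP-network hyperparameters; the real
# framework's tuned parameters and ranges are not specified here.
SEARCH_SPACE = {
    "hidden_units": [8, 16, 32, 64, 128],
    "learning_rate": [0.001, 0.005, 0.01, 0.05, 0.1],
}

def fitness(ind):
    # Surrogate score standing in for negative validation error of a
    # trained BP network; assumes an optimum near (64, 0.01) purely
    # for illustration. Higher is better.
    return -abs(ind["hidden_units"] - 64) / 64 - abs(ind["learning_rate"] - 0.01)

def random_individual(rng):
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def crossover(a, b, rng):
    # Uniform crossover: each gene is inherited from either parent.
    return {k: rng.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, rng, rate=0.2):
    # With probability `rate`, resample a gene from its allowed values.
    return {k: rng.choice(SEARCH_SPACE[k]) if rng.random() < rate else v
            for k, v in ind.items()}

def ga_search(generations=30, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # truncation selection keeps the best half
        children = [
            mutate(crossover(rng.choice(elite), rng.choice(elite), rng), rng)
            for _ in range(pop_size - len(elite))
        ]
        pop = elite + children
    return max(pop, key=fitness)

best = ga_search()
```

In the actual framework, `fitness` would train a BP network with the candidate hyperparameters and return (negative) validation error, after which SHAP or LIME would be applied to the final tuned model to attribute predictions to cost drivers.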