Acoustic Cues into a Surgeon-assist Physical AI for Detecting Bone Penetration During Spinal Surgery
Abstract
Purpose
Skilled surgeons can detect subtle changes in bone-cutting sounds to recognize bone penetration, but such acoustic cues are subjective and depend on surgical experience. The objective of this study was to develop an artificial intelligence (AI) model capable of detecting bone penetration from intraoperative percussion sounds.

Methods
A total of 1,236 chisel strikes were identified from intraoperative recordings of lumbar and thoracic spinal decompression surgeries performed with a chisel, and each strike was labeled as penetration or non-penetration. Acoustic features were extracted per strike and expanded across 3-hit sliding windows to capture dynamic temporal changes. A gradient boosting classifier (LightGBM) was trained, with 10% of the data held out as an independent test set. Model performance was evaluated primarily by the area under the precision–recall curve (PR-AUC) and the receiver operating characteristic AUC (ROC-AUC).

Results
On the independent test set, the model achieved an ROC-AUC of 0.838 and a PR-AUC of 0.604. In a sensitivity analysis, the model achieved a sensitivity of 0.828, a specificity of 0.767, an accuracy of 0.784, an F1-score of 0.686, and a precision of 0.585. Feature importance analysis revealed that dynamic changes between consecutive strikes, such as mel-frequency cepstral coefficients (MFCCs), zero-crossing rate, and spectral contrast, were the most influential predictors.

Conclusion
An AI model analyzing intraoperative percussion sounds demonstrated a robust ability to detect bone penetration, with consistent ROC performance and a clinically acceptable PR-AUC. Quantifying acoustic cues that surgeons have traditionally interpreted subjectively may support intraoperative decision-making and surgical training.
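The abstract's key methodological step, expanding per-strike acoustic features across 3-hit sliding windows with their inter-strike differences, can be sketched as follows. This is a minimal illustration, not the authors' code: the function names are hypothetical, and the two per-strike features (zero-crossing rate and spectral centroid) are simplified stand-ins for the MFCC, zero-crossing-rate, and spectral-contrast features the paper reports.

```python
import numpy as np

def strike_features(x, sr=44100):
    """Per-strike acoustic features (simplified stand-ins for the
    paper's MFCC / zero-crossing rate / spectral contrast set)."""
    # Zero-crossing rate: fraction of adjacent samples with a sign change.
    zcr = np.mean(np.abs(np.diff(np.sign(x)))) / 2.0
    # Spectral centroid: magnitude-weighted mean frequency of the strike.
    mag = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sr)
    centroid = np.sum(freqs * mag) / (np.sum(mag) + 1e-12)
    return np.array([zcr, centroid])

def window_features(per_strike, w=3):
    """Expand per-strike features across w-hit sliding windows,
    appending first differences between consecutive strikes to
    capture the dynamic temporal changes described in the abstract."""
    rows = []
    for i in range(len(per_strike) - w + 1):
        block = per_strike[i:i + w]        # features of w consecutive strikes
        deltas = np.diff(block, axis=0)    # strike-to-strike changes
        rows.append(np.concatenate([block.ravel(), deltas.ravel()]))
    return np.array(rows)
```

Each window row would then be labeled with the penetration status of its final strike and fed to a gradient boosting classifier such as LightGBM, with PR-AUC and ROC-AUC computed on a held-out 10% test split.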