Research on Welding Behavior Characterization and Quality Control Methods Based on Multi-Sensor Information Fusion
Abstract
To address the challenges of real-time monitoring and control in manual welding, and to enable quantitative analysis of welder behavior, this study develops a manual welding information acquisition system based on multi-sensor fusion. An integrated multi-information sensing platform is constructed, combining welding-torch posture and displacement-velocity sensors, acoustic sensors, and a molten-pool vision module. Microsecond-level synchronous acquisition and integration of the multimodal data are realized with LabVIEW as the core acquisition software and Audacity as an auxiliary audio tool. Two sets of experiments (automated welding for sensor verification and manual welding for defect detection) are designed to determine Q235 steel-plate welding parameters and non-steady-state operation tasks. Welding behavior features are extracted from torch movement stability and speed uniformity; welding acoustic signals are analyzed with time- and frequency-domain methods for defect identification; and an improved DeepLabV3+ network is used to segment molten-pool images and extract their geometric dimensions. Experimental results verify the sensors' high precision and reliability. Comparative analysis of defect-free and defective welding processes confirms that the synergistic stability of torch posture, welding speed, molten-pool dynamics, and acoustic characteristics underpins high-quality weld formation, with transient anomalies in these signals serving as defect precursors. This work clarifies the causal relationship between multimodal process characteristics and manual welding quality. The proposed system and methods provide a reference for welder skill assessment and training, and lay a foundation for real-time multimodal defect detection in robotic automated welding.
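As an illustrative sketch only (not the authors' implementation), the idea of flagging transient acoustic anomalies as defect precursors via time-frequency analysis can be demonstrated with a short-time Fourier transform over a synthetic arc-sound signal; the signal, frame length, and 3-sigma energy threshold below are all assumptions for demonstration:

```python
import numpy as np
from scipy.signal import stft

fs = 44100  # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)

# Synthetic "arc sound": a stable tone plus a 20 ms noise burst
# around 0.5 s that mimics a transient defect event.
signal = 0.5 * np.sin(2 * np.pi * 220 * t)
rng = np.random.default_rng(0)
signal[int(0.50 * fs):int(0.52 * fs)] += rng.normal(0.0, 1.0, int(0.02 * fs))

# Short-time Fourier transform: time-frequency representation of the signal.
freqs, times, Z = stft(signal, fs=fs, nperseg=1024)

# Per-frame energy summed across frequency bins.
frame_energy = (np.abs(Z) ** 2).sum(axis=0)

# Flag frames whose energy exceeds mean + 3*std as transient anomalies.
threshold = frame_energy.mean() + 3 * frame_energy.std()
anomalous_times = times[frame_energy > threshold]
print(anomalous_times)  # frame centers near the 0.5 s burst
```

In a real system the threshold would be calibrated on defect-free welds, and band-limited energies (rather than total energy) would better separate arc noise from defect signatures.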