Machine-Learning-Based Optimal Feed Rate Determination in Machining: Integrating GA-Calibrated Cutting Force Modeling and Vibration Analysis
Abstract
Machining efficiency and stability are crucial for achieving high-quality manufacturing outcomes. One of the primary challenges in machining is the suppression of chatter, which degrades surface finish, tool longevity, and overall process reliability. This study proposes a machine-learning-based approach to optimizing the feed rate in machining operations by integrating a genetic algorithm (GA)-calibrated cutting force model with vibration analysis. A theoretical cutting force dataset is generated under varying machining conditions, followed by frequency-domain analysis using the Fast Fourier Transform (FFT) to identify feed rates that minimize chatter. These optimal feed rates are then used to train an Extreme Gradient Boosting (XGBoost) regression model, with Bayesian optimization employed for hyperparameter tuning. The trained model achieves an R² score of 0.7887, indicating strong predictive accuracy. To verify the model’s effectiveness, robotic milling experiments were conducted on a UR10e manipulator. Surface quality evaluations showed that the model-predicted feed rates consistently produced better surface finish and less chatter than conventional settings. These findings validate the model’s ability to enhance machining performance and demonstrate the practical value of integrating simulated dynamics and machine learning for data-driven parameter optimization in robotic systems.
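To make the summarized pipeline concrete, the minimal Python sketch below illustrates one plausible realization, not the authors' implementation: candidate feed rates are scored by the dominant FFT amplitude of a simulated cutting-force signal as a simple chatter proxy, and an XGBoost regressor, tuned with Bayesian search via scikit-optimize's BayesSearchCV, is fitted to map machining conditions to the chatter-minimizing feed rate. All function names, the simulate_force helper, and the hyperparameter ranges are assumptions for illustration only.

```python
# Illustrative sketch only (assumed names and ranges); the paper's GA-calibrated
# cutting force model and chatter criterion would replace simulate_force and
# chatter_score in a faithful reproduction.
import numpy as np
from xgboost import XGBRegressor
from skopt import BayesSearchCV
from skopt.space import Integer, Real

def chatter_score(force_signal):
    """Peak FFT amplitude of the mean-removed force signal: a simple chatter proxy."""
    spectrum = np.abs(np.fft.rfft(force_signal - np.mean(force_signal)))
    return spectrum.max() / len(force_signal)

def best_feed_rate(candidate_feeds, simulate_force):
    """Return the candidate feed rate whose simulated cutting force shows the least chatter."""
    scores = [chatter_score(simulate_force(f)) for f in candidate_feeds]
    return candidate_feeds[int(np.argmin(scores))]

def train_feed_rate_model(X, y):
    """Fit an XGBoost regressor mapping machining conditions X to optimal feed rates y,
    with Bayesian hyperparameter search (5-fold CV, R^2 scoring)."""
    search = BayesSearchCV(
        XGBRegressor(objective="reg:squarederror"),
        {
            "n_estimators": Integer(100, 1000),
            "max_depth": Integer(2, 10),
            "learning_rate": Real(1e-3, 0.3, prior="log-uniform"),
            "subsample": Real(0.5, 1.0),
        },
        n_iter=40,
        cv=5,
        scoring="r2",
    )
    search.fit(X, y)
    return search.best_estimator_
```

Under this sketch, best_feed_rate would be evaluated over the simulated conditions to build the training labels, and train_feed_rate_model would then learn the mapping from machining parameters to those labels.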