HomOpt: A Flexible Homotopy-Based Hyperparameter Optimization Method
Abstract
Over the past few decades, machine learning has made remarkable strides, owing largely to algorithmic advancements and the abundance of high-quality, large-scale datasets. However, an equally crucial aspect of achieving optimal model performance is the fine-tuning of hyperparameters. Despite its significance, hyperparameter optimization (HPO) remains challenging for several reasons. Many existing HPO techniques rely on simplistic search methods or assume smooth and continuous loss functions, assumptions that may not always hold. Traditional methods like grid search and Bayesian optimization often struggle to adapt swiftly and to navigate the loss landscape efficiently. Moreover, the search space for HPO is frequently high-dimensional and non-convex, making it difficult to find a global minimum efficiently. Additionally, optimal hyperparameters can vary significantly depending on the dataset or task at hand, further complicating the optimization process. To address these challenges, this paper presents HomOpt, an advanced HPO methodology that integrates a surrogate model framework with homotopy optimization techniques. Unlike rigid methodologies, HomOpt offers flexibility by incorporating diverse surrogate models tailored to specific optimization tasks. Our initial investigation focuses on leveraging Generalized Additive Model (GAM) surrogates within the HomOpt framework to enhance the effectiveness of existing optimization methodologies. We highlight HomOpt's ability to expedite convergence toward optimal solutions across varied domain spaces, encompassing continuous, discrete, and categorical domains. We conduct a comparative analysis of HomOpt applied to multiple optimization techniques (e.g., Random Search, TPE, Bayes, and SMAC), demonstrating improved objective performance on numerous standardized machine learning benchmarks and challenging open-set recognition tasks.
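To make the core idea concrete, the following is a minimal, hypothetical sketch (not the paper's implementation) of homotopy continuation for optimization: a blended objective H(x, t) = (1 - t)·g(x) + t·f(x) is minimized for a schedule of t values from 0 to 1, where g is a smooth surrogate (standing in for a fitted GAM or CatBoost model) and f is the true, bumpier objective. The functions f, g, and the inner `local_search` routine are illustrative assumptions.

```python
import math
import random

def f(x):
    # "hard" objective: quadratic bowl plus oscillation (local minima)
    return (x - 2.0) ** 2 + 0.5 * math.sin(8.0 * x)

def g(x):
    # smooth surrogate objective (stands in for a fitted surrogate model)
    return (x - 2.0) ** 2

def homotopy(x, t):
    # blended objective H(x, t) = (1 - t) * g(x) + t * f(x); t sweeps 0 -> 1
    return (1.0 - t) * g(x) + t * f(x)

def local_search(obj, x0, step=0.05, iters=200):
    # crude random local descent; placeholder for any inner optimizer
    x, best = x0, obj(x0)
    for _ in range(iters):
        cand = x + random.uniform(-step, step)
        val = obj(cand)
        if val < best:
            x, best = cand, val
    return x

random.seed(0)
x = local_search(g, x0=0.0)          # start from the surrogate's minimizer
for t in (0.25, 0.5, 0.75, 1.0):     # continuation schedule toward f
    x = local_search(lambda z, t=t: homotopy(z, t), x0=x)

print(round(x, 2))                   # solution tracked onto the true objective
```

Warm-starting each stage from the previous minimizer is what lets the continuation track a good basin of the smooth surrogate into the harder true objective, rather than searching f from scratch.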
We also integrate CatBoost as a surrogate within the HomOpt framework, showcasing its adaptability and effectiveness on more complex datasets. This integration enables an evaluation against state-of-the-art methods such as BOHB, particularly on challenging computer vision datasets like CIFAR-10 and ImageNet. Comparative analyses reveal HomOpt's competitive performance with fewer iterations and underscore potential optimizations in execution time. All experiment and method code can be found here: https://github.com/sabraha2/HOMOPT