Data-Driven Prior Construction in Hilbert Spaces for Bayesian Optimization
Abstract
We propose a variant of Bayesian optimization in which prior probability distributions are constructed using uncertainty quantification (UQ) techniques. Here, the UQ step relies on a Hilbert basis expansion to infer probability distributions from limited experimental data. These distributions act as prior knowledge of the search space and are incorporated into the acquisition function to guide the selection of enrichment points more effectively. Several variants of the method are examined, depending on the distribution type (normal, log-normal, etc.), and benchmarked against traditional Bayesian optimization on test functions. The results show competitive performance, with improvements that depend on the problem structure and faster convergence in specific cases. As a practical application, we address a structural shape optimization problem. The initial geometry is an L-shaped plate, and the goal is to minimize its volume subject to a horizontal displacement constraint expressed as a penalty. Our approach first identifies a promising region while efficiently training the surrogate model. A subsequent gradient-based optimization step then refines the design using the trained surrogate, achieving a volume reduction of more than 30% while satisfying the displacement constraint, without requiring any additional evaluations of the objective function.
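To make the mechanism concrete, the sketch below shows one plausible reading of the two ingredients the abstract names, in Python: a prior density is fitted to a handful of samples (here the log-normal variant), and the expected-improvement acquisition is reweighted by that density. The multiplicative coupling and all names (gp_predict, prior_weighted_acquisition, the sample values) are illustrative assumptions, not the paper's implementation; the abstract states only that the inferred distributions enter the acquisition function.

```python
# Minimal sketch, assuming the data-driven prior enters the acquisition
# multiplicatively (the paper does not fix the coupling form).
import numpy as np
from scipy.stats import norm, lognorm

# (1) Infer a prior from limited experimental samples (hypothetical data).
samples = np.array([0.8, 1.1, 0.9, 1.3, 1.0])
shape, loc, scale = lognorm.fit(samples, floc=0.0)   # log-normal variant
prior_pdf = lambda x: lognorm.pdf(x, shape, loc=loc, scale=scale)

def expected_improvement(mu, sigma, f_best):
    """Standard expected improvement for minimization at a candidate point."""
    sigma = np.maximum(sigma, 1e-12)                 # guard against zero variance
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def prior_weighted_acquisition(x, gp_predict, f_best):
    """EI at x reweighted by the inferred prior density pi(x).

    gp_predict is assumed to return the surrogate's posterior (mean, std) at x.
    """
    mu, sigma = gp_predict(x)
    return expected_improvement(mu, sigma, f_best) * prior_pdf(x)
```

Under this reading, candidate enrichment points that are plausible under the data-driven prior are favored, while the standard exploration-exploitation trade-off of expected improvement is preserved; swapping lognorm for norm gives the normal variant mentioned above.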