Maximum Entropy Sequential Design with ML-II, INLA, and MCMC Updating: A Comparative Study
Abstract
This paper investigates maximum entropy sequential design for deterministic computer experiments using Gaussian process surrogates under a fixed simulation budget. We systematically compare three hyperparameter updating strategies: (i) Type-II maximum likelihood (ML-II/empirical Bayes) via marginal likelihood maximization, (ii) INLA-based approximate Bayesian updating, and (iii) maximum a posteriori (MAP) with full Bayesian propagation using MCMC. Performance is evaluated using pointwise RMSE, posterior predictive RMSE, integrated posterior variance, an entropy proxy, and computational cost. On the Forrester and Branin benchmarks, ML-II and MAP+FullBayes achieve comparable pointwise accuracy, but ML-II contracts uncertainty more aggressively, while MAP+FullBayes retains larger uncertainty due to hyperparameter propagation. INLA maintains higher integrated variance and incurs substantially greater computational cost under the present configuration. Our findings demonstrate that entropy-based sampling reliably identifies informative regions, while the updating mechanism governs the trade-off between computational efficiency and uncertainty quantification.
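To make the sampling criterion concrete, the following minimal Python sketch (not the authors' code) illustrates one of the three strategies compared above: maximum entropy sequential design with ML-II updating, where the Gaussian process hyperparameters are refit by marginal likelihood maximization at each step and the next input is the candidate with the largest posterior predictive variance. The use of scikit-learn's GaussianProcessRegressor, the RBF kernel, the 501-point candidate grid, and the budget of 10 evaluations are illustrative assumptions; only the Forrester benchmark and the ML-II/entropy criterion come from the abstract.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Forrester benchmark: f(x) = (6x - 2)^2 * sin(12x - 4) on [0, 1].
def forrester(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(3, 1))            # small initial design
y = forrester(X).ravel()

# Candidate grid over which the entropy criterion is maximized (assumed).
candidates = np.linspace(0, 1, 501).reshape(-1, 1)

for _ in range(10):                           # fixed simulation budget (assumed)
    # ML-II / empirical Bayes: kernel hyperparameters are refit by
    # maximizing the marginal likelihood (scikit-learn's default behavior).
    gp = GaussianProcessRegressor(
        kernel=ConstantKernel(1.0) * RBF(length_scale=0.2),
        alpha=1e-8,                           # jitter for the noiseless setting
        n_restarts_optimizer=5,
    )
    gp.fit(X, y)

    # Maximum entropy design: for a Gaussian predictive distribution the
    # entropy is monotone in the variance, so the maximum-entropy point is
    # the candidate with the largest posterior standard deviation.
    _, std = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(std)].reshape(1, -1)

    X = np.vstack([X, x_next])
    y = np.append(y, forrester(x_next).ravel())

print("Final design points:", np.sort(X.ravel()))
```

Swapping the refitting step for INLA-based updating or MAP with MCMC propagation changes only how the hyperparameters (and hence the predictive variance) are obtained; the entropy-based selection rule itself is unchanged.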