From Sampling to Structure: Constructive Optimisation for Hyperparameter Generalisation
Abstract
Traditional hyperparameter optimisation (HPO) methods operate within fixed, predefined search spaces, assuming that optimal configurations reside in statically bounded regions and can be uncovered through sampling or surrogate-guided selection. These methods thereby encode inductive biases about the structure of the underlying optimisation landscape, and those assumptions often fail under real-world conditions characterised by noise, heterogeneity, and evolving model architectures.

This paper introduces ECO, an evolutionary cellular optimisation algorithm designed to operate under a fundamentally different premise: that search spaces may themselves be constructed, not merely interrogated. ECO models each hyperparameter as an evolving cellular lattice of alleles, dynamically expanded, refined, or pruned in response to observed fitness feedback. Through the interplay of evolutionary selection and cellular-automata dynamics, ECO treats the hyperparameter space as a generative substrate, one that adapts its resolution and topology over time.

We evaluate ECO across three diverse tasks: retinal optical coherence tomography (ROCT) classification, chest X-ray pathology detection, and sentiment analysis with BERT, each chosen to represent a distinct structural and noise regime. In all cases, ECO exhibits late-stage performance improvements, structural parameter sensitivity, and adaptive prioritisation of influential hyperparameters.

These findings support ECO's central claim: a generative, feedback-driven construction of the search space can yield more robust, generalisable, and context-sensitive optimisation. We position ECO not merely as an HPO algorithm, but as a representative of a new class of constructive optimisers whose adaptive structure offers a principled alternative to fixed-bias methods in modern ML.
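The constructive-lattice idea described above, a per-hyperparameter set of candidate alleles that gains resolution near high-fitness values and loses low-fitness ones, can be illustrated with a minimal sketch. All names and update rules here (`AlleleLattice`, the midpoint expansion, the fractional pruning) are hypothetical illustrations of the stated premise, not the paper's actual implementation.

```python
class AlleleLattice:
    """Illustrative sketch of one ECO-style hyperparameter lattice:
    a set of candidate values (alleles) that expands near high-fitness
    alleles and prunes low-fitness ones based on observed feedback.
    Names and rules are assumptions, not the paper's algorithm."""

    def __init__(self, low, high, n_init=5):
        # Seed the lattice with a coarse, evenly spaced grid of alleles.
        self.alleles = [low + (high - low) * i / (n_init - 1) for i in range(n_init)]
        self.fitness = {}  # allele -> best observed fitness

    def observe(self, allele, fitness):
        # Record fitness feedback for a sampled allele.
        self.fitness[allele] = max(fitness, self.fitness.get(allele, float("-inf")))

    def step(self, expand_frac=0.25, prune_frac=0.25):
        # Refine resolution around the fittest alleles; prune the weakest.
        scored = sorted(self.fitness, key=self.fitness.get, reverse=True)
        if len(scored) < 2:
            return
        k = max(1, int(len(scored) * expand_frac))
        for a in scored[:k]:
            # Expand: insert a new allele at the midpoint toward the
            # nearest existing neighbour, increasing local resolution.
            neighbour = min((b for b in self.alleles if b != a),
                            key=lambda b: abs(b - a))
            self.alleles.append((a + neighbour) / 2)
        for a in scored[-max(1, int(len(scored) * prune_frac)):]:
            if a in self.alleles and len(self.alleles) > 2:
                self.alleles.remove(a)


# Toy usage: a learning-rate lattice with fitness peaked near 0.01.
lattice = AlleleLattice(1e-4, 1e-1)
for a in list(lattice.alleles):
    lattice.observe(a, -abs(a - 0.01))
lattice.step()
```

After one step the lattice has gained an allele near the fitness peak and dropped the worst-performing one, which is the sense in which the search space is constructed rather than merely sampled.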