Using exploration and exploitation techniques to improve ranking models through (1+1)-Evolutionary Algorithms
Abstract
Exploration and exploitation are fundamental concepts in Nature-Inspired Algorithms (NIAs) for optimization. Exploration traverses a substantial portion of the solution space, whereas exploitation refines the current solution toward a local or global optimum. Mutation, for example, is an exploration technique, while crossover and enhanced initialization strategies serve exploitation. In this research, four probability distributions are employed for mutation within NIAs, and ranking models derived from Dependent Click and Linear Regression (LR) are leveraged to enhance exploitation. Empirical findings favor Gaussian Random Number (GRN) mutation combined with LR and Dependent Click initialization when evolving on the training dataset. Conversely, Levy Random Number mutation combined with LR performs best on the unseen test dataset, with GRN a strong contender. Furthermore, integrating Simulated Annealing with the (1+1)-Evolutionary Strategy outperforms the alternatives, including the novel (1+1)-Evolutionary Gradient Strategy and other (1+1)-Evolutionary Strategy variants, in predictive ranking and evolutionary performance on both the training and test datasets, regardless of whether linear ranking initialization is used. This paper presents experimental results on the MQ2008 dataset and provides accompanying code packages.
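To make the interplay of exploration and exploitation concrete, the following is a minimal sketch of a generic (1+1)-Evolutionary Strategy with Gaussian mutation, not the paper's actual implementation: the function name `one_plus_one_es`, the toy sphere objective, and all parameter values are illustrative assumptions.

```python
import random

def one_plus_one_es(fitness, parent, sigma=0.1, iterations=200, seed=0):
    """Minimal (1+1)-ES sketch: one parent produces one Gaussian-mutated
    offspring per generation; the offspring replaces the parent only if it
    is at least as fit (lower fitness is better here)."""
    rng = random.Random(seed)
    best = list(parent)
    best_fit = fitness(best)
    for _ in range(iterations):
        # Exploration: perturb each weight with a Gaussian random number (GRN).
        child = [w + rng.gauss(0.0, sigma) for w in best]
        child_fit = fitness(child)
        # Exploitation: greedy (1+1) selection keeps the better solution.
        if child_fit <= best_fit:
            best, best_fit = child, child_fit
    return best, best_fit

# Toy objective (illustrative only): squared distance to a target weight vector,
# standing in for the ranking-model loss evolved in the paper.
target = [0.3, -0.7, 1.2]
sphere = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))
weights, err = one_plus_one_es(sphere, [0.0, 0.0, 0.0])
```

Swapping `rng.gauss` for a Levy-distributed step, or wrapping the acceptance rule in a Simulated Annealing temperature schedule, yields the other variants the abstract compares.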