Analysing the Robustness of Evolutionary Algorithms to Noise: Refined Runtime Bounds and an Example Where Noise is Beneficial
We analyse the performance of the well-known evolutionary algorithms (1+1)EA and (1+λ)EA in the prior noise model, where in each fitness evaluation the search point is altered before evaluation with probability p. We present refined results for the expected optimisation time of the (1+1)EA and the (1+λ)EA on the function LeadingOnes, where bits have to be optimised in sequence. Previous work showed that the (1+1)EA on LeadingOnes runs in polynomial expected time if p = O((log n)/n^2) and needs superpolynomial expected time if p = ω((log n)/n), leaving a huge gap for which no results were known. We close this gap by showing that the expected optimisation time is Θ(n^2) · exp(Θ(min{pn^2, n})) for all p < 1/2, allowing us for the first time to locate the threshold between polynomial and superpolynomial expected times at p = Θ((log n)/n^2). Hence the (1+1)EA on LeadingOnes is much more sensitive to noise than previously thought. We also show that offspring populations of size λ ≥ 3.42 log n can effectively deal with much higher noise than known before. Finally, we present an example of a rugged landscape where prior noise can help to escape from local optima by blurring the landscape and allowing a hill climber to see the underlying gradient. We prove that in this particular setting noise can have a highly beneficial effect on performance.
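To make the setting concrete, the following is a minimal sketch of the (1+1)EA on LeadingOnes under prior noise: in each fitness evaluation, with probability p a uniformly random bit of a copy of the search point is flipped before evaluation. The function and parameter names, the termination budget, and the choice to re-evaluate both parent and offspring each generation are illustrative assumptions, not taken from the paper.

```python
import random

def leading_ones(x):
    """True fitness: number of consecutive 1-bits from the left."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def noisy_eval(x, p, rng):
    """Prior noise: with probability p, flip one uniformly random bit
    of a copy of x before evaluating; x itself is unchanged."""
    y = list(x)
    if rng.random() < p:
        i = rng.randrange(len(y))
        y[i] ^= 1
    return leading_ones(y)

def one_plus_one_ea(n, p, max_evals=200_000, seed=0):
    """(1+1)EA with standard bit mutation (rate 1/n) under prior noise.
    Parent and offspring are both evaluated with fresh noise each
    generation (an assumption for this sketch). Returns the number of
    noisy evaluations used and the final search point."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    evals = 0
    while evals < max_evals:
        # standard bit mutation: flip each bit independently with prob 1/n
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        fx = noisy_eval(x, p, rng)
        fy = noisy_eval(y, p, rng)
        evals += 2
        if fy >= fx:  # accept offspring on (noisy) ties or wins
            x = y
        if leading_ones(x) == n:  # stop once the true optimum is reached
            break
    return evals, x
```

With p = 0 this is the classic (1+1)EA, which optimises LeadingOnes in Θ(n²) expected evaluations; raising p lets one observe the slowdown that the refined bound Θ(n²) · exp(Θ(min{pn², n})) quantifies.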