Stochastic algorithms are primarily global optimization algorithms. A stochastic process is one whose behavior is non-deterministic: the system’s next state is determined both by the process’ predictable actions and by a random element. The main strategies (with some exceptions) are based on sampling the search space, calculating a ‘cost’ for each sample, and using that cost to improve the result, whether through a random process, a change of search direction, or a change of step size. There is no ‘inspiring’ system, natural or biological, on which these algorithms are based. The first part details the implementation in Python of four algorithms: Random Search (RS), Adaptive Random Search (ARS), Stochastic Hill Climbing (SHC) and Scatter Search (SC). Except for Stochastic Hill Climbing, the algorithms use the same objective function for calculating the cost.
The objective function I chose for RS, ARS and SC is a variation on the one provided by Jason. He used a ‘sum of squares’ function; I used a variation that yields a non-trivial solution value while keeping the structure of the function the same.
The optimal solution (minimum) of my basin function is the value -10, reached at the input value 2.
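A minimal sketch of such a basin function, assuming a shifted sum of squares (the exact variation I used is in the downloadable source; this reconstruction just matches the stated optimum of -10 at input 2):

```python
def objective(vector):
    """Shifted sum of squares: minimum value is -10.0 when every input is 2.0."""
    return sum((x - 2.0) ** 2 for x in vector) - 10.0
```

Any vector whose components all equal 2 evaluates to the minimum, and the quadratic ‘basin’ shape of the original sum-of-squares function is preserved.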
For SHC I used the same ‘One Max’ problem from the original Clever Algorithms book.
Download the source code, unzip it, and import the attached folder into your Eclipse workspace. It includes a test suite for exercising the algorithms.
Random Search (RS)
Random search keeps no memory of previous search candidates, so it may revisit previously considered candidate solutions. The advantage is that it is a minimal solution. Its worst-case performance is worse than a straightforward linear scan, but it has its uses in domains where the values are continuous. Most commonly it is used as an embedded heuristic in other search algorithms for refining a local optimum, and it is one of the building blocks for more advanced search techniques.
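The algorithm can be sketched in a few lines — draw uniform random candidates, score each with the objective function, and keep the best. The function names and the basin objective here are my reconstruction, not the exact listing from the source download:

```python
import random

def objective(vector):
    # Assumed basin function: minimum -10.0 when every input is 2.0
    return sum((x - 2.0) ** 2 for x in vector) - 10.0

def random_vector(search_space):
    # Draw one candidate uniformly from the given [min, max] bounds
    return [lo + (hi - lo) * random.random() for lo, hi in search_space]

def random_search(search_space, max_iter):
    best = None
    for _ in range(max_iter):
        candidate = random_vector(search_space)
        cost = objective(candidate)
        # No memory beyond the single best candidate seen so far
        if best is None or cost < best[0]:
            best = (cost, candidate)
    return best
```

Note that nothing stops two iterations from drawing (nearly) the same point — exactly the memoryless behavior described above.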
Adaptive Random Search (ARS)
Adaptive Random Search seeks to avoid the problems of localized random search by sampling with larger or smaller step sizes. It keeps trying larger step sizes as long as the result improves, and falls back to smaller step sizes if no improvement is seen after a preset maximum count. The initial starting point is selected in the same way as in RS.
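A sketch of that step-size adaptation, assuming the same hypothetical basin objective as before (the growth/shrink factors and the no-improvement limit are illustrative defaults, not the values from my source):

```python
import random

def objective(vector):
    # Assumed basin function: minimum -10.0 when every input is 2.0
    return sum((x - 2.0) ** 2 for x in vector) - 10.0

def take_step(space, current, step_size):
    # Perturb each coordinate by up to step_size, clamped to the bounds
    return [max(space[i][0], min(space[i][1],
                current[i] + random.uniform(-step_size, step_size)))
            for i in range(len(current))]

def adaptive_random_search(space, max_iter, init_factor=0.05,
                           s_factor=1.3, l_factor=3.0, max_no_impr=30):
    step = (space[0][1] - space[0][0]) * init_factor
    current = [random.uniform(lo, hi) for lo, hi in space]  # start as in RS
    cost = objective(current)
    no_impr = 0
    for _ in range(max_iter):
        small = take_step(space, current, step)
        big = take_step(space, current, step * l_factor)  # probe a larger step
        c_small, c_big = objective(small), objective(big)
        if c_small < cost or c_big < cost:
            if c_big < c_small:
                # The larger step won: adopt it as the new step size
                step, current, cost = step * l_factor, big, c_big
            else:
                current, cost = small, c_small
            no_impr = 0
        else:
            no_impr += 1
            if no_impr >= max_no_impr:
                # Too long without improvement: shrink the step size
                step, no_impr = step / s_factor, 0
    return cost, current
```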
Scatter Search (SC)
Scatter search is a global optimization algorithm with some evolutionary-computation features. The premise of SC is that the desired result is present in some combination of a diverse set of high-quality candidate solutions. It combines and recombines values from a reference set of elite solutions to create ‘offspring’, then tests those offspring by applying the objective function to them. An embedded RS heuristic is used for refining each local candidate solution.
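A simplified sketch of those stages — diversify, keep an elite reference set, recombine pairs, and refine each offspring with an embedded random-search heuristic. The recombination (pairwise midpoints) and all parameter values are assumptions for illustration, using the same hypothetical basin objective:

```python
import random

def objective(vector):
    # Assumed basin function: minimum -10.0 when every input is 2.0
    return sum((x - 2.0) ** 2 for x in vector) - 10.0

def local_refine(space, candidate, step=0.05, tries=20):
    # Embedded RS heuristic: accept small random perturbations that improve
    best, best_cost = candidate, objective(candidate)
    for _ in range(tries):
        neighbor = [max(lo, min(hi, x + random.uniform(-step, step)))
                    for x, (lo, hi) in zip(best, space)]
        cost = objective(neighbor)
        if cost < best_cost:
            best, best_cost = neighbor, cost
    return best

def scatter_search(space, ref_size=5, diverse_size=20, generations=30):
    # Diversification: sample widely, keep the best as the reference set
    diverse = [[random.uniform(lo, hi) for lo, hi in space]
               for _ in range(diverse_size)]
    ref = sorted(diverse, key=objective)[:ref_size]
    for _ in range(generations):
        offspring = []
        for i in range(len(ref)):
            for j in range(i + 1, len(ref)):
                # Recombine a pair of elites (midpoint), then refine locally
                child = [(a + b) / 2.0 for a, b in zip(ref[i], ref[j])]
                offspring.append(local_refine(space, child))
        ref = sorted(ref + offspring, key=objective)[:ref_size]
    best = min(ref, key=objective)
    return objective(best), best
```

Real scatter search also maintains diversity inside the reference set (not just quality); this sketch omits that to stay short.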
Stochastic Hill Climbing (SHC)
Stochastic Hill Climbing iterates by randomly selecting a neighbor of the current candidate solution and evaluating it. It is an improvement on RS in that we do not sample the whole space randomly, but only near neighbors of a randomly selected candidate solution. I noticed an error in the code listing provided by Jason Brownlee, on the last line of page 40: ( mutant[pos] = (mutant[pos]==’1′) ? ‘0’ : ‘1’ ). It should instead check for ‘0’ and replace it with ‘1’; I believe it is a typo in the book. I also tried to evaluate the search efficacy for the OneMax problem. Since the idea is to replace all zeros in the bitstring with ones, a search-efficacy metric can be defined as (FinalOneMaxCount – InitialOneMaxCount) / finalIteration. If we set the maximum number of iterations to the length of the initial bitstring, this gives a nice metric for measuring search efficacy.
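A sketch of SHC on OneMax with that efficacy metric computed at the end (this is my reconstruction, not the book's Ruby listing; here the mutation flips whichever value the chosen bit holds):

```python
import random

def onemax(bitstring):
    # OneMax cost: the number of '1' characters in the bitstring
    return bitstring.count('1')

def mutate(bitstring):
    # Neighbor: flip one randomly chosen bit
    pos = random.randrange(len(bitstring))
    bits = list(bitstring)
    bits[pos] = '1' if bits[pos] == '0' else '0'
    return ''.join(bits)

def stochastic_hill_climb(length):
    candidate = ''.join(random.choice('01') for _ in range(length))
    initial = onemax(candidate)
    max_iter = length  # cap iterations at the bitstring length
    for i in range(1, max_iter + 1):
        neighbor = mutate(candidate)
        # Accept the neighbor if it is no worse than the current candidate
        if onemax(neighbor) >= onemax(candidate):
            candidate = neighbor
        if onemax(candidate) == length:
            break
    # Efficacy: (FinalOneMaxCount - InitialOneMaxCount) / finalIteration
    efficacy = (onemax(candidate) - initial) / float(i)
    return candidate, efficacy
```

Because each accepted move raises the OneMax count by at most one, the metric is bounded between 0 and 1, with higher values meaning fewer wasted iterations.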