Genetic Algorithms
Assignment 2
Chris Miles

This assignment uses a GA to solve the first four De Jong functions. Code is included below, along with descriptions of the problems and graphs of the GA's performance.


Code

Everything is in a nice tgz file that includes all the code, the parameter files, the graphs shown here, and the logs of the runs that produced this data.

The parameters file contains all the important settings. These graphs were obtained with the following settings, except for problem 4, where both the population size and the number of generations were set to 100 to offset the increased size of its chromosome (480 bits).

Parameter      Value  Description
Popsize        50     Total number of individuals in the population.
Numgens        50     Number of generations evaluated.
BitsPerFloat   16     Number of bits allocated per float.
Mutationprob   .01    The probability that any given bit is flipped between generations.
Crossoverprob  .6     The probability of crossover occurring between neighboring individuals between generations.
Elitists       1      This many of the best individuals always survive into successive generations.
problem        1-4    Determines the genome mapping and fitness function used.
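
For reference, a parameters file with these settings might look something like the sketch below; the exact file syntax is a guess on my part, and only the names and values come from the table above.

    Popsize        50
    Numgens        50
    BitsPerFloat   16
    Mutationprob   .01
    Crossoverprob  .6
    Elitists       1
    problem        1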

Number of runs

All graphs are averaged over 100 independent runs of the GA.

Float Mapping

Floats are obtained by mapping a binary-encoded integer linearly onto the range of possible values. The precision is determined by the number of bits allocated per float: more bits allow finer granularity in solutions but also increase computation time. Above 16 bits it is possible to exceed the precision allowed for in a long, in which case erratic results are produced. All of these runs used that maximum precision of 16 bits.
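
In outline, the decoding works like the following Python sketch; the function name and most-significant-bit-first ordering are my own assumptions, and Python's arbitrary-precision integers sidestep the long overflow mentioned above.

    def decode_float(bits, lo, hi):
        # Pack the bit list into an unsigned integer, most significant bit first.
        value = 0
        for b in bits:
            value = (value << 1) | b
        # Map the integer linearly onto the range [lo, hi].
        max_value = (1 << len(bits)) - 1
        return lo + (hi - lo) * value / max_value

    # e.g. decode_float([1] * 16, -5.12, 5.12) == 5.12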

Fitness Mapping

Since all four functions are minimization problems, I have negated each of them to get a fitness function. This is nice in that it doesn't distort the selection pressure of the original problem, and it saves me from including separate graphs for the objective functions, since those would just be upside-down versions of the fitness graphs.
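
In code this mapping is nothing more than a sign flip; a minimal sketch (the wrapper name is mine, not from the included code):

    def make_fitness(objective):
        # Turn a minimization objective into a maximization fitness by negation.
        return lambda xs: -objective(xs)

    # e.g. fitness = make_fitness(lambda xs: sum(x * x for x in xs))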

Problem 1

The minimum objective function value is 0; since fitness needs to be maximized, a simple negation solves that. The graph shows the GA converging along a nice smooth gradient toward the optimum. The derivative graph is also smooth, showing the GA making strong progress at the beginning that tapers off as it approaches the optimal answer.
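
For reference, De Jong F1 is the sphere function: three variables in [-5.12, 5.12], summed as squares. A minimal sketch of the objective and its negated fitness:

    def f1(xs):
        # De Jong F1 (sphere): minimum value 0 at the origin.
        return sum(x * x for x in xs)

    def fitness1(xs):
        return -f1(xs)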

Problem 2

For this problem the objective function still has a minimum at 0. It is a more complicated expression than before, but the initial population almost always contains a member near 0. This seems odd, but there are a very large number of near-0 solutions, and repeated runs of the GA confirmed it is not just a fluke. The derivative of the max is flat because the run almost always starts out with a near-optimal individual. The average shows the smooth exponential curve one would expect as the superior individual begins to dominate the population.
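
The standard form of De Jong F2 is Rosenbrock's function over two variables in [-2.048, 2.048]; the first term vanishes along the entire parabola x2 = x1^2, which helps explain why random initial populations so often contain a near-0 member. A sketch:

    def f2(xs):
        # De Jong F2 (Rosenbrock): minimum value 0 at (1, 1).
        x1, x2 = xs
        return 100.0 * (x1 * x1 - x2) ** 2 + (1.0 - x1) ** 2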

Problem 3

For problem 3 the fitness function is a step function. The minimum is no longer at 0, since negative numbers are now possible in the sum. There are five x's summed, all floats between -5.12 and 5.12; five x's all casting to -5 give the optimal minimum of -25, which the GA still easily progresses toward. Progress toward the minimum is significantly more jagged than before, because the problem is stepped and smooth progression is no longer possible. The derivative graphs here are smooth only because they are averaged; the jumpiness in the GA's ability to improve shows that it wasn't moving along a smooth gradient, but rather occasionally making significant progress as it got past a step. The problem's lack of differentiability did not keep the GA from making progress, though, as the ever-positive fitness improvements show.
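
The classic De Jong F3 uses floor(), which would give a minimum of -30; the cast described above truncates toward zero instead, giving -25, so this sketch matches the cast:

    def f3(xs):
        # De Jong F3 (step): five variables in [-5.12, 5.12]; truncation
        # toward zero makes 5 * (-5) = -25 the best reachable value.
        return sum(int(x) for x in xs)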

Problem 4

Technically the minimum on this problem is negative infinity, since the Gaussian noise term has a nonzero probability of returning arbitrarily negative values. Dropping the Gaussian term (which averages to 0), the minimum becomes 0, which the GA steadily approaches. Due to the large number of xi's (30), the population size and number of generations were increased to 100 to reach a suitable answer. The average maximum was within 5 (out of 552.8) of the minimum (0). Extending the runs to 500 generations produced best answers with a margin of error of less than .5. A look at the derivative graph shows a relatively smooth descent, which is reasonable since all the variables have a strong gradient toward the optimum.
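
The standard form of De Jong F4 is the index-weighted quartic with unit Gaussian noise over thirty variables in [-1.28, 1.28]; whether the included code matches this form exactly is an assumption on my part. A sketch:

    import random

    def f4(xs):
        # De Jong F4 (noisy quartic): without the noise term the minimum is 0
        # at the origin; the Gaussian makes every evaluation stochastic.
        total = sum((i + 1) * x ** 4 for i, x in enumerate(xs))
        return total + random.gauss(0.0, 1.0)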