Evolutionary Algorithms

Problem-solving methods based on the natural evolutionary paradigm are being used increasingly. Collectively called Evolutionary Algorithms (EAs), these robust heuristic approaches have been successfully applied in several different fields, including optimization, search, classification, and machine learning.

Evolutionary Algorithms (EAs) are generally based on the iterative simulated evolution of a constant-size population of individuals. Each individual represents a feasible solution in some problem space through a suitable coding. The most common coding is the binary string; however, several other codings have been devised and used. Each individual has an attached fitness measure, which is related to the problem's objective function and indicates the quality of the solution (i.e. the individual).

There are many variations on the central theme among EAs. A typical algorithm, the generational Genetic Algorithm (GA) with fitness-proportionate reproduction, is described below. An initial population of a few tens to a few hundred individuals is produced at random or heuristically. During each iteration step, called a generation, the individuals in the current population are evaluated and assigned a fitness value. To form a new population, individuals are selected with a probability proportional to their relative fitness, which ensures that fitter individuals have a better chance of surviving and reproducing. However, selection and reproduction alone cannot sample any new point in the search space. To generate new individuals from the selected ones, EAs traditionally use two so-called genetic operators: crossover and mutation.

Crossover takes two individuals, called parents, and produces two new individuals, called the offspring, by swapping parts of the parents. In its simplest form the operator works by exchanging substrings after a randomly selected crossover point. It has been shown that, under certain conditions, crossover biases the search towards promising regions of the search space by combining useful building blocks. Mutation serves to prevent premature loss of population diversity by randomly sampling new points in the search space. Mutation rates are kept small, however; otherwise the process degenerates into a random search. For bit strings, mutation flips one or more random bits in a string with a probability equal to the mutation rate.

EAs are stochastic iterative algorithms with no guarantee of convergence. Termination may be triggered by reaching a maximum number of generations or by finding a solution that is acceptable by some criterion. Complete descriptions can be found in the references.
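The generational scheme just described can be sketched in code. The following is a minimal, illustrative Python implementation, not any particular published algorithm; the parameter values and the OneMax-style fitness function (counting 1-bits) are assumptions chosen only to make the example self-contained.

```python
import random

def generational_ga(fitness, n_bits=20, pop_size=30, generations=50,
                    crossover_rate=0.7, mutation_rate=0.01, seed=0):
    """Sketch of a generational GA with fitness-proportionate selection."""
    rng = random.Random(seed)
    # Initial population: random bit strings.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select(pop, fits):
        # Fitness-proportionate (roulette-wheel) selection.
        r = rng.uniform(0, sum(fits))
        acc = 0.0
        for ind, f in zip(pop, fits):
            acc += f
            if acc >= r:
                return ind
        return pop[-1]

    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(pop, fits), select(pop, fits)
            # One-point crossover: exchange substrings after a random point.
            if rng.random() < crossover_rate:
                cut = rng.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            # Mutation: flip each bit with probability mutation_rate.
            for child in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < mutation_rate:
                        child[i] = 1 - child[i]
                new_pop.append(child)
        pop = new_pop[:pop_size]
    fits = [fitness(ind) for ind in pop]
    return max(zip(fits, pop))

# OneMax: fitness is the number of 1-bits (plus 1 to keep the wheel nonzero).
best_fit, best_ind = generational_ga(lambda ind: sum(ind) + 1)
```

With these settings the population rapidly accumulates 1-bits, illustrating how selection and crossover concentrate the search while mutation keeps introducing new points.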

Evolutionary Algorithms were born in a sequential setting for technical reasons, but they are naturally parallel (3). A simple and effective embarrassingly parallel approach consists of running the same EA on each processor, possibly with different parameters or differently generated initial populations. If the evaluation function is the time-consuming part of the computation, which is often the case in real-world problems, a useful parallel decomposition is to have each individual's fitness calculated simultaneously by a different task.
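The fitness-level decomposition can be sketched with a worker pool, as below. This is one possible realization using Python's standard multiprocessing module; the fitness function here is a trivial stand-in for an expensive real-world evaluation.

```python
from multiprocessing import Pool

def fitness(individual):
    # Stand-in for an expensive evaluation (a hypothetical objective).
    return sum(individual)

def evaluate_population(pop, workers=4):
    # Each individual's fitness is computed by a separate task;
    # Pool.map farms the evaluations out to the worker processes.
    with Pool(workers) as pool:
        return pool.map(fitness, pop)

if __name__ == "__main__":
    pop = [[i % 2, 1, 0, 1] for i in range(8)]
    print(evaluate_population(pop))
```

Because the individuals are evaluated independently, this decomposition scales almost linearly with the number of processors as long as evaluation dominates the cost of communication.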

Several other parallel implementations have been proposed. They all partition the population into subpopulations that evolve independently, with a periodic communication phase between subpopulations. This communication can take the form of an exchange of some good individuals, which helps regenerate converging subpopulations by injecting new genetic material to replace some of each subpopulation's worst individuals. This model is sometimes called the island model, and it is also suitable for workstation clusters, where communication speed is limited. With respect to time, the evolution of the subpopulations may be synchronous or asynchronous, although most current implementations are quasi-synchronous and follow the SPMD parallel programming model. Taking the subpopulation approach to its limit, each subpopulation shrinks to a single individual together with a small local neighborhood around it. Selection and mating then take place locally, in the immediate neighborhood of each individual, simultaneously across the population. Some non-local effects can be added by allowing individuals to diffuse or migrate throughout the grid.
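One common form of the periodic exchange in the island model is ring migration, sketched below. The details are illustrative assumptions: fitness is taken to be the number of 1-bits, each island sends its best individuals to the next island in the ring, and these replace the destination's worst individuals.

```python
def migrate(islands, k=1):
    """Ring migration between islands (an illustrative scheme).

    Each island sends copies of its k best individuals to the next
    island in the ring, where they replace the k worst individuals.
    """
    n = len(islands)
    # Sort each island by fitness, worst first (fitness = number of 1-bits).
    for isl in islands:
        isl.sort(key=sum)
    migrants = [isl[-k:] for isl in islands]  # each island's best
    for i in range(n):
        dest = islands[(i + 1) % n]
        dest[:k] = [ind[:] for ind in migrants[i]]  # overwrite the worst
    return islands

# Two tiny islands of bit-string individuals.
islands = migrate([[[0, 0], [1, 1]], [[0, 1], [1, 0]]])
```

In a full implementation this exchange would be performed every few generations, with the send/receive realized by message passing between the SPMD processes.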

A newer approach to simulated evolution is the Genetic Programming (GP) method. GP is broadly analogous to GAs; the major difference lies in the representation of individuals. Whereas GAs and other evolutionary techniques employ bit strings, integer strings, or vectors of real values, all of fixed length, in GP each individual is a program for solving a given task, which is executed in order to evaluate it. The program is customarily represented as a parse tree, and different individuals can vary greatly in size and shape. The genetic operators are redefined accordingly, with crossover acting on subtrees. Genetic Programming is discussed at length in reference (4).
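A minimal sketch of the parse-tree representation and subtree crossover follows. The toy function set (add, mul), the tuple encoding of trees, and all helper names are illustrative assumptions, not part of any standard GP system.

```python
import random

# Programs are nested tuples (operator, left, right); leaves are
# variable names or numeric constants.
def evaluate(tree, env):
    if isinstance(tree, tuple):
        op, left, right = tree
        a, b = evaluate(left, env), evaluate(right, env)
        return {"add": a + b, "mul": a * b}[op]
    return env.get(tree, tree)  # variable lookup, or the literal itself

def subtrees(tree, path=()):
    # Enumerate every subtree together with its path from the root.
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    if not path:
        return new
    parts = list(tree)
    parts[path[0]] = replace(parts[path[0]], path[1:], new)
    return tuple(parts)

def crossover(t1, t2, rng):
    # GP crossover: swap a randomly chosen subtree of one parent
    # with a randomly chosen subtree of the other.
    p1, s1 = rng.choice(list(subtrees(t1)))
    p2, s2 = rng.choice(list(subtrees(t2)))
    return replace(t1, p1, s2), replace(t2, p2, s1)

rng = random.Random(1)
parent1 = ("add", "x", ("mul", "x", 2))   # x + x*2
parent2 = ("mul", ("add", "x", 1), "x")   # (x+1)*x
child1, child2 = crossover(parent1, parent2, rng)
```

Note that, unlike fixed-length string crossover, the offspring trees can differ in size from either parent, which is why GP individuals exhibit such high variability.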

Wed Nov 9 17:48:36 MET 1994