# Lecture 8: Evolutionary Robotics

Teachers: Stéphane Doncieux, Nicolas Bredèche

Random generation ⟶ Evaluation ⟶ Selection ⟶ Variation (repeat until a termination criterion is met)

⇒ Evolutionary Computation


```
EvolutionaryAlgorithm():
    t = 0
    P(t) = RandomInitialization()
    for i in range(N):
        Evaluate(P_i(t))

    while not termination_condition:
        Q = ∅
        for i in range(N):
            x = Recombine_select_repro(P(t))
            x = Mutate(x)
            Evaluate(x)
            Q = Q ∪ {x}
        P(t+1) = Select_replace(P(t), Q)
        t += 1
```
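The loop above can be sketched as runnable Python. This is a minimal (μ+λ)-style sketch, assuming a toy OneMax fitness (count of 1-bits in a bitstring); all function names and parameter values here are illustrative, not from the lecture:

```python
import random

random.seed(0)  # reproducibility (assumption)

def evaluate(x):
    # Toy fitness (assumption): OneMax, the number of 1-bits.
    return sum(x)

def tournament(pop):
    # Selection for reproduction: 2-way tournament.
    a, b = random.sample(pop, 2)
    return a if evaluate(a) >= evaluate(b) else b

def recombine(a, b):
    # One-point crossover between two parents.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(x, rate=0.05):
    # Flip each bit independently with probability `rate`.
    return [1 - bit if random.random() < rate else bit for bit in x]

def evolve(n_pop=20, n_bits=30, n_gens=60):
    # Random initialization of P(0).
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_pop)]
    for _ in range(n_gens):
        # Variation: build an offspring set Q from selected parents.
        q = [mutate(recombine(tournament(pop), tournament(pop)))
             for _ in range(n_pop)]
        # Selection for replacement: keep the N best of P ∪ Q.
        pop = sorted(pop + q, key=evaluate, reverse=True)[:n_pop]
    return max(pop, key=evaluate)

best = evolve()
```

With elitist replacement the best fitness never decreases, so this quickly approaches the all-ones string.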


# Algorithms

## Microbial GA

A very simple algorithm with very few prerequisites

1. Start from a population of genotypes
2. Random selection of two genotypes
3. Ranking: compare their fitness to determine a winner and a loser
4. Modify only the losing genotype:
   • Recombination: inject a part of the winner into the loser
   • Mutation of the loser
5. Repeat

## Efficient parameter optimization: CMA-ES

State-of-the-art parameter optimization algorithm (Covariance Matrix Adaptation Evolution Strategy)

1. Initial sampling ⟶ selection according to fitness
2. Fit a parametric model (mean, covariance matrix) of the selected solutions' distribution
3. Bias the next mutations toward the directions of fitness improvement observed so far (no explicit gradient of the fitness function is computed)
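The sample–select–refit loop can be sketched as follows. This is a heavily simplified sketch, not the full CMA-ES (no evolution paths, rank-one update, or step-size control); the objective, learning rate, and population sizes are illustrative assumptions:

```python
import numpy as np

def sphere(x):
    # Toy objective (assumption): minimize f(x) = sum(x_i^2).
    return float(np.sum(x ** 2))

def es_sketch(dim=5, n_gens=80, lam=40, mu=10, seed=1):
    rng = np.random.default_rng(seed)
    mean = np.full(dim, 3.0)      # start away from the optimum
    cov = np.eye(dim)
    c = 0.3                       # covariance learning rate (assumption)
    for _ in range(n_gens):
        # 1. Sample lambda candidates from the Gaussian model N(mean, cov).
        pop = rng.multivariate_normal(mean, cov, size=lam)
        # Selection according to fitness: keep the mu best.
        sel = pop[np.argsort([sphere(x) for x in pop])[:mu]]
        # 2. Refit the parametric model on the selected samples.
        #    Measuring steps from the OLD mean stretches the covariance
        #    along the direction the good samples moved in, which is what
        #    biases future mutations toward improving directions.
        steps = sel - mean
        cov = (1 - c) * cov + c * (steps.T @ steps) / mu
        mean = sel.mean(axis=0)
    return mean

m = es_sketch()
```

The key idea to take away is step 2: the search distribution itself is the object being adapted, so no derivative of the fitness is ever needed.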

## Multi-objective optimization: NSGA-II

Multi-objective optimization: no total order

⟹ we focus on the Pareto front: the non-dominated solutions, i.e. those that cannot be improved on one objective without degrading another (NSGA-II is the state-of-the-art algorithm for finding it)
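Pareto dominance and the front can be written compactly. A minimal sketch for minimization; the example points (e.g. interpreting the objectives as energy and time for a robot gait) are illustrative:

```python
def dominates(a, b):
    # a dominates b (minimization): a is no worse on every objective
    # and strictly better on at least one.
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    # Keep the non-dominated points: the Pareto front.
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Two objectives to minimize, e.g. (energy, time) for a robot gait.
pts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
print(pareto_front(pts))  # → [(1, 5), (2, 3), (4, 1)]
```

NSGA-II builds on exactly this relation: it repeatedly peels off successive non-dominated fronts to rank the whole population.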
