Lecture 8: Evolutionary Robotics
Teacher: Stéphane Doncieux, Nicolas Bredèche
Random generation ⟶ Evaluation ⟶ Selection ⟶ Variation (Until termination)
⇒ Evolutionary Computation
EvolutionaryAlgorithm():
    t = 0
    P(t) = RandomInitialization()             # create N random individuals
    for each x in P(t):
        Evaluate(x)
    while termination condition not met:
        Q = ∅                                 # offspring population
        for i in range(N):
            x = Recombine_select_repro(P(t))  # select parents and recombine
            x = Mutate(x)
            Evaluate(x)
            Q = Q ∪ {x}
        P(t+1) = Select_replace(P(t), Q)      # survivor selection
        t += 1
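A minimal runnable sketch of this loop, assuming real-valued genotypes, a toy sphere fitness (to be minimized), uniform crossover and Gaussian mutation; all names and parameter values below are illustrative choices, not part of the lecture:

import random

N, DIM, GENERATIONS = 20, 5, 100   # population size, genotype length, budget
MUT_STD = 0.1                      # mutation step size

def fitness(x):                    # toy sphere function, lower is better
    return sum(xi * xi for xi in x)

def mutate(x):                     # Gaussian mutation on every gene
    return [xi + random.gauss(0.0, MUT_STD) for xi in x]

def recombine(a, b):               # uniform crossover of two parents
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

P = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(N)]
for t in range(GENERATIONS):
    Q = []                                   # offspring
    for _ in range(N):
        a, b = random.sample(P, 2)           # selection for reproduction
        Q.append(mutate(recombine(a, b)))    # variation
    P = sorted(P + Q, key=fitness)[:N]       # replacement: keep the N best

print("best fitness:", fitness(P[0]))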
Algorithms
Microbial GA
A very simple algorithm with very few prerequisites:
- Start from a population of genotypes
- Random selection: pick two individuals at random
- Ranking: compare their fitness to decide a winner and a loser
- Modify the losing genotype only:
  - Recombination: inject a part of the winner's genotype into the loser
  - Mutation applied to the loser
- Repeat (a minimal sketch follows)
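A minimal sketch of the Microbial GA on bit-string genotypes, assuming a one-max fitness; the infection and mutation rates are illustrative:

import random

POP, LEN, STEPS = 30, 20, 5000     # population size, genotype length, tournaments
P_INFECT, P_MUT = 0.5, 0.02        # gene-copy and mutation probabilities

def fitness(g):                    # one-max: count the 1s
    return sum(g)

pop = [[random.randint(0, 1) for _ in range(LEN)] for _ in range(POP)]
for _ in range(STEPS):
    i, j = random.sample(range(POP), 2)        # pick two individuals at random
    if fitness(pop[i]) < fitness(pop[j]):      # rank them: i = winner, j = loser
        i, j = j, i
    for k in range(LEN):
        if random.random() < P_INFECT:         # copy winner genes into the loser
            pop[j][k] = pop[i][k]
        if random.random() < P_MUT:            # mutate the loser only
            pop[j][k] = 1 - pop[j][k]

print("best fitness:", max(fitness(g) for g in pop))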
Efficient parameter optimization: CMA-ES
State-of-the-art parameter optimization algorithm
- Initial sampling ⟶ selection of the best samples according to fitness
- Fit a parametric model to the distribution of the selected samples (mean, covariance matrix)
- New mutations are drawn from this adapted distribution, biased toward the direction of fitness improvement (a gradient-like direction estimated from the samples, without computing the gradient of the fitness itself); a usage sketch follows
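A minimal usage sketch, assuming the pycma package (pip install cma) and a toy sphere objective; the starting point and step size are illustrative:

import cma                          # assumes the pycma package: pip install cma

def sphere(x):                      # toy objective, lower is better
    return sum(xi * xi for xi in x)

es = cma.CMAEvolutionStrategy(5 * [1.0], 0.5)    # initial mean and step size
while not es.stop():
    X = es.ask()                                 # sample from the current Gaussian
    es.tell(X, [sphere(x) for x in X])           # adapt mean and covariance
print("best fitness found:", es.result.fbest)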
Multi-objective optimization: NSGA-II
Multi-objective optimization: there is no total order between solutions
⟹ we focus on the Pareto front: the non-dominated solutions, those that cannot be improved on one objective without degrading another (NSGA-II is the state-of-the-art algorithm; a dominance sketch follows)
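A minimal sketch of the Pareto-dominance relation and front extraction at the core of NSGA-II (the full algorithm adds non-dominated sorting into ranks and crowding-distance selection, not shown here); the points and objectives are illustrative, both minimized:

def dominates(a, b):
    # a dominates b: no worse on every objective, strictly better on at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # keep the non-dominated points: no other point dominates them
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# objectives to be minimized; (3.0, 4.0) is dominated by (2.0, 3.0)
points = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
print(pareto_front(points))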