Lecture 8: Evolutionary Robotics

Teacher: Stéphane Doncieux, Nicolas Bredèche

Random generation ⟶ Evaluation ⟶ Selection ⟶ Variation (Until termination)

⇒ Evolutionary Computation

    # Generic evolutionary algorithm loop (population P, offspring Q)
    t = 0
    P = random_population(N)        # random generation
    evaluate(P)
    while not termination_met(P, t):
        Q = []                      # offspring population
        for i in range(N):
            x = recombine(select_reproduction(P))
            x = mutate(x)
            Q.append(x)
        evaluate(Q)
        P = select_replacement(P, Q)  # replacement selection
        t += 1


Microbial GA

Very simple algorithm, very few prerequisites

  1. Start from a population of genotypes
  2. Pick two genotypes at random
  3. Rank them by fitness: the fitter one is the winner, the other the loser
  4. Modify the losing genotype:

    • Recombination: inject part of the winner's genotype into the loser
    • Mutate the loser
  5. Repeat
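The steps above can be sketched as follows for binary genotypes on a maximization problem; the function and parameter names (`cross_rate`, `mut_rate`, etc.) are illustrative choices, not from the lecture:

```python
import random

def microbial_ga(fitness, genome_len, pop_size=20, steps=1000,
                 cross_rate=0.5, mut_rate=0.05, seed=0):
    # Minimal Microbial GA sketch: tournament of two, winner kept intact,
    # loser overwritten by recombination and mutation.
    rng = random.Random(seed)
    # 1. start from a random population of genotypes
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(steps):
        # 2. pick two genotypes at random
        i, j = rng.sample(range(pop_size), 2)
        # 3. rank them: the fitter one is the winner
        win, lose = (i, j) if fitness(pop[i]) >= fitness(pop[j]) else (j, i)
        # 4. modify only the loser: inject winner genes, then mutate
        for k in range(genome_len):
            if rng.random() < cross_rate:
                pop[lose][k] = pop[win][k]
            if rng.random() < mut_rate:
                pop[lose][k] = 1 - pop[lose][k]
    return max(pop, key=fitness)
```

For example, on the OneMax problem (maximize the number of ones), `microbial_ga(sum, genome_len=20)` quickly converges to an all-ones genotype.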

Efficient parameter optimization: CMA-ES

State-of-the-art parameter optimization algorithm

  1. Sample an initial population ⟶ select the best candidates according to fitness
  2. Fit a parametric model (mean, covariance matrix) to the selected samples
  3. Sample the next generation from the updated distribution: the update moves the search in an estimated gradient direction of the fitness function, without computing any derivatives
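A heavily simplified sketch of this sample/select/refit loop (not the full CMA-ES: step-size adaptation is replaced by a plain decay, and evolution paths are omitted; all names and constants are illustrative):

```python
import numpy as np

def sphere(x):
    # Toy fitness to minimize: f(x) = sum(x_i^2)
    return float(np.sum(x ** 2))

def simple_cma_sketch(f, x0, sigma=0.5, lam=20, mu=5, iters=150, seed=0):
    rng = np.random.default_rng(seed)
    mean = np.asarray(x0, dtype=float)
    C = np.eye(len(mean))  # covariance matrix of the search distribution
    for _ in range(iters):
        # 1. sample lam candidates from N(mean, sigma^2 * C),
        #    then select the mu best according to fitness
        X = rng.multivariate_normal(mean, sigma ** 2 * C, size=lam)
        best = X[np.argsort([f(x) for x in X])[:mu]]
        # 2. refit the parametric model (mean, covariance) on the selected samples
        steps = (best - mean) / sigma
        mean = best.mean(axis=0)
        C = 0.8 * C + 0.2 * (steps.T @ steps) / mu
        # simple decay instead of CMA-ES's step-size adaptation
        sigma *= 0.97
    return mean
```

The covariance update stretches the sampling distribution along directions where selected samples cluster, which is how the search follows an estimated gradient of the fitness landscape without any derivative information.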

Multi-objective optimization: NSGA-II

Multi-objective optimization: no total order

⟹ we focus on the Pareto front: the non-dominated solutions, i.e. those that cannot be improved on one objective without degrading another (NSGA-II: state-of-the-art)
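A minimal sketch of Pareto dominance and of extracting the non-dominated front (minimization on all objectives; this is the dominance test NSGA-II's non-dominated sorting builds on, not the full algorithm):

```python
def dominates(a, b):
    # a dominates b (minimization): a is no worse on every objective
    # and strictly better on at least one.
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    # Keep only the non-dominated points.
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

For example, with objective vectors `[(1, 3), (2, 2), (3, 1), (2, 3), (3, 3)]`, only the first three are non-dominated: `(2, 3)` is dominated by `(1, 3)` and `(3, 3)` by `(2, 2)`.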
