SADE (Self-Adaptive Differential Evolution): overview, algorithms and implementation examples.


Overview of SADE (Self-Adaptive Differential Evolution)

Self-Adaptive Differential Evolution (SADE) is a method based on Differential Evolution (DE), a type of evolutionary algorithm, that automates parameter adjustment and improves the adaptability of the algorithm.

In standard Differential Evolution (DE), several parameters (e.g. the mutation rate \(F\) and the crossover rate \(CR\)) must be set in advance, and because suitable settings are problem-dependent they are difficult to tune. SADE adjusts these parameters in a self-adaptive manner during the evolutionary process, thereby improving both search efficiency and solution quality.

DE evolves through the following processes

  1. Initialisation: random generation of initial populations.
  2. Mutation: a new vector is generated using the difference vectors of the other individuals.
    – Example: \(\mathbf{v}_{i} = \mathbf{x}_{r1} + F \cdot (\mathbf{x}_{r2} - \mathbf{x}_{r3})\)
    – \(F\): mutation rate (scaling factor)
  3. Crossover: the mutant vector is combined with the parent individual to generate a new trial solution (a small numerical example follows this list).
    – \(u_{i,j} = \begin{cases} x_{i,j} & \text{if } rand_j > CR \\ v_{i,j} & \text{otherwise} \end{cases}\)
    – \(CR\): crossover rate
  4. Selection: the trial solution is compared with the parent individual and the better one is passed on to the next generation.
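
As a concrete illustration of the mutation and crossover steps, the following small sketch builds a single trial solution from three randomly chosen individuals; the vector values and parameter settings are arbitrary example values.

import numpy as np

# Three individuals picked from the population (arbitrary example values)
x_r1 = np.array([0.5, -1.0, 2.0])
x_r2 = np.array([1.5, 0.0, 1.0])
x_r3 = np.array([0.0, 1.0, 0.5])
parent = np.array([0.2, 0.3, -0.4])
F, CR = 0.8, 0.5  # example mutation rate and crossover rate

# Mutation: v = x_r1 + F * (x_r2 - x_r3)
v = x_r1 + F * (x_r2 - x_r3)

# Crossover: take the donor component where rand_j <= CR, otherwise keep the parent
mask = np.random.rand(len(parent)) <= CR
u = np.where(mask, v, parent)
print("donor:", v)
print("trial:", u)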

The main features of SADE are as follows

  • Self-adaptive parameter adjustment: DE performance depends strongly on the values of \(F\) and \(CR\). In SADE these are not fixed but are adjusted during evolution: each individual carries its own parameters, which are updated adaptively, and a mechanism selects promising parameter values based on past performance.
  • Use of multiple mutation strategies: whereas standard DE uses a single mutation strategy (e.g. DE/rand/1/bin), SADE tries multiple mutation strategies and adjusts their selection probabilities based on the success rate of each (see the sketch after this list). Commonly used strategies include DE/rand/1, DE/best/1 and DE/current-to-best/1.
  • Recording of success history: the result of applying each strategy and parameter setting is recorded, and this feedback is used to steer the optimisation process.
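
One common way to realise the strategy pool and success-history mechanism is sketched below. The update rule (a smoothed success rate) is a simplification of the learning-period scheme in the original SaDE paper, and the function and variable names are illustrative assumptions rather than a fixed API.

import numpy as np

# Candidate mutation strategies (r holds indices of distinct randomly chosen individuals)
def de_rand_1(pop, best, i, F, r):
    return pop[r[0]] + F * (pop[r[1]] - pop[r[2]])

def de_best_1(pop, best, i, F, r):
    return best + F * (pop[r[0]] - pop[r[1]])

def de_current_to_best_1(pop, best, i, F, r):
    return pop[i] + F * (best - pop[i]) + F * (pop[r[0]] - pop[r[1]])

strategies = [de_rand_1, de_best_1, de_current_to_best_1]
probs = np.ones(len(strategies)) / len(strategies)  # start with uniform selection probabilities
success = np.zeros(len(strategies))  # trials that entered the next generation, per strategy
failure = np.zeros(len(strategies))  # trials that were discarded, per strategy

def pick_strategy():
    # Choose a strategy index according to the current selection probabilities
    return np.random.choice(len(strategies), p=probs)

def update_probs(eps=0.01):
    # Re-estimate selection probabilities from the recorded success rates
    global probs
    rates = (success + eps) / (success + failure + 2 * eps)
    probs = rates / rates.sum()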

The basic flow of SADE is shown below.

  1. Initialisation: randomly generate the population and assign initial parameters \(F\) and \(CR\) to each individual.
  2. Mutation and crossover: multiple mutation strategies are tried to generate trial solutions.
  3. Selection: compare each trial solution with its parent individual to determine the next-generation individuals.
  4. Adaptive updating of parameters: record the success history of each strategy and adjust \(F\) and \(CR\) (see the sketch after this list).
    • \(F\): updated, for example, by random sampling from a Gaussian distribution.
    • \(CR\): resampled per individual, with each component's crossover decision following a Bernoulli trial with probability \(CR\).
  5. Stopping decision: loop until stopping conditions (e.g. maximum number of generations or convergence criteria) are met.
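
A minimal sketch of the parameter update in step 4 is shown below. The exact distributions and constants differ between SADE variants, so the values used here (a Gaussian centred on 0.5 for \(F\), and a Gaussian around the mean of recently successful values for \(CR\)) are assumptions for illustration only.

import numpy as np

def resample_parameters(cr_success_memory, pop_size):
    # F: drawn from a Gaussian around 0.5 with a fairly wide spread, clipped to a valid range
    F = np.clip(np.random.normal(0.5, 0.3, pop_size), 0.05, 1.0)
    # CR: drawn from a Gaussian around the mean of recently successful CR values
    cr_mean = np.mean(cr_success_memory) if len(cr_success_memory) > 0 else 0.5
    CR = np.clip(np.random.normal(cr_mean, 0.1, pop_size), 0.0, 1.0)
    return F, CR

# Example: resample parameters for a population of 20 using a small success memory
F, CR = resample_parameters([0.4, 0.6, 0.55], 20)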

Advantages of SADE include

  • Improved search efficiency: no manual parameter tuning is required, which improves search performance.
  • Maintaining diversity: using multiple mutation strategies together makes it less likely to become trapped in locally optimal solutions.
  • Reduced problem dependence: self-adaptive parameter adjustment allows flexible adaptation to different optimisation problems.

SADE is an approach of interest because it significantly improves the applicability of DE to complex real-world problems while maintaining its flexibility.

Implementation example

The following is an example of a simple implementation of Self-Adaptive Differential Evolution (SADE) using Python. In this example, the Rastrigin function is minimised as a benchmark function.

SADE implementation example

import numpy as np

# Rastrigin function (evaluation function)
def rastrigin(x):
    return 10 * len(x) + sum(xi**2 - 10 * np.cos(2 * np.pi * xi) for xi in x)

# Initialisation of parameters
def initialize_population(pop_size, dim, bounds):
    return np.random.uniform(bounds[0], bounds[1], (pop_size, dim))

# Initialise self-adaptive parameters.
def initialize_adaptive_params(pop_size):
    F = np.random.uniform(0.1, 0.9, pop_size)  # mutation rate
    CR = np.random.uniform(0.1, 0.9, pop_size)  # crossover rate
    return F, CR

# mutation operation
def mutate(pop, F, idx):
    r1, r2, r3 = np.random.choice([i for i in range(len(pop)) if i != idx], 3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])

# crossover operation
def crossover(parent, donor, CR):
    dim = len(parent)
    crossover_mask = np.random.rand(dim) < CR
    if not np.any(crossover_mask):  # Always change at least one gene.
        crossover_mask[np.random.randint(0, dim)] = True
    return np.where(crossover_mask, donor, parent)

# SADE algorithm
def sade(func, dim, bounds, pop_size=20, max_gen=100):
    pop = initialize_population(pop_size, dim, bounds)
    F, CR = initialize_adaptive_params(pop_size)
    fitness = np.array([func(ind) for ind in pop])
    best_solution = pop[np.argmin(fitness)]
    best_fitness = np.min(fitness)

    # main loop
    for gen in range(max_gen):
        for i in range(pop_size):
            # Mutation
            donor = mutate(pop, F[i], i)
            # boundary check
            donor = np.clip(donor, bounds[0], bounds[1])
            # Crossover
            trial = crossover(pop[i], donor, CR[i])
            # Evaluate the trial solution
            trial_fitness = func(trial)

            # Selection
            if trial_fitness < fitness[i]:
                pop[i] = trial
                fitness[i] = trial_fitness
                # parameter adaptation
                F[i] = np.clip(F[i] * 1.1, 0.1, 0.9)  # Adaptively adjust F
                CR[i] = np.clip(CR[i] + 0.1 * np.random.uniform(-0.1, 0.1), 0.1, 0.9)  # Adaptively adjust CR
            else:
                F[i] = np.clip(F[i] * 0.9, 0.1, 0.9)  # Adaptively adjust F

        # Update on best solutions
        gen_best_idx = np.argmin(fitness)
        if fitness[gen_best_idx] < best_fitness:
            best_fitness = fitness[gen_best_idx]
            best_solution = pop[gen_best_idx]

        print(f"Generation {gen + 1}: Best Fitness = {best_fitness:.6f}")

    return best_solution, best_fitness

# Execution
if __name__ == "__main__":
    dim = 10
    bounds = (-5.12, 5.12)  # Rastrigin function definition area
    solution, fitness = sade(rastrigin, dim, bounds)
    print("Best Solution:", solution)
    print("Best Fitness:", fitness)

Code description.

  1. Evaluation function: defines the Rastrigin function as the function to be optimised. This is often used in benchmarking of optimisation algorithms.
  2. Initialisation: randomly generate populations (initialize_population) and adaptive parameters \(F\), \(CR\)(initialize_adaptive_params).
  3. Mutation and crossover: in mutation, a donor vector is generated from the differences between individuals; in crossover, the donor is combined with the parent individual with a certain probability to form a trial solution.
  4. Adaptive parameter adjustment: adjusts each individual’s \(F\) and \(CR\) based on its degree of success. Parameters are increased for success and decreased for failure.
  5. Stopping condition: repeat until the maximum number of generations is reached (a convergence threshold on the objective value could also be added).

Example results.

  • The best fitness value (minimum value of the objective function) is output for each generation.
  • Finally, the optimal solution and its evaluation value are displayed.

Application: this implementation handles general numerical optimisation and can be adapted to other problems (e.g. hyperparameter tuning of machine learning models, allocation problems) simply by changing the evaluation function, as shown below.
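
For example, replacing rastrigin with another objective such as the sphere function below requires no other changes; the snippet reuses the sade() function defined above, and the bounds are chosen arbitrarily.

import numpy as np

# A different benchmark objective: the sphere function (minimum 0 at the origin)
def sphere(x):
    return float(np.sum(np.asarray(x) ** 2))

# Reuse the sade() function defined above with the new objective
solution, fitness = sade(sphere, dim=10, bounds=(-5.0, 5.0), pop_size=20, max_gen=100)
print("Best Solution:", solution)
print("Best Fitness:", fitness)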

Application examples

The following sections describe specific applications.

1. hyperparameter tuning of machine learning models

  • Case study: optimisation of hyperparameters (e.g. learning rate, dropout rate, number of layers) of a deep learning model (e.g. CNN, RNN).
  • Background: adaptive search using SADE improves computational efficiency when there are many hyperparameter choices and conventional grid search or random search would be computationally expensive (see the sketch after this list).
  • Outcome: improved model accuracy and reduced training time.
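
The following is only a wiring sketch under assumed names: train_and_validate is a placeholder for an actual training run (replaced here by a dummy surrogate loss so the snippet runs), and the mapping from the search vector to hyperparameters is an illustrative choice. It reuses the sade() implementation above.

import numpy as np

# Placeholder for a real training run; a real objective would train the model with
# these hyperparameters and return a validation loss (dummy surrogate used here)
def train_and_validate(learning_rate, dropout_rate, num_layers):
    return (np.log10(learning_rate) + 2.5) ** 2 + (dropout_rate - 0.2) ** 2 + 0.01 * num_layers

def hyperparameter_objective(x):
    # Decode the continuous search vector into hyperparameters
    learning_rate = 10 ** (-4 + 3 * x[0])  # [0, 1] -> [1e-4, 1e-1] on a log scale
    dropout_rate = 0.5 * x[1]              # [0, 1] -> [0, 0.5]
    num_layers = int(round(1 + 4 * x[2]))  # [0, 1] -> {1, ..., 5}
    return train_and_validate(learning_rate, dropout_rate, num_layers)

# Search the unit cube with the SADE implementation defined earlier
best, loss = sade(hyperparameter_objective, dim=3, bounds=(0.0, 1.0))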

2. optimising network design

  • Case study: topology optimisation in the design of telecommunication networks.
  • Background: the problem of minimising communication delays and network costs requires exploration of multiple constraints (e.g. bandwidth, delay, redundancy).
  • Outcome: reduced costs and improved communication efficiency.

3. optimising energy management systems

  • Case study: optimisation of energy supply and consumption using renewable energy sources (e.g. solar and wind power).
  • Background: the use of renewable energies with fluctuating generation requires flexible scheduling to balance supply and demand.
  • Outcome: reduced energy consumption costs and increased energy efficiency.

4. structural design and engineering

  • Case study: optimisation in the design of building and aircraft components.
  • Background: problem of minimising material costs and weight while maximising structural strength.
  • Outcome: design of high-strength, lightweight structures while reducing material use.

5. investment portfolio optimisation

  • Case study: risk-return optimisation of an investment portfolio containing multiple asset classes.
  • Background: find the optimal investment ratios, taking into account correlations between assets and risk tolerance (a small sketch follows this list).
  • Outcome: reduced investment risk and increased returns.
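
As a small illustration of the formulation, the sketch below minimises a weighted combination of portfolio variance and negative expected return; the expected returns, covariance matrix and risk-aversion coefficient are made-up example values, and the weights are simply normalised inside the objective. It reuses the sade() implementation above.

import numpy as np

# Illustrative expected returns and covariance matrix for four hypothetical assets
mu = np.array([0.08, 0.12, 0.05, 0.10])
cov = np.array([[0.10, 0.02, 0.01, 0.03],
                [0.02, 0.15, 0.02, 0.04],
                [0.01, 0.02, 0.05, 0.01],
                [0.03, 0.04, 0.01, 0.12]])
risk_aversion = 3.0  # assumed trade-off between risk and return

def portfolio_objective(x):
    # Normalise the raw search vector into non-negative weights summing to one
    w = np.abs(x)
    w = w / (np.sum(w) + 1e-12)
    variance = w @ cov @ w
    expected_return = mu @ w
    return risk_aversion * variance - expected_return  # minimise risk, maximise return

# Optimise the (pre-normalisation) weights with the SADE implementation defined earlier
weights_raw, score = sade(portfolio_objective, dim=4, bounds=(0.0, 1.0))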

6. application in the field of medicine

  • Case study: molecular design optimisation in drug development.
  • Background: optimise the structure of a new drug molecule taking into account its molecular properties (e.g. activity, stability, toxicity).
  • Outcome: design of effective and safe drug candidates.

7. production scheduling

  • Case study: optimisation of a production schedule in a factory.
  • Background: multiple machines and work stages are present and production times need to be reduced while production costs are kept down.
  • Outcome: more efficient production line and shorter delivery times.

8. robot control

  • Case study: multi-legged robot movement pattern optimisation.
  • Background: optimisation of parameters to improve the stability and energy efficiency of the robot in different terrains.
  • Outcome: stable gait control and extended battery life.

9. solving complex logistics problems

  • Case study: optimisation of transport routes (e.g. truck deliveries, drone deliveries).
  • Background: search for routes that reduce fuel consumption while reducing delivery times.
  • Outcome: reduced delivery costs and increased customer satisfaction.

10. optimising game AI parameters

  • Case study: evolutionary optimisation of game character strategies and behaviour rules.
  • Background: the game AI needs to learn the best behaviour according to the player’s level and style.
  • Outcome: more challenging and engaging gaming experience.

SADE’s ability to adaptively adjust the balance between exploration and exploitation makes it a particularly effective method for non-linear and multimodal optimisation problems.

Reference books

The following are reference books on Self-Adaptive Differential Evolution (SADE) and related optimisation methods.

Optimisation algorithms in general.
1. ‘Evolutionary Computation: A Unified Approach’.
Author: Kenneth A. De Jong
Year of publication: 2006
– This book provides a comprehensive overview of evolutionary computation, from basic concepts to applications.
– Useful for background understanding of evolutionary strategies and differential evolution, including SADE.

2. ‘Differential Evolution: A Practical Approach to Global Optimisation’.
Author(s): Kenneth Price, Rainer Storn, Jouni Lampinen
Year of publication: 2005
– This classic book covers the differential evolution (DE) algorithm from the basics to applications.
– It provides an in-depth study of how DE works as a basis for SADE.

3. ‘Introduction to Evolutionary Computing’.
Author(s): A. E. Eiben, J. E. Smith
Publication year: 2015 (2nd edition)
– A straightforward introduction to evolutionary computing in general.
– It provides useful hints for the design of adaptive algorithms.

Books dedicated to SADE and adaptive algorithms.
4. ‘Metaheuristics for Hard Optimisation: Methods and Case Studies’.
Author(s): Johann Dréo, Patrick Siarry, Arnaud Petrowski, Eric Taillard
Year of publication: 2006
– Describes metaheuristic methods, including differential evolution, with various examples.
– Examples of the application of adaptive methods such as SADE are also discussed.

5. ‘Advances in Differential Evolution’.
Editor: Uday K. Chakraborty
Year of publication: 2008
– A collection of papers covering the latest advances in differential evolution.
– Derivative methods such as SADE are also discussed.

6. ‘Handbook of Metaheuristics’.
Editors: Michel Gendreau, Jean-Yves Potvin
Year of publication: 2019 (3rd edition)
– A comprehensive manual on metaheuristic algorithms in general.
– It includes information on the principles and applications of SADE.

Books focusing on application examples.
7. ‘Differential Evolution Algorithm: Foundations and Perspectives’.

8. ‘Computational Intelligence in Optimisation: Applications and Implementations’.
Editors: Yoel Tenne, Chi-Keong Goh
Year of publication: 2010
– Presents various applications of optimisation methods.
– Relevant to applications of adaptive algorithms, including SADE.

9. ‘Optimization Algorithms: AI techniques for design, planning, and control problems’.

10. ‘Hands-On Meta Learning with Python’.

References.
– Qin, A. K., & Suganthan, P. N. (2005). ‘Self-adaptive differential evolution algorithm for numerical optimization,’ Proceedings of the IEEE Congress on Evolutionary Computation.
– Hansen, N., & Auger, A. (2014). ‘Evolution strategies,’ Springer Handbook of Computational Intelligence.
– Deb, K. (2001). ‘Multi-Objective Optimisation using Evolutionary Algorithms.’
