Overview of Evolutionary Annealing-Search (EAS), algorithms and implementation examples


Overview of EAS (Evolutionary Annealing-Search)

Evolutionary Annealing-Search (EAS) is a metaheuristic optimisation algorithm that integrates Evolutionary Algorithms (EAs) with Simulated Annealing (SA). It aims to solve complex optimisation problems efficiently by combining an evolutionary, population-based search mechanism with the temperature-adjustment mechanism of the annealing method.

EAS integrates the following elements for optimisation

  • Evolutionary algorithms (EAs): evolutionary algorithms, including genetic algorithms (GAs) such as those described in ‘Overview of genetic algorithms and examples of applications and implementations’, search the solution space on a population basis. This makes it easier to find a globally optimal solution while preserving solution diversity.
  • Simulated Annealing (SA): the annealing method explores the solution space while adjusting a temperature parameter, starting with an essentially random search and gradually lowering the temperature to encourage convergence. While the temperature is high the search is broad; as the temperature is lowered, the search narrows and converges.

Because evolutionary algorithms search on a population basis, they can explore widely while maintaining solution diversity, which makes them less likely to become trapped in local solutions. EAS combines this population-based search with the temperature parameter of the annealing method: an extensive search is carried out in the early stages, and convergence is accelerated by narrowing the search in the later stages, following the standard SA rules summarised below.
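The two mechanisms EAS borrows from SA are the standard ones: a geometric cooling schedule \[T_{t+1} = \alpha T_t, \qquad 0 < \alpha < 1\] and the Metropolis criterion, under which a candidate solution \(x'\) replaces the current solution \(x\) with probability \[p = \min\left(1,\ \exp\left(-\frac{f(x') - f(x)}{T}\right)\right)\] At high temperature \(T\) almost any candidate is accepted, while as \(T\) approaches zero only improvements are accepted.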

The basic flow of the algorithm is as follows

  1. Initialisation: the initial population is randomly generated and the evaluation value of each individual is calculated.
  2. Evolutionary operations: new solutions are generated by crossover and mutation of individuals in the population.
  3. Refinement by annealing: the newly generated solutions are locally improved using the annealing method, with the temperature parameter controlling how freely worse moves are accepted.
  4. Evaluation and selection: the new solutions are evaluated and the best ones are selected for the next generation, usually with standard evolutionary selection methods (e.g. tournament selection or elitist selection).
  5. Convergence: steps 2-4 are repeated until a convergence condition is met, typically a maximum number of iterations or no improvement over a certain number of iterations.

The advantages of EAS include

  • Breadth and depth of search: combining the population-based search of evolutionary algorithms with the local improvements of the annealing method allows the solution space to be explored extensively while still homing in on an optimal solution.
  • Escape from local solutions: the temperature-control mechanism of the annealing method makes the search less likely to become trapped in a local optimum and increases the ability to find the global optimum.
  • Flexibility: EAS is applicable to many optimisation problems and can be used for both discrete and continuous problems.

EAS is an algorithm that combines the flexibility of evolutionary algorithms with the powerful local search capabilities of annealing methods, providing effective solutions to many complex optimisation problems.

Implementation example

As an example of an implementation of Evolutionary Annealing-Search (EAS), the solution of a simple optimisation problem combining an evolutionary algorithm with the annealing method is shown. The evolutionary algorithm searches the solution space broadly, and the annealing method improves solutions locally. The following is a simple Python implementation for a one-dimensional function optimisation problem.

Problem description: minimise a simple quadratic function as the objective function, \[f(x)=(x-3)^2+10\] This function attains its minimum value of 10 at \(x=3\).
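Setting the derivative to zero confirms the minimiser: \[f'(x) = 2(x-3) = 0 \iff x = 3, \qquad f(3) = 10\]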

EAS implementation.

import numpy as np
import random
import math

# objective function
def objective_function(x):
    return (x - 3)**2 + 10

# Geometric temperature schedule for the annealing step
def annealing_temperature(t, alpha):
    return alpha * t

# Generate a random initial population
def generate_population(pop_size, x_min, x_max):
    return np.random.uniform(x_min, x_max, pop_size)

# Crossover: arithmetic mean of the two parents
def crossover(parent1, parent2):
    return (parent1 + parent2) / 2

# Mutation: add a small random perturbation, clipped to the search bounds
def mutate(individual, mutation_rate, x_min, x_max):
    if random.random() < mutation_rate:
        return np.clip(individual + np.random.uniform(-0.1, 0.1), x_min, x_max)
    return individual

# Local refinement of an individual by simulated annealing
def simulated_annealing(individual, temp, x_min, x_max):
    current_value = objective_function(individual)
    new_individual = np.clip(individual + np.random.uniform(-0.5, 0.5), x_min, x_max)
    new_value = objective_function(new_individual)

    # Accept the candidate if it is better, or with the Metropolis
    # probability exp(-(new - current) / temp) if it is worse
    if new_value < current_value or random.random() < math.exp((current_value - new_value) / temp):
        return new_individual
    return individual

# EAS algorithm
def evolutionary_annealing_search(iterations, pop_size, x_min, x_max, alpha, mutation_rate):
    population = generate_population(pop_size, x_min, x_max)
    best_individual = min(population, key=objective_function)
    best_value = objective_function(best_individual)
    
    temperature = 100  # Initial temperature
    for iteration in range(iterations):
        # Evolutionary operations: build the next generation
        new_population = []
        for i in range(0, pop_size, 2):
            parent1 = population[i]
            parent2 = population[i+1] if i+1 < pop_size else population[0]

            # Produce two offspring per parent pair so the population
            # size stays constant across generations
            for _ in range(2):
                # Crossover
                offspring = crossover(parent1, parent2)
                # Mutation
                offspring = mutate(offspring, mutation_rate, x_min, x_max)
                # Local refinement by simulated annealing
                offspring = simulated_annealing(offspring, temperature, x_min, x_max)
                new_population.append(offspring)

        population = np.array(new_population[:pop_size])
        
        # Best individual updated.
        for individual in population:
            value = objective_function(individual)
            if value < best_value:
                best_value = value
                best_individual = individual
        
        # Temperature update.
        temperature = annealing_temperature(temperature, alpha)
    
    return best_individual, best_value

# Parameter setting
iterations = 500
pop_size = 20
x_min = -10
x_max = 10
alpha = 0.99  # Cooling rate (multiplier applied to the temperature each iteration)
mutation_rate = 0.1  # mutation rate

# Algorithm execution
best_individual, best_value = evolutionary_annealing_search(iterations, pop_size, x_min, x_max, alpha, mutation_rate)

# Display the result
print(f"Optimal solution: x = {best_individual:.3f}, minimum value: f(x) = {best_value:.3f}")

Code description.

  1. Objective function: defines the function to be minimised, \(f(x)=(x-3)^2+10\).
  2. Evolutionary algorithm:
    • Population generation: generate_population function to randomly generate an initial population.
    • Crossover: a simple crossover method is implemented to generate a child from two parental individuals.
    • Mutation: mutation is performed with a certain probability. Here, small random changes are made to the individuals.
  3. Annealing method:
    • The temperature schedule for the annealing method is defined by the annealing_temperature function. As the algorithm proceeds, the temperature decreases and the search gradually narrows towards promising regions of the solution space.
    • The simulated_annealing function uses the annealing method to locally optimise the individuals.
  4. EAS execution:
    • The EAS algorithm iterates between evolutionary operations and annealing methods to search for the optimal solution.
    • At each iteration, the best individual is updated and the temperature is reduced.
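As a quick sanity check of the cooling schedule (using the parameters from the example above, initial temperature 100 and alpha = 0.99 over 500 iterations), the final temperature can be computed directly:

# Temperature after 500 geometric cooling steps from T0 = 100
print(100 * 0.99**500)  # ≈ 0.66: late iterations accept almost no worse moves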

Results: because the algorithm is stochastic the exact output varies from run to run, but a typical run produces something close to the true optimum, for example:

Optimal solution: x = 2.992, minimum value: f(x) = 10.000

This implementation shows how EAS combines an evolutionary algorithm with the annealing method to locate a near-optimal solution.

Application examples

The following describes specific cases where the EAS algorithm has been applied.

1. Robot path planning

  • Problem overview: a robot must find an optimal path in an environment containing obstacles. Such path-planning problems seek the shortest path from a start point to a destination while avoiding obstacles, and have a very complex solution space.
  • Application of EAS: EAS is applied as follows (a minimal encoding sketch follows this list).
    • Evolutionary algorithm: an initial population of candidate paths is generated randomly, and new paths are produced through crossover and mutation. The evolutionary search explores diverse paths and covers a wide range of candidate solutions.
    • Annealing method: the annealing step locally optimises and further improves the paths found during the search, making it less likely that the search becomes stuck in local minima.
  • Application results: EAS can compute robot paths efficiently in complex environments, with faster convergence than a stand-alone evolutionary algorithm or annealing method.
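The sketch below is not from the original article: path_cost, the obstacle list and all constants are illustrative assumptions. It shows one way the EAS building blocks above could be adapted to path planning: a path is encoded as a flat vector of 2-D waypoints, and the objective is path length plus a penalty for passing through circular obstacles. The same averaging crossover, perturbation mutation and annealing step then apply element-wise to the waypoint vector.

import numpy as np

# Hypothetical environment: start, goal and circular obstacles (x, y, radius)
start, goal = np.array([0.0, 0.0]), np.array([10.0, 10.0])
obstacles = [(5.0, 5.0, 1.5), (7.0, 3.0, 1.0)]
N_WAYPOINTS = 8

def path_cost(waypoints):
    # Full path: start -> intermediate waypoints -> goal
    pts = np.vstack([start, waypoints.reshape(-1, 2), goal])
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    # Heavy penalty for any point lying inside an obstacle
    penalty = 0.0
    for ox, oy, r in obstacles:
        d = np.linalg.norm(pts - np.array([ox, oy]), axis=1)
        penalty += 100.0 * np.sum(np.maximum(0.0, r - d))
    return length + penalty

# An individual is a flat vector of waypoint coordinates
candidate = np.random.uniform(0.0, 10.0, 2 * N_WAYPOINTS)
print(path_cost(candidate))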

2. Scheduling problems in the manufacturing industry

  • Problem overview: production scheduling in manufacturing involves many machines, materials and workers, and requires efficient allocation of limited resources. The goal is to maximise the efficiency of the production line while meeting cost and delivery-date constraints.
  • Application of EAS: the EAS algorithm is applied as follows (an illustrative permutation encoding is sketched after this list).
    • Evolutionary algorithm: initial schedules are generated randomly and recombined through crossover, while mutation operations allow flexible reallocation of resources.
    • Annealing method: the annealing step refines schedules locally, evaluating how small changes to the schedule affect overall production efficiency.
  • Application results: the results of using EAS show more efficient use of resources and reduced costs compared with conventional single-method algorithms, with optimisation proceeding smoothly even when manufacturing processes are complex and interdependent.
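As an illustration (a sketch under assumed data, not a documented implementation), a simple single-machine schedule can be encoded as a permutation of job indices; mutation becomes a swap of two jobs, and the annealing step accepts a swapped neighbour via the Metropolis criterion. The processing times, due dates and total-tardiness objective below are assumptions for the example.

import math, random

# Illustrative data: processing times and due dates for 5 jobs
proc = [4, 2, 7, 3, 5]
due = [6, 4, 18, 9, 12]

# Objective: total tardiness of the jobs in the given order
def total_tardiness(order):
    t, tardiness = 0, 0
    for j in order:
        t += proc[j]
        tardiness += max(0, t - due[j])
    return tardiness

# Swap mutation: a natural neighbourhood for permutation encodings
def swap_neighbour(order):
    a, b = random.sample(range(len(order)), 2)
    new = list(order)
    new[a], new[b] = new[b], new[a]
    return new

# One annealing step on a schedule (Metropolis acceptance)
def anneal_step(order, temp):
    candidate = swap_neighbour(order)
    delta = total_tardiness(candidate) - total_tardiness(order)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        return candidate
    return order

schedule = list(range(5))
print(total_tardiness(schedule), total_tardiness(anneal_step(schedule, temp=5.0)))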

3. Financial portfolio optimisation

  • Problem overview: financial portfolio optimisation asks how an investor should diversify assets to minimise risk and maximise return, selecting multiple investments and determining their optimal proportions.
  • Application of EAS: the EAS algorithm is applied as follows (a mean-variance sketch follows this list).
    • Evolutionary algorithm: an initial set of portfolios is generated randomly and new portfolios are produced through genetic operations, exploring different combinations of investments.
    • Annealing method: the annealing step searches locally for portfolios with a better risk-return trade-off, which can be particularly effective for risk management.
  • Application results: EAS can find portfolios that balance risk and return well; the combination of broad evolutionary search and local annealing-based refinement can provide higher accuracy than either method alone.
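As a sketch (the expected returns mu, covariance matrix sigma and risk-aversion weight lam below are illustrative assumptions), a portfolio can be encoded as a weight vector on the simplex with a mean-variance objective; mutation perturbs the weights and renormalises them so they stay non-negative and sum to one, and the rest of the EAS loop is unchanged.

import numpy as np

# Illustrative market data: expected returns and covariance of 4 assets
mu = np.array([0.08, 0.12, 0.10, 0.05])
sigma = np.diag([0.04, 0.10, 0.06, 0.02])  # simplified: uncorrelated assets
lam = 3.0                                  # risk-aversion weight

# Objective: negative risk-adjusted return (lower is better)
def portfolio_objective(w):
    return -(mu @ w) + lam * (w @ sigma @ w)

# Mutation that keeps the weights non-negative and summing to one
def mutate_weights(w, scale=0.05):
    w = np.maximum(w + np.random.uniform(-scale, scale, w.size), 0.0)
    return w / w.sum()

w = np.full(4, 0.25)  # equal-weight initial portfolio
print(portfolio_objective(w), portfolio_objective(mutate_weights(w)))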

4. Aircraft design optimisation

  • Problem overview: aircraft design requires the simultaneous optimisation of multiple factors such as aerodynamics, structural strength, fuel consumption, weight and engine efficiency. These factors are interrelated, which makes the optimisation very complex.
  • Application of EAS: the EAS algorithm is applied as follows (a schematic weighted-sum objective follows this list).
    • Evolutionary algorithm: initial design alternatives are generated randomly, and new candidates are produced by crossing design alternatives and applying mutations.
    • Annealing method: local design refinement is performed with the annealing step; as the temperature parameter decreases, the design alternatives converge towards an optimum design.
  • Application results: EAS can produce design alternatives that are both aerodynamically efficient and structurally sound, optimising multiple interdependent factors simultaneously, which is difficult with conventional single-method optimisation.
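Multi-objective design problems of this kind are often reduced to a single fitness value by a weighted sum before a method like EAS is applied. The snippet below is purely schematic: drag, structural_weight and stress are placeholder surrogate functions of a design vector, and the weights are assumptions.

import numpy as np

# Schematic surrogate models of three design criteria (placeholders)
def drag(x):
    return x[0]**2 + 0.1 * x[1]

def structural_weight(x):
    return 2.0 * x[1] + 0.5 * x[2]**2

def stress(x):
    return 1.0 / (0.1 + x[2])

# Weighted-sum scalarisation turns the multi-objective design problem
# into a single objective that the EAS loop above can minimise
w_drag, w_weight, w_stress = 1.0, 0.5, 2.0

def design_objective(x):
    return w_drag * drag(x) + w_weight * structural_weight(x) + w_stress * stress(x)

x = np.array([0.3, 1.2, 0.8])  # example design vector
print(design_objective(x))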

EAS provides a powerful approach to complex optimisation problems, combining evolutionary algorithms and annealing methods to explore a wide solution space and perform local optimisation, and has been highly effective in a variety of fields.

Reference books

Reference books relevant to the Evolutionary Annealing-Search (EAS) algorithm are described below.

1. ‘Introduction to Evolutionary Computing’ by A.E. Eiben and J.E. Smith
– Abstract: This book introduces the basic concepts of evolutionary algorithms (EAs), including their hybridisation with local search methods (memetic algorithms), the pattern that EAS follows.
– Scope: learn how to apply evolutionary algorithms to a wide range of optimisation problems.

2. ‘Simulated Annealing: Theory and Applications’ by P.J.M. van Laarhoven and E.H.L. Aarts
– Abstract: Provides a detailed description of the theory and applications of Simulated Annealing (SA), the method introduced by S. Kirkpatrick, C.D. Gelatt and M.P. Vecchi, and is useful for understanding the annealing component of EAS.
– Scope: an introduction to the basic theory of the annealing method and examples of its application to optimisation.

3. ‘Handbook of Metaheuristics’ edited by Michel Gendreau and Jean-Yves Potvin
– Abstract: A comprehensive treatment of metaheuristic algorithms (evolutionary algorithms, annealing methods, etc.), including the hybrid methods of which EAS is an example.
– Scope: a detailed study of the theory and implementation of metaheuristic algorithms.

4. ‘Metaheuristics: From Design to Implementation’ by El-Ghazali Talbi
– Abstract: Covers metaheuristic algorithms from design to implementation, including hybrid combinations of evolutionary algorithms with local search methods such as annealing.
– Scope: a wide range of theory, implementation techniques and application examples of evolutionary algorithms and annealing methods.

5. ‘Evolutionary Optimization Algorithms’ by Dan Simon
– Abstract: Dedicated to evolutionary optimisation algorithms, introducing the basics and applications of evolutionary and metaheuristic methods.
– Scope: covers a wide range of evolutionary algorithms and their applications, including hybrid approaches related to EAS.

6. ‘Optimization Algorithms on Matrix Manifolds’ by P.-A. Absil, R. Mahony, and R. Sepulchre
– Abstract: This book provides the mathematical background for optimisation on matrix manifolds, in particular how to formulate optimisation problems in matrix spaces.
– Scope: useful for exploring solutions to high-dimensional and constrained optimisation problems.
