
jmejia8 / metaheuristics.jl

241 stars · 7 watchers · 24 forks · 4.23 MB

High-performance metaheuristics for optimization coded purely in Julia.

Home Page: https://jmejia8.github.io/Metaheuristics.jl/stable/

License: Other

Language: Julia 100.00%

Topics: optimization, multi-objective-optimization, differential-evolution, simulated-annealing, pso, nsga2, constrained-optimization, hypervolume, decision-making

metaheuristics.jl's Introduction

Metaheuristics

Metaheuristics logo

High-performance metaheuristics for global optimization.


Installation

Open the Julia REPL and press ] to open the Pkg prompt. To add this package, use the add command:

pkg> add Metaheuristics

Or, equivalently, via the Pkg API:

julia> import Pkg; Pkg.add("Metaheuristics")

Algorithms

Some representative metaheuristics for single- and multi-objective optimization are implemented here. Moreover, constraint-handling techniques are included in most of the implemented algorithms. Minimal usage sketches follow the lists below.

Single-Objective Optimization

  • ECA: Evolutionary Centers Algorithm
  • DE: Differential Evolution
  • PSO: Particle Swarm Optimization
  • ABC: Artificial Bee Colony
  • GSA: Gravitational Search Algorithm
  • SA: Simulated Annealing
  • WOA: Whale Optimization Algorithm
  • MCCGA: Machine-coded Compact Genetic Algorithm
  • GA: Genetic Algorithm
  • BRKGA: Biased Random Key Genetic Algorithm
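Each algorithm is constructed with (optional) parameters and passed to optimize. A minimal sketch, assuming default parameters for DE:

using Metaheuristics

f(x) = sum(x .^ 2)  # sphere function
bounds = boxconstraints(lb = -5ones(5), ub = 5ones(5))
result = optimize(f, bounds, DE())  # or PSO(), ECA(), SA(), ...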

Multi-Objective Optimization

SMS-EMOA in Metaheuristics.jl

  • MOEA/D-DE: Multi-objective Evolutionary Algorithm based on Decomposition
  • NSGA-II: A fast and elitist multi-objective genetic algorithm: NSGA-II
  • NSGA-III: Evolutionary Many-Objective Optimization Algorithm Using Reference-Point-Based Nondominated Sorting Approach
  • SMS-EMOA: An EMO algorithm using the hypervolume measure as the selection criterion
  • SPEA2: Improved Strength Pareto Evolutionary Algorithm
  • CCMO: Coevolutionary Framework for Constrained Multiobjective Optimization
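A minimal multi-objective sketch, assuming the (objectives, gx, hx) return convention shown in the constrained example later on this page:

using Metaheuristics

# bi-objective function returning objectives plus (empty) constraint vectors
function f(x)
    fx = [sum(x .^ 2), sum((x .- 1) .^ 2)]
    gx = [0.0]  # inequality constraints (none)
    hx = [0.0]  # equality constraints (none)
    fx, gx, hx
end

bounds = boxconstraints(lb = -5ones(3), ub = 5ones(3))
result = optimize(f, bounds, NSGA2())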

Performance Indicators

  • GD: Generational Distance
  • IGD, IGD+: Inverted Generational Distance (Plus)
  • C-metric: Covering Indicator
  • HV: Hypervolume
  • Δₚ (Delta p): Averaged Hausdorff distance
  • Spacing Indicator
  • and more...
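These indicators live in the PerformanceIndicators submodule. A sketch, assuming fronts may be passed as matrices with one point per row (the igd call mirrors the one in the test report further down this page):

using Metaheuristics

A = rand(10, 2)                                                # approximation front
R = [range(0, 1; length = 50) 1 .- range(0, 1; length = 50)]   # reference front
Metaheuristics.PerformanceIndicators.igd(A, R)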

Multi-Criteria Decision-Making

Multi-Criteria Decision-Making methods are also available; see the documentation for the list of supported methods.

Quick Start

Assume you want to solve the following minimization problem.

Rastrigin Surface

Minimize:

$$f(x) = 10D + \sum_{i=1}^{D} \left( x_i^2 - 10\cos(2\pi x_i) \right)$$

where $x\in [-5, 5]^D$, that is, each coordinate in $x$ is between -5 and 5. Use $D=10$.

Solution

Firstly, import the Metaheuristics package:

using Metaheuristics

Code the objective function:

f(x) = 10length(x) + sum( x.^2 - 10cos.(2π*x)  )

Instantiate the bounds:

D = 10
bounds = boxconstraints(lb = -5ones(D), ub = 5ones(D))

Alternatively, bounds can be a $2\times D$ matrix whose first row contains the lower bounds and whose second row contains the upper bounds.
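For example, an equivalent matrix form for this problem:

# first row: lower bounds, second row: upper bounds
bounds = [-5ones(D)'; 5ones(D)']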

Approximate the optimum using the optimize function:

result = optimize(f, bounds)

optimize returns a State object that contains information about the approximation. Two functions are the main way to inspect it:

@show minimum(result)
@show minimizer(result)

Documentation

See the documentation for more details, examples and options.

How to cite?

Please cite the package using the BibTeX entry

@article{metaheuristics2022,
  doi = {10.21105/joss.04723},
  url = {https://doi.org/10.21105/joss.04723},
  year = {2022},
  publisher = {The Open Journal},
  volume = {7},
  number = {78},
  pages = {4723},
  author = {Jesús-Adolfo Mejía-de-Dios and Efrén Mezura-Montes},
  title = {Metaheuristics: A Julia Package for Single- and Multi-Objective Optimization},
  journal = {Journal of Open Source Software}
}

or the citation string

Mejía-de-Dios et al., (2022). Metaheuristics: A Julia Package for Single- and Multi-Objective Optimization. Journal of Open Source Software, 7(78), 4723, https://doi.org/10.21105/joss.04723

in your scientific paper if you use Metaheuristics.jl.

Contributing

Please feel free to open a pull request or an issue, or to share any comments about this package.

metaheuristics.jl's People

Contributors

alecloudenback, daniglez, jbytecode, jmejia8, jonathanfischer97, longemen3000, mcamilletti1, pitmonticone, svilupp


metaheuristics.jl's Issues

Possible integration of MCCGA in Metaheuristics.jl

Hi,

Machine-coded Compact Genetic Algorithms (MCCGA for short) is another single-objective, real-valued optimizer introduced in

  • Satman, M. H. & Akadal, E. (2020). Machine Coded Compact Genetic Algorithms for Real Parameter Optimization Problems. Alphanumeric Journal, 8(1), 43-58. DOI: 10.17093/alphanumeric.576919

and is implemented in

https://github.com/jbytecode/MCCGA

as an unregistered Julia package, for testing purposes. The algorithm first estimates the probability of each single bit being 1 over the function domain (the term machine-coded comes from this stage). This vector of probabilities is then fed into a compact genetic algorithm (CGA), which works directly on the IEEE 754 bit representations of the real values. In the last step, a local optimizer is used for fine-tuning. The performance of the algorithm can be easily explored using the package above.

How about integrating this algorithm into Metaheuristics.jl? Do you plan to expand the package content? Is this method suitable?

I am open to any further collaboration. Thank you in advance!

Bug in GA Implementation?

I have noticed that every other run using the GA implementation fails with a MethodError when optimizing the Rosenbrock function.

I noticed it when working through the example included with the Optimization.jl (OptimizationMetaheuristics.jl) package.
The following code snippet sometimes leads to the error in question; the other times it simply runs into the maximum number of iterations.
It would be great to get some insight into where this error comes from and whether it is reproducible on other machines.

using Metaheuristics

h(x) = (1.0 - x[1])^2 + 100. * (x[2] - x[1]^2)^2

D = 2
bounds = [-ones(D) ones(D)]'

result = Metaheuristics.optimize(h, bounds, Metaheuristics.GA())

The Error I get is the following:

ERROR: MethodError: no method matching !(::Float64)
Closest candidates are:
!(::Function) at operators.jl:1077
!(::Bool) at bool.jl:35
!(::Missing) at missing.jl:101
...
Stacktrace:
[1] _broadcast_getindex_evalf
@ .\broadcast.jl:670 [inlined]
[2] _broadcast_getindex
@ .\broadcast.jl:643 [inlined]
[3] getindex
@ .\broadcast.jl:597 [inlined]
[4] copy
@ .\broadcast.jl:899 [inlined]
[5] materialize
@ .\broadcast.jl:860 [inlined]
[6] mutation!(Q::Matrix{Float64}, parameters::BitFlipMutation)
@ Metaheuristics C:\Users\ReihsD.julia\packages\Metaheuristics\qh0JB\src\operators\mutation.jl:13
[7] update_state!(::State{Metaheuristics.xf_solution{Vector{Float64}}}, ::GA{RandomInBounds, TournamentSelection, UniformCrossover, BitFlipMutation, ElitistReplacement}, ::Problem{Float64}, ::Information, ::Options; kargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ Metaheuristics C:\Users\ReihsD.julia\packages\Metaheuristics\qh0JB\src\algorithms\GA\GA.jl:198
[8] update_state!(::State{Metaheuristics.xf_solution{Vector{Float64}}}, ::GA{RandomInBounds, TournamentSelection, UniformCrossover, BitFlipMutation, ElitistReplacement}, ::Problem{Float64}, ::Information, ::Options)
@ Metaheuristics C:\Users\ReihsD.julia\packages\Metaheuristics\qh0JB\src\algorithms\GA\GA.jl:182
[9] optimize(f::Function, bounds::LinearAlgebra.Adjoint{Float64, Matrix{Float64}}, method::Metaheuristics.Algorithm{GA{RandomInBounds, TournamentSelection, UniformCrossover, BitFlipMutation, ElitistReplacement}}; logger::Metaheuristics.var"#119#121")
@ Metaheuristics C:\Users\ReihsD.julia\packages\Metaheuristics\qh0JB\src\optimize.jl:91

Replacement of makeminmax() function in JMcDM

In previous versions of JMcDM, the MCDM functions took a vector of minimum() and maximum() functions as an argument. Most of the MCDM methods were defined as

method(data, weight, fns::Vector{Function})

however, this way of defining functions is not that convenient, because the type of

[minimum, minimum, minimum, ...]

is not Vector{Function} but Vector{typeof(minimum)} when all of the function names are the same. This is why we recently implemented and used the makeminmax() function. As of v0.5.6, the MCDM functions are defined like

function method(
    decisionMat::DataFrame,
    weights::Array{Float64,1},
    fns::Array{F,1},
)::MethodResult where {F<:Function}

so there is no longer any need for the makeminmax() function, and it has been replaced. The quick REPL check below illustrates the underlying invariance issue.
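julia> [minimum, minimum] isa Vector{Function}
false

julia> [minimum, minimum] isa Vector{F} where {F<:Function}
true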

Fyi.

Note: jbytecode/JMcDM@f5b4606#diff-4c97c58141fa4df965438bbc167e0c5bc82e12c5e3e6dacb76f2cb2cd68ee682

New method to provide initial solutions

Possible implementation: set_initial_solutions!

# objective function
f(x) = abs(x[1]) + x[2] + x[3]^2
# optimizer
algo = ECA(N = 61)

# one solution
x0 = rand(3)
set_initial_solutions!(algo, x0, f)

# 30 solutions with dim 3
X0 = rand(30, 3)
set_initial_solutions!(algo, X0, f)

# 30 solutions and their fitness
P0 = [ Metaheuristics.create_child(X0[i,:], f(X0[i,:])) for i in 1:30 ]
set_initial_solutions!(algo, P0)

optimize(f, bounds, algo)

NSGA2 breaks for odd population size

I'm using NSGA2 to solve a multi-objective optimization problem, and I have noticed the code breaks when using an odd value for the population size. Tracking the error, the issue seems to be in the reproduction function (line 255 of NSGA2.jl): the for loop at line 261 populates the Q variable with step 2, so with an odd population size the last iteration triggers a bounds error at line 274. Is it specified anywhere that the population size must be an even value? (I must also note that I'm fairly new to the evolutionary-algorithm field, so I might be wrong about this.) If not, I'd suggest adding a check on the population-size input and rounding it up to the next even value, as sketched below.
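A hypothetical guard (names and placement are illustrative only, not the package's actual code):

# hypothetical check at NSGA2 construction: require an even population size
if isodd(N)
    N += 1
    @warn "NSGA2 requires an even population size; rounding N up to $N"
end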

UnicodePlots Version

There are newer versions of UnicodePlots (>= v3.0.0), and Metaheuristics.jl is not compatible with them.

Is there a specific reason for this?

Thank you in advance.

RAM issue when running large problems

When using ECA on a large problem with 30,000 parameters, the package starts to create the population and my RAM (128 GB) is fully used, killing the process. Do you know a way to limit the RAM usage?

Rgds

Ideas for v4.0

Some ideas for v4

Implement Flux-inspired algorithms
Example:

# algorithm
GA = Chain(tournament(), SBX(p=0.9), PM(p=0.1), Environmental(base=:nsga2))


# usage
P = InitialPopulation(N=100)
GA(P) # produces a new population

## optimize
for gen in 1:100
    P = GA(P)
end

JOSS Review Tracker

Hey,

I am currently working on my review for JOSS on the paper you submitted. In this issue, I will collect all the things I would like to discuss with you, so we can keep all in one place.

In general, I think the work is very well thought out and only minor things remain. You obviously put a lot of effort in and produced a high-quality project.

  • Consider an example with more reproducible results. I ran the example you show both in your paper and the repo, and the optimization does not always find the 0 solution (I tried 1000 optimization runs and found the true minimum in only ~50% of them). To make it more intuitive for newcomers, maybe you can find an example that more reliably produces the optimal result? Alternatively, you could add a small note with an explanation.
  • Discuss other Julia packages for optimization: how your work differs from them, and how it integrates with them and the larger ecosystem.
  • Please run a quick spell check; some minor errors and unusual formulations remain.

Once these topics here are resolved, I can finish my review and recommend publication.

Cheers,
Paul

Printing progress to terminal when running optimization?

Hi,
I was wondering if there is any way to display information to the terminal during the optimization (something like a verbosity option, e.g. the messages you get when running pygmo: https://esa.github.io/pygmo2/tutorials/nlopt_basics.html?highlight=verbosity).
It could also be an option to pass a user-defined callback function called at each iteration, where the function can access the status of the algorithm.
Apologies if I just missed it, but I couldn't find it in the documentation/examples. A sketch of what seems to be exposed already is below.
Thanks!
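A sketch using the debug flag and the logger keyword that appear elsewhere on this page (exact output format unverified):

using Metaheuristics

f(x) = sum(x .^ 2)
bounds = [-5ones(3) 5ones(3)]'

# debug = true prints per-iteration information; logger is called with the
# current State at every iteration
method = ECA(options = Options(debug = true))
result = optimize(f, bounds, method, logger = st -> println("best so far: ", minimum(st)))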

User-defined random number generator

Thanks for this amazing package. It would be nice if every rand call inside the package used a random number generator rng that can be passed by the user. This allows maximum reproducibility of results instead of relying on Julia's global random number generator.
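The requested API might look like the following (hypothetical: the rng option does not exist yet, which is the point of this request):

using Metaheuristics, Random

f(x) = sum(x .^ 2)
bounds = [-5ones(3) 5ones(3)]'

# hypothetical: thread a user-supplied RNG through Options
rng = MersenneTwister(42)
result = optimize(f, bounds, ECA(options = Options(rng = rng)))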

problem

How can NSGA2 be used with batch evaluation? A sketch of one possibility follows.
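This sketch assumes the parallel_evaluation = true option (used in the constrained-parallel issue further down this page) also applies to NSGA2, and that the batched function receives the whole population matrix and returns matrices of objectives and constraints; treat it as unverified:

using Metaheuristics

bounds = [-5ones(3) 5ones(3)]'

# batched bi-objective function: X holds one candidate per row
function F(X)
    N = size(X, 1)
    fx = zeros(N, 2)
    for i in 1:N
        x = X[i, :]
        fx[i, :] = [sum(x .^ 2), sum((x .- 1) .^ 2)]
    end
    fx, zeros(N, 1), zeros(N, 1)  # objectives, inequality, equality constraints
end

options = Options(parallel_evaluation = true)
result = optimize(F, bounds, NSGA2(options = options))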

How to define mixed-integer optimization problems?

Possible implementation (?):

search_space = SearchSpace(bounds_int, bounds_float)

f(variables::Variables) = sum((variables.discrete .- variables.continuous) .^ 2)

results = optimize(f, search_space, OPTIMIZER)

Could Variables be a Dict?

Multi-threading fitness calculations

The vast majority of the computation time in optimization algorithms is spent on fitness calculations.

For example, in a GA, the selection, crossover, and mutation operators are not heavy or costly calculations; however, computing a single fitness value can take several seconds, and in such cases a run can take minutes.

How about calculating fitness values in different threads? The Threads.@threads macro would be one solution. The ThreadsX package provides a multi-threaded version of map().

Letting m() be a multi-threaded map() function,

m(f, population)

would reduce the computation time, and the implementation is easy.

This issue is only a recommendation, any thoughts are welcome.
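A minimal sketch using ThreadsX, assuming the population is stored with one individual per row:

using ThreadsX

f(x) = sum(x .^ 2)           # stand-in for an expensive fitness function
population = rand(100, 10)   # 100 individuals, 10 decision variables each

# evaluate all fitness values across the available threads
fitness = ThreadsX.map(f, eachrow(population))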

example fails

running


using Metaheuristics

n = 32
xs = rand(n)
ys = @. 1 / (1 + exp(-xs))

F(x,y) = @. sum(x^2 + y^2)
f(x,y) = @. sum((x - y)^2)

bounds = [ -5 -5 -5; 5 5 5.0]

Ψ(x) = minimizer( optimize(y -> f(x,y), bounds) )

result = minimizer( optimize(x -> F(x, Ψ(x)), bounds) )

gives

ERROR: MethodError: no method matching create_child(::Vector{Float64}, ::Vector{Float64}; ε::Float64)

Closest candidates are:
  create_child(::AbstractMatrix, ::AbstractVector; ε)
   @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:161
  create_child(::Any, ::Float64; ε)
   @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:133
  create_child(::Any, ::Tuple{Float64, Vector{Float64}, Vector{Float64}}; ε)
   @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:138
  ...

Stacktrace:
  [1] create_solution(x::Vector{Float64}, problem::Problem{Float64}; ε::Float64)
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:320
  [2] (::Metaheuristics.var"#26#27"{Float64, Problem{Float64}, Matrix{Float64}})(i::Int64)
    @ Metaheuristics .\none:0
  [3] iterate
    @ .\generator.jl:47 [inlined]
  [4] collect(itr::Base.Generator{UnitRange{Int64}, Metaheuristics.var"#26#27"{Float64, Problem{Float64}, Matrix{Float64}}})
    @ Base .\array.jl:782
  [5] #generate_population#25
    @ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:260 [inlined]
  [6] gen_initial_state(problem::Problem{Float64}, parameters::ECA, information::Information, options::Options)
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\algorithm.jl:26
  [7] gen_initial_state
    @ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\algorithm.jl:8 [inlined]
  [8] initialize!(::State{Any}, ::ECA, ::Problem{Float64}, ::Information, ::Options; kargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\ECA\ECA.jl:268
  [9] initialize!(::State{Any}, ::ECA, ::Problem{Float64}, ::Information, ::Options)
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\ECA\ECA.jl:231
 [10] optimize(f::Function, bounds::Matrix{Float64}, method::Metaheuristics.Algorithm{ECA}; logger::Metaheuristics.var"#112#114")
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:63
 [11] optimize
    @ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:36 [inlined]
 [12] optimize(f::Function, bounds::Matrix{Float64})
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:36
 [13] Ψ(x::Vector{Float64})
    @ Main .\REPL[166]:1
 [14] (::var"#19#20")(x::Vector{Float64})
    @ Main .\REPL[167]:1
 [15] create_solution(x::Vector{Float64}, problem::Problem{Float64}; ε::Float64)
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:320
 [16] (::Metaheuristics.var"#26#27"{Float64, Problem{Float64}, Matrix{Float64}})(i::Int64)
    @ Metaheuristics .\none:0
 [17] iterate
    @ .\generator.jl:47 [inlined]
 [18] collect(itr::Base.Generator{UnitRange{Int64}, Metaheuristics.var"#26#27"{Float64, Problem{Float64}, Matrix{Float64}}})
    @ Base .\array.jl:782
 [19] #generate_population#25
    @ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:260 [inlined]
 [20] gen_initial_state(problem::Problem{Float64}, parameters::ECA, information::Information, options::Options)
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\algorithm.jl:26
 [21] gen_initial_state
    @ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\algorithm.jl:8 [inlined]
 [22] initialize!(::State{Any}, ::ECA, ::Problem{Float64}, ::Information, ::Options; kargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\ECA\ECA.jl:268
 [23] initialize!(::State{Any}, ::ECA, ::Problem{Float64}, ::Information, ::Options)
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\ECA\ECA.jl:231
 [24] optimize(f::Function, bounds::Matrix{Float64}, method::Metaheuristics.Algorithm{ECA}; logger::Metaheuristics.var"#112#114")
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:63
 [25] optimize
    @ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:36 [inlined]
 [26] optimize(f::Function, bounds::Matrix{Float64})
    @ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:36
 [27] top-level scope
    @ REPL[167]:1
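The root cause appears to be the @. macro: @. sum(x^2 + y^2) expands to sum.(x.^2 .+ y.^2), which broadcasts sum over scalars and therefore returns a Vector{Float64}. The objective is then vector-valued, and create_child has no method for (::Vector{Float64}, ::Vector{Float64}), matching the error above. A sketch of scalar-valued definitions that avoid this:

# broadcast only the elementwise parts; sum reduces to a scalar
F(x, y) = sum(x .^ 2 .+ y .^ 2)
f(x, y) = sum((x .- y) .^ 2)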

Modify a metaheuristic

A modified metaheuristic can extend the capability of an implemented algorithm.
The idea is to define custom components (initialization, termination criterion, etc) without overwriting default methods.

For instance:

eca = ECA(N = 100)
eca_modified = modify(eca, termination = RelativeTolerance())
optimize(f, bounds, eca_modified)

Registered package install broken.

This is a totally awesome package.

I can install it using Pkg.add(url="...github...") etc.

But I can't install it using:

julia> import Pkg; Pkg.add("Metaheuristics")

in Julia version 1.5.3 (2020-11-09), which is a bummer, because I am trying to make another package that depends on it.

Was this only momentarily a registered package? Or perhaps this is somehow just on my end?


ERROR: The following package names could not be resolved:
 * Metaheuristics (not found in project, manifest or registry)

Stacktrace:
 [1] pkgerror(::String) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Types.jl:52
 [2] ensure_resolved(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}; registry::Bool) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Types.jl:837
 [3] add(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}; preserve::Pkg.Types.PreserveLevel, platform::Pkg.BinaryPlatforms.Linux, kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:177
 [4] add(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:139
 [5] #add#21 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:67 [inlined]
 [6] add at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:67 [inlined]
 [7] #add#20 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:66 [inlined]
 [8] add at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:66 [inlined]
 [9] add(::String; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:65
 [10] add(::String) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:65
 [11] top-level scope at REPL[4]:1
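If the package is in fact registered, a stale local registry would explain this; one thing worth trying (a guess, not a confirmed fix):

julia> import Pkg; Pkg.Registry.update(); Pkg.add("Metaheuristics")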

Implement `optimize!(args...)`

Implement:

# objective function
f(x) = sum(x.^2)
bounds = [-10 -10 -10; 10 10 10.0]
# optimization method
method = ECA()
# main optimization loop
for iteration in 1:10
    optimize!(f, bounds, method)
end
result = get_result(method)

Another example:

method = ECA()
# main optimization loop
while !should_stop(method)
    optimize!(f, bounds, method)
end
result = get_result(method)

Solving combinatorial problems using SA [DUPLICATED]

I have seen simulated annealing commonly used to optimize combinatorial problems such as TSP.
However, I can't seem to find a way to use SA as implemented in this package for combinatorial problems, since it seems to accept only a real-valued function and exposes no way to control how new solutions are generated.

Is it possible to use the SA implementation in this package for combinatorial problems and if so, then how?

Update `optimize` API

The optimize API will be extended mainly to:

  • Handle starting points.
  • Define complex search spaces (e.g., mixed-integer variables)

Starting Solutions

# example 1
optimize(f, [0.1, 0.3, -1.5], OPTIMIZER)
# example 2
optimize(f, StartingSolution([0.1, 0.3, -1.5]), OPTIMIZER)

Search Space

# current implementation
optimize(f, bounds, OPTIMIZER)
# new
optimize(f, SearchSpace(...), OPTIMIZER)

The search space can be defined as follows:

SearchSpace(bounds)
SearchSpace(Bool, 10) # Bit arrays with size 10
SearchSpace(integer_variables=bounds_int, real_variables=bounds_float) # for mixed-integer problems. 
SearchSpace(:x => [-1.0, 1.0], :y => [0, 100]) #  other possible application

Replacing Nelder-Mead with Hooke-Jeeves in MCCGA

I am going to send a pull request that replaces the Nelder-Mead local-search step with Hooke-Jeeves in the MCCGA algorithm.

This PR provides these gains:

  • Reduces dependencies on external packages
  • Hooke-Jeeves is not only a local-search algorithm but also a direct-search algorithm
  • Hooke-Jeeves does not use derivatives

Add Co-evolutionary Framework

Possible features:

  1. Define objective (subjective) functions for each sub-population
  2. Migration scheme

Example

Working with 3 sub-populations:

# objective function
f(x) = sum(10x .^ 2 .- sin.(x)/10)
bounds = [-10ones(5)  10ones(5)]

# subjective functions
f1(x) = f(x)
f2(x) = f(x)
f3(x) = f(x)

# migration scheme (copy all individuals to pop1)
migration(pop1, pop2, pop3) = append!(pop1, pop2, pop3)

# Three optimizers for each sub-pop
cc = Coevo(ECA(), DE(), PSO(); migration_scheme = migration, fitness=[f1, f2, f3])


optimize(f, bounds, cc)

Ref: Potter, M. and De Jong, K., 2001, Cooperative Coevolution: An Architecture for Evolving Co-adapted Subcomponents.

Solving combinatorial problems using SA

I have seen simulated annealing commonly used to optimize combinatorial problems such as TSP.
However, I can't seem to find a way to use SA as implemented in this package for combinatorial problems, since it seems to accept only a real-valued function and exposes no way to control how new solutions are generated.

Is it possible to use the SA implementation in this package for combinatorial problems and if so, then how?

Error in GA global optimisation

Hi,
I am working on an optimization problem in 3 dimensions. I have been using Particle Swarm Optimisation, and it was executing without any issues. Since GA has a very similar syntax to PSO, I wanted to start using it. On my first attempt, I obtained the following error:

MethodError: no method matching (::var"#objective_function#41")(::Vector{Float64}, ::Vector{Float64}, ::Vector{Float64})
Stacktrace:
  [1] (::var"#to_optim#42"{Matrix{Float64}, var"#objective_function#41"})(theta::Vector{Float64})
    @ Main \ga_optim.jl:135
  [2] create_solution(x::Vector{Float64}, problem::Problem{Float64}; ε::Float64)
    @ Metaheuristics \packages\Metaheuristics\xnKGn\src\solutions\individual.jl:320
  [3] (::Metaheuristics.var"#32#33"{Float64, Problem{Float64}, Matrix{Float64}})(i::Int64)
    @ Metaheuristics .\none:0
  [4] iterate
    @ .\generator.jl:47 [inlined]
  [5] collect(itr::Base.Generator{UnitRange{Int64}, Metaheuristics.var"#32#33"{Float64, Problem{Float64}, Matrix{Float64}}})
    @ Base .\array.jl:724
  [6] #generate_population#31
    @ \packages\Metaheuristics\xnKGn\src\solutions\individual.jl:260 [inlined]
  [7] gen_initial_

Double-checking my work, I ran the code again a number of times. Confusingly, this error arises about 50% of the time when using default GA parameters. I have discovered that using a lower population size reduces that to about 10%, while increasing the population size leads to almost 100%. I do not understand the error, as I am providing the right type for each function input; it must be some conversion error within the optimisation sequence. I would really appreciate any help with this.

Kind regards!

local search step omitted in MCCGA

Let's suppose we have this scenario:

julia> using Metaheuristics

julia> myf(x) = abs(x[1]-pi) + abs(x[2]-exp(1))
myf (generic function with 1 method)

julia> bounds = [-50.0 -50; 50 50]
2×2 Matrix{Float64}:
 -50.0  -50.0
  50.0   50.0

julia> result = optimize(myf, bounds, MCCGA(use_local_search=true))
+=========== RESULT ==========+
  iteration: 2588
    minimum: 0.017063
  minimizer: [3.124997854232788, 2.71875]
    f calls: 5175
 total time: 0.0736 s
stop reason: Other stopping criteria.
+============================+

I tried that example without the package Optim installed. The library is supposed to warn "The package Optim must be installed", but it silently reports a prematurely converged result. When we import Optim, we get a better solution:

julia> import Optim

julia> result = optimize(myf, bounds, MCCGA(use_local_search=true))
+=========== RESULT ==========+
  iteration: 2715
    minimum: 1.53782e-08
  minimizer: [3.1415926408130987, 2.7182818258575887]
    f calls: 5540
 total time: 1.8034 s
stop reason: Other stopping criteria.
+============================+

The function uses Optim if it is available; however, if it is not, it silently reports the solution obtained in the earlier stages.

Save intermediate populations to file to enable restarting

This is a feature request, and I'm not sure whether you're interested in including it in your package.

For optimizing heavy-ish functions, sometimes computer stuff happens. It would be great to have an easy API for saving intermediate populations to file so that if computation is interrupted, it's possible to pick up from about where it left off.

I think this should be pretty easy to do and I'm happy to try to help out with a first stab PR (may take a while, next few weeks are busy, and will probably need some additional work) but wanted to check in on what that API and implementation might look like.

API: this seems like it might go in Options(). I'm not sure what the keyword arguments should be named, but it seems like there should be an option to save results every N iterations (defaults to Inf/missing/similar that gives current behavior). Additionally, I think making the user define the filename to use for caching would make sense.

Implementation: every N iterations, use JLD2 to save the result from optimize. optimize would need to check for the existence of an existing file; to load from a file, an approach similar to the one given in https://jmejia8.github.io/Metaheuristics.jl/stable/examples/#Providing-Initial-Solutions should work.
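A sketch of manual checkpointing along these lines (file name and the saving scheme are illustrative, not a proposed final API):

using Metaheuristics, JLD2

f(x) = sum(x .^ 2)
bounds = [-5ones(3) 5ones(3)]'

result = optimize(f, bounds, ECA())
# save the population to disk as a checkpoint
jldsave("checkpoint.jld2"; population = result.population)

# later: reload the population and seed a new run via the
# initial-solutions approach linked above
population = load("checkpoint.jld2", "population")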

Add Decision-Making methods

This new feature will include a posteriori decision-making.

Possible Usage Example:

julia> res = optimize(f, bounds, NSGA2())

julia> idx = decisionmaking(res, method) # or

julia> idx = decisionmaking(res.population, method) # or

julia> idx = decisionmaking(pareto_front(res), method) # or

julia> idx = method(res) # optional

decisionmaking will return an integer vector indexing the solutions in res.population that correspond to the decision suggested by the method. A possible alias for decisionmaking: mcdm.

Consider interfacing https://github.com/jbytecode/JMcDM

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Tests for multi-objectives fail

After performing

(Metaheuristics) pkg>  test

I get

Multi objective: Test Failed at /Users/istar1/Desktop/code/julia/Metaheuristics.jl/test/multi-objective.jl:47
  Expression: Metaheuristics.PerformanceIndicators.igd(result.population, pf) <= 0.2
   Evaluated: 0.22673908824247319 <= 0.2
Stacktrace:
 [1] macro expansion
   @ ~/Julia/julia/usr/share/julia/stdlib/v1.9/Test/src/Test.jl:477 [inlined]
 [2] (::var"#run_methods#20")(problem::Symbol)
   @ Main ~/Desktop/code/julia/Metaheuristics.jl/test/multi-objective.jl:47

Is 0.2267 a reasonable or correct result? How about changing the threshold, or seeding an RNG so the test generates the same solution each time?

Thanks.

Improve error reporting

Improve error reporting when the output of the objective function is not appropriate, to avoid problems similar to #64.

Add unary epsilon-indicator

Implement the unary epsilon-indicator detailed in

Zitzler, E., Thiele, L., Laumanns, M., Fonseca, C. M., & da Fonseca, V. G. (2003). Performance assessment of multiobjective optimizers: an analysis and review. IEEE Transactions on Evolutionary Computation, 7(2), 117–132. doi:10.1109/TEVC.2003.810758 (https://doi.org/10.1109/TEVC.2003.810758)
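For reference, the additive variant from that paper, for an approximation set $A$ and a reference set $R$ with $m$ objectives (minimization), is

$$I_{\epsilon+}(A, R) = \max_{z \in R} \, \min_{a \in A} \, \max_{1 \le i \le m} \left( a_i - z_i \right)$$

so $I_{\epsilon+}(A, R) \le 0$ exactly when every reference point is weakly dominated by some point of $A$.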

Single objective constrained parallel optimization cannot work?

I want to experiment with single-objective parallel optimization with constraints. I modified the example from https://docs.juliahub.com/Metaheuristics/aJ70z/3.2.0/examples/#Constrained-Optimization.

The code is below.

using Metaheuristics
Metaoptimize = Metaheuristics.optimize
MetaOptions = Metaheuristics.Options

function f(x)
    x, y = x[1], x[2]
    fx = (1 - x)^2 + 100(y - x^2)^2
    gx = [x^2 + y^2 - 2] # inequality constraints
    hx = [0.0]           # equality constraints
    # order is important
    return fx, gx, hx
end

function fff(X)
    N = size(X, 1)
    fx, gx, hx = zeros(N), zeros(N, 1), zeros(N, 1)
    Threads.@threads for i in 1:size(X, 1)
        fx[i], gx[i], hx[i] = f(X[i, :])
    end
    fx, gx, hx
end

bounds = [-2.0 -2; 2 2]

options = MetaOptions(f_calls_limit = 20000, parallel_evaluation=true, debug=true);

Metaoptimize(fff, bounds, ECA(;options, N=30, K=3))

I got the error 'TaskFailedException'.

Any advice? Thank you.
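A likely culprit: f returns gx and hx as length-1 vectors, so gx[i] = ... tries to convert a Vector into a Float64 slot and throws inside the threaded loop, which surfaces as TaskFailedException. A sketch of a possible fix (untested):

function fff(X)
    N = size(X, 1)
    fx, gx, hx = zeros(N), zeros(N, 1), zeros(N, 1)
    Threads.@threads for i in 1:N
        fxi, gxi, hxi = f(X[i, :])
        fx[i] = fxi
        gx[i, :] .= gxi  # write the constraint vectors into matrix rows
        hx[i, :] .= hxi
    end
    fx, gx, hx
end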
