jmejia8 / Metaheuristics.jl
High-performance metaheuristics for optimization coded purely in Julia.
Home Page: https://jmejia8.github.io/Metaheuristics.jl/stable/
License: Other
Newer versions of UnicodePlots (>= v3.0.0) are available, but Metaheuristics.jl is not compatible with them.
Is there a specific reason for this?
Thank you in advance.
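If the restriction is only the compat bound, the fix might be as small as widening the entry in Project.toml (a guess about the cause, not a confirmed fix):
[compat]
UnicodePlots = "2, 3"   # allow both the 2.x and 3.x series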
I have seen simulated annealing commonly used to optimize combinatorial problems such as TSP.
However, I can't seem to find a way to use SA as implemented in this package for combinatorial problems, since it seems to accept only a real-valued function and exposes no way to control how new solutions are generated.
Is it possible to use the SA implementation in this package for combinatorial problems and if so, then how?
Hi there,
thanks for the work on this great package. I have a question rather than an actual problem with the package.
I couldn't figure out how to pass a pre-calculated distance matrix to the function f(x); could someone give me a hint on that matter?
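One pattern that usually works for this (a minimal sketch, assuming the matrix is fixed during the run): capture the precomputed matrix in a closure, so the function handed to optimize only takes x.
using Metaheuristics
D = rand(10, 10)                 # stand-in for the precomputed distance matrix
f(x) = x' * D * x                # f reads D from the enclosing scope
bounds = [-ones(10) ones(10)]'   # illustrative box constraints
result = optimize(f, bounds, ECA())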
This journal paper can be followed when implementing new algorithms:
https://ieeexplore.ieee.org/abstract/document/9344597
After the crucial methods are implemented, Metaheuristics.jl could also be featured in a survey article that measures the performance of the algorithms on a set of well-known test functions.
Hi,
Do you have any plans to integrate the package with SnoopPrecompile?
It drastically reduces the time to first call.
Here is what I did for LinRegOutliers: https://github.com/jbytecode/LinRegOutliers/blob/master/src/precompile/precompile.jl
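A minimal sketch of what such a workload could look like here, assuming optimize with ECA is a representative entry point (the iteration cap only keeps precompilation fast):
using SnoopPrecompile

@precompile_all_calls begin
    f(x) = sum(x .^ 2)
    bounds = [-ones(3) ones(3)]'
    optimize(f, bounds, ECA(options = Options(iterations = 3)))
end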
Best.
Improve error reporting when the output of the objective function is not appropriate.
To avoid problems similar to #64.
I'm using NSGA2 to solve a multi-objective optimization problem, and I have noticed the code breaks when using an odd value for the population size. Tracking the error, the issue seems to be in the "reproduction" function, line 255 in NSGA2.jl, where the for loop at line 261 populates the Q variable with step 2. With an odd population size, line 274 throws a bounds error in the last iteration. Is it specified anywhere that the population size must be an even value? (I must also say I'm fairly new to the evolutionary algorithm field, so I might be wrong about this.) If not, I'd suggest adding a check on the initial population size input that rounds it up to the next even value.
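The suggested guard could be as simple as the following sketch (illustrative only, not the package's actual code):
N = parameters.N          # hypothetical: the requested population size
isodd(N) && (N += 1)      # round an odd size up to the next even value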
When using ECA on a large problem with 30,000 parameters, creating the population fills my RAM (128 GB) completely, killing the process. Do you know a way to limit the RAM usage?
Regards
It could be interesting to know which points are ignored (e.g., the percentage) before computing the HV.
Possible implementation: set_initial_solutions!
# objective function
f(x) = abs(x[1]) + x[2] + x[3]^2
# optimizer
algo = ECA(N = 61)
# one solution
x0 = rand(3)
set_initial_solutions!(algo, x0, f)
# 30 solutions with dim 3
X0 = rand(30, 3)
set_initial_solutions!(algo, X0, f)
# 30 solutions and their fitness
P0 = [ Metaheuristics.create_child(X0[i,:], f(X0[i,:])) for i in 1:30 ]
set_initial_solutions!(algo, P0)
optimize(f, bounds, algo)
Thanks for this amazing package. It would be nice if every rand call inside the package used a random number generator rng that can be passed by the user. This allows maximum reproducibility of results instead of relying on Julia's global random number generator.
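A sketch of what the requested API could look like (the keyword is hypothetical; nothing like it exists yet):
using Random

rng = MersenneTwister(2023)   # user-controlled generator
# hypothetical keyword, not implemented:
# result = optimize(f, bounds, ECA(), rng = rng)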
The vast majority of the computation time in optimization algorithms is spent on fitness calculations.
For example, in a GA, the selection, crossover, and mutation operators are not heavy or costly computations; however, calculating a single fitness value can take several seconds, and in such cases a run can take minutes.
How about calculating fitness values in different threads? The Threads.@threads macro would be one solution. The ThreadsX package provides a multi-threaded version of map().
Letting m() be a multi-threaded map() function,
m(f, population)
would reduce the computation time and is easy to implement.
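Concretely, with ThreadsX the m() above already exists (a sketch assuming the population is stored row-wise in a matrix):
using ThreadsX

# evaluate each individual's fitness on its own thread
fitness = ThreadsX.map(f, eachrow(population))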
This issue is only a recommendation; any thoughts are welcome.
Hi,
I was wondering if there is any way to display information to the terminal during the optimization (something like a verbosity option, e.g. the messages you get when running pygmo: https://esa.github.io/pygmo2/tutorials/nlopt_basics.html?highlight=verbosity).
It could also be an option to include a user-defined callback function called at each iteration, where the function can access the status of the algorithm or something.
Apologies if I just missed it, but I couldn't find it in the documentation/examples...
Thanks!
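Judging from stack traces elsewhere in this tracker, optimize seems to accept a logger keyword; if so, something like the following could act as a per-iteration callback (a sketch, unverified):
# called once per iteration with the current state (assumption)
my_logger = status -> println("f calls: ", status.f_calls, "  best: ", minimum(status))
result = optimize(f, bounds, ECA(), logger = my_logger)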
Implement the unary epsilon-indicator detailed in
Zitzler, E., Thiele, L., Laumanns, M., Fonseca, C. M., & da Fonseca, V. G. (2003). Performance assessment of multiobjective optimizers: an analysis and review. IEEE Transactions on Evolutionary Computation, 7(2), 117–132. doi:10.1109/TEVC.2003.810758 (https://doi.org/10.1109/TEVC.2003.810758)
After performing
(Metaheuristics) pkg> test
I get
Multi objective: Test Failed at /Users/istar1/Desktop/code/julia/Metaheuristics.jl/test/multi-objective.jl:47
Expression: Metaheuristics.PerformanceIndicators.igd(result.population, pf) <= 0.2
Evaluated: 0.22673908824247319 <= 0.2
Stacktrace:
[1] macro expansion
@ ~/Julia/julia/usr/share/julia/stdlib/v1.9/Test/src/Test.jl:477 [inlined]
[2] (::var"#run_methods#20")(problem::Symbol)
@ Main ~/Desktop/code/julia/Metaheuristics.jl/test/multi-objective.jl:47
Is 0.2267 a reasonable or correct result? How about changing the threshold, or using a seeded RNG so that an identical solution is generated each time?
Thanks.
Possible implementation (?):
search_space = SearchSpace(bounds_int, bounds_float)
f(variables::Variables) = sum((variables.discrete .- variables.continuous) .^ 2)
results = optimize(f, search_space, OPTIMIZER)
Could Variables be a Dict?
Possible features:
Working with 3 sub-populations:
# objective function
f(x) = sum(10x .^ 2 .- sin.(x)/10)
bounds = [-10ones(5) 10ones(5)]
# subjective functions
f1(x) = f(x)
f2(x) = f(x)
f3(x) = f(x)
# migration scheme (copy all individuals to pop1)
migration(pop1, pop2, pop3) = append!(pop1, pop2, pop3)
# Three optimizers for each sub-pop
cc = Coevo(ECA(), DE(), PSO(); migration_scheme = migration, fitness=[f1, f2, f3])
optimize(f, bounds, cc)
Ref: Potter, M. and De Jong, K., 2001, Cooperative Coevolution: An Architecture for Evolving Co-adapted Subcomponents.
Hi,
Machine-coded Compact Genetic Algorithm (MCCGA for short) is another single-objective, real-valued optimizer, implemented in
https://github.com/jbytecode/MCCGA
as a non-standard Julia package for testing purposes. The algorithm first estimates the probability of each single bit being 1 over the function domain (the term machine-coded comes from this stage). This vector of probabilities is then fed into a compact genetic algorithm. The CGA works directly with the IEEE 754 bit representations of the real values. In the last step, a local optimizer is used for fine-tuning. The performance of the algorithm can easily be explored using the package above.
How about integrating this algorithm into Metaheuristics.jl? Do you plan to expand the package's content? Is this method suitable?
I am open to any further collaboration. Thank you in advance!
For any optimization algorithm, we expect to be able to show the fitness value at each iteration. How can we do that with Metaheuristics.jl? I tried to find it in Options, but I couldn't. Thank you.
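One thing that might already help (an assumption based on the options used in other issues here): the debug flag in Options prints status information during the run.
options = Options(debug = true)   # prints per-iteration status, if supported
result = optimize(f, bounds, ECA(options = options))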
This method is about finding weights, not ranking and ordering, so it will be removed from the list (also in JMcDM).
Hey,
I am currently working on my review for JOSS on the paper you submitted. In this issue, I will collect all the things I would like to discuss with you, so we can keep all in one place.
In general, I think the work is very well-thought-out and only minor things remain. You obviously put a lot of effort in and produced a high quality project.
solution (I tried 1,000 optimization runs and only in ~50% of them found the true minimum). To make it more intuitive for new people, maybe you can find an example that more reliably produces the optimal result? Alternatively, you could add a small note with an explanation.
Once these topics are resolved, I can finish my review and recommend publication.
Cheers,
Paul
This is a feature request, and I'm not sure whether you're interested in including it in your package.
For optimizing heavy-ish functions, sometimes computer stuff happens. It would be great to have an easy API for saving intermediate populations to file, so that if the computation is interrupted, it's possible to pick up from about where it left off.
I think this should be pretty easy to do, and I'm happy to try to help out with a first-stab PR (it may take a while, the next few weeks are busy, and it will probably need some additional work), but I wanted to check in on what that API and implementation might look like.
API: this seems like it might go in Options(). I'm not sure what the keyword arguments should be named, but there should be an option to save results every N iterations (defaulting to Inf/missing/similar, which gives the current behavior). Additionally, I think making the user define the filename to use for caching would make sense.
Implementation: every N iterations, use JLD2 to save the result from optimize. optimize would need to check for the existence of an existing file; to load from file, an approach similar to the one given in https://jmejia8.github.io/Metaheuristics.jl/stable/examples/#Providing-Initial-Solutions should work.
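A rough sketch of the implementation side, assuming JLD2's save_object/load_object are used (the file name and placement are illustrative only):
using JLD2

# every N iterations, e.g. from a callback: persist the current state
save_object("checkpoint.jld2", status)

# on restart, before the main loop: resume from the saved state if present
if isfile("checkpoint.jld2")
    status = load_object("checkpoint.jld2")
end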
Parallel computing is not supported.
I have noticed that every other run using the GA implementation fails with a MethodError when using it to optimize the Rosenbrock function.
I noticed it while completing the example included with the Optimization.jl (OptimizationMetaheuristics.jl) package.
The following code snippet sometimes leads to the error in question; the other times I run into the maximum number of iterations.
It would be great to get some insight into where this error is coming from and whether it is reproducible on other machines.
using Metaheuristics
h(x) = (1.0 - x[1])^2 + 100. * (x[2] - x[1]^2)^2
D = 2
bounds = [-ones(D) ones(D)]'
result = Metaheuristics.optimize(h, bounds, Metaheuristics.GA())
The Error I get is the following:
ERROR: MethodError: no method matching !(::Float64)
Closest candidates are:
!(::Function) at operators.jl:1077
!(::Bool) at bool.jl:35
!(::Missing) at missing.jl:101
...
Stacktrace:
[1] _broadcast_getindex_evalf
@ .\broadcast.jl:670 [inlined]
[2] _broadcast_getindex
@ .\broadcast.jl:643 [inlined]
[3] getindex
@ .\broadcast.jl:597 [inlined]
[4] copy
@ .\broadcast.jl:899 [inlined]
[5] materialize
@ .\broadcast.jl:860 [inlined]
[6] mutation!(Q::Matrix{Float64}, parameters::BitFlipMutation)
@ Metaheuristics C:\Users\ReihsD.julia\packages\Metaheuristics\qh0JB\src\operators\mutation.jl:13
[7] update_state!(::State{Metaheuristics.xf_solution{Vector{Float64}}}, ::GA{RandomInBounds, TournamentSelection, UniformCrossover, BitFlipMutation, ElitistReplacement}, ::Problem{Float64}, ::Information, ::Options; kargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ Metaheuristics C:\Users\ReihsD.julia\packages\Metaheuristics\qh0JB\src\algorithms\GA\GA.jl:198
[8] update_state!(::State{Metaheuristics.xf_solution{Vector{Float64}}}, ::GA{RandomInBounds, TournamentSelection, UniformCrossover, BitFlipMutation, ElitistReplacement}, ::Problem{Float64}, ::Information, ::Options)
@ Metaheuristics C:\Users\ReihsD.julia\packages\Metaheuristics\qh0JB\src\algorithms\GA\GA.jl:182
[9] optimize(f::Function, bounds::LinearAlgebra.Adjoint{Float64, Matrix{Float64}}, method::Metaheuristics.Algorithm{GA{RandomInBounds, TournamentSelection, UniformCrossover, BitFlipMutation, ElitistReplacement}}; logger::Metaheuristics.var"#119#121")
@ Metaheuristics C:\Users\ReihsD.julia\packages\Metaheuristics\qh0JB\src\optimize.jl:91
Implement:
Ref: Takahama, T., & Sakai, S. (2006). Constrained Optimization by the ε Constrained Differential Evolution with Gradient-Based Mutation and Feasible Elites. 2006 IEEE International Conference on Evolutionary Computation. doi:10.1109/CEC.2006.1688283 (https://doi.org/10.1109/CEC.2006.1688283)
The optimize API will be extended mainly to:
# example 1
optimize(f, [0.1, 0.3, -1.5], OPTIMIZER)
# example 2
optimize(f, StartingSolution([0.1, 0.3, -1.5]), OPTIMIZER)
# current implementation
optimize(f, bounds, OPTIMIZER)
# new
optimize(f, SearchSpace(...), OPTIMIZER)
The search space can be defined as follows:
SearchSpace(bounds)
SearchSpace(Bool, 10) # Bit arrays with size 10
SearchSpace(integer_variables=bounds_int, real_variables=bounds_float) # for mixed-integer problems.
SearchSpace(:x => [-1.0, 1.0], :y => [0, 100]) # other possible application
I want to experiment with single-objective parallel optimization with constraints. I modified the example from https://docs.juliahub.com/Metaheuristics/aJ70z/3.2.0/examples/#Constrained-Optimization.
The code is below.
using Metaheuristics
Metaoptimize = Metaheuristics.optimize
MetaOptions = Metaheuristics.Options
function f(x)
x,y = x[1], x[2]
fx = (1-x)^2+100(y-x^2)^2
gx = [x^2 + y^2 - 2] # inequality constraints
hx = [0.0] # equality constraints
# order is important
return fx, gx, hx
end
function fff(X)
N = size(X,1)
fx, gx, hx = zeros(N), zeros(N,1), zeros(N,1)
Threads.@threads for i in 1:size(X,1)
fx[i], gx[i], hx[i] = f(X[i,:])
end
fx, gx, hx
end
bounds = [-2.0 -2; 2 2]
options = MetaOptions(f_calls_limit = 20000, parallel_evaluation=true, debug=true);
Metaoptimize(fff, bounds, ECA(;options, N=30, K=3))
I got the error 'TaskFailedException'.
Any advice? Thank you.
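For what it's worth, the TaskFailedException probably comes from assigning the Vector outputs of f (gx and hx) into scalar slots of the gx and hx matrices inside the threaded loop. A sketch of a possible fix (an assumption, not verified against the package internals):
function fff(X)
    N = size(X, 1)
    fx, gx, hx = zeros(N), zeros(N, 1), zeros(N, 1)
    Threads.@threads for i in 1:N
        fxi, gxi, hxi = f(X[i, :])
        fx[i] = fxi
        gx[i, :] = gxi   # gxi is a Vector; write it into the row
        hx[i, :] = hxi
    end
    fx, gx, hx
end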
change to:
isnothing(indnext) && (indnext = length(population))
Same in
How to use NSGA2 with batch evaluation?
Running
using Metaheuristics
n = 32
xs = rand(n)
ys = @. 1 / (1 + exp(-xs))
F(x,y) = @. sum(x^2 + y^2)
f(x,y) = @. sum((x - y)^2)
bounds = [ -5 -5 -5; 5 5 5.0]
Ψ(x) = minimizer( optimize(y -> f(x,y), bounds) )
result = minimizer( optimize(x -> F(x, Ψ(x)), bounds) )
gives
ERROR: MethodError: no method matching create_child(::Vector{Float64}, ::Vector{Float64}; ε::Float64)
Closest candidates are:
create_child(::AbstractMatrix, ::AbstractVector; ε)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:161
create_child(::Any, ::Float64; ε)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:133
create_child(::Any, ::Tuple{Float64, Vector{Float64}, Vector{Float64}}; ε)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:138
...
Stacktrace:
[1] create_solution(x::Vector{Float64}, problem::Problem{Float64}; ε::Float64)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:320
[2] (::Metaheuristics.var"#26#27"{Float64, Problem{Float64}, Matrix{Float64}})(i::Int64)
@ Metaheuristics .\none:0
[3] iterate
@ .\generator.jl:47 [inlined]
[4] collect(itr::Base.Generator{UnitRange{Int64}, Metaheuristics.var"#26#27"{Float64, Problem{Float64}, Matrix{Float64}}})
@ Base .\array.jl:782
[5] #generate_population#25
@ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:260 [inlined]
[6] gen_initial_state(problem::Problem{Float64}, parameters::ECA, information::Information, options::Options)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\algorithm.jl:26
[7] gen_initial_state
@ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\algorithm.jl:8 [inlined]
[8] initialize!(::State{Any}, ::ECA, ::Problem{Float64}, ::Information, ::Options; kargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\ECA\ECA.jl:268
[9] initialize!(::State{Any}, ::ECA, ::Problem{Float64}, ::Information, ::Options)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\ECA\ECA.jl:231
[10] optimize(f::Function, bounds::Matrix{Float64}, method::Metaheuristics.Algorithm{ECA}; logger::Metaheuristics.var"#112#114")
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:63
[11] optimize
@ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:36 [inlined]
[12] optimize(f::Function, bounds::Matrix{Float64})
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:36
[13] Ψ(x::Vector{Float64})
@ Main .\REPL[166]:1
[14] (::var"#19#20")(x::Vector{Float64})
@ Main .\REPL[167]:1
[15] create_solution(x::Vector{Float64}, problem::Problem{Float64}; ε::Float64)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:320
[16] (::Metaheuristics.var"#26#27"{Float64, Problem{Float64}, Matrix{Float64}})(i::Int64)
@ Metaheuristics .\none:0
[17] iterate
@ .\generator.jl:47 [inlined]
[18] collect(itr::Base.Generator{UnitRange{Int64}, Metaheuristics.var"#26#27"{Float64, Problem{Float64}, Matrix{Float64}}})
@ Base .\array.jl:782
[19] #generate_population#25
@ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\solutions\individual.jl:260 [inlined]
[20] gen_initial_state(problem::Problem{Float64}, parameters::ECA, information::Information, options::Options)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\algorithm.jl:26
[21] gen_initial_state
@ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\algorithm.jl:8 [inlined]
[22] initialize!(::State{Any}, ::ECA, ::Problem{Float64}, ::Information, ::Options; kargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\ECA\ECA.jl:268
[23] initialize!(::State{Any}, ::ECA, ::Problem{Float64}, ::Information, ::Options)
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\algorithms\ECA\ECA.jl:231
[24] optimize(f::Function, bounds::Matrix{Float64}, method::Metaheuristics.Algorithm{ECA}; logger::Metaheuristics.var"#112#114")
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:63
[25] optimize
@ C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:36 [inlined]
[26] optimize(f::Function, bounds::Matrix{Float64})
@ Metaheuristics C:\Users\MrJSa\.julia\packages\Metaheuristics\rpBDB\src\optimize.jl:36
[27] top-level scope
@ REPL[167]:1
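A likely cause (an observation, not a confirmed diagnosis): the @. macro also broadcasts sum, so F and f return vectors instead of scalars, and create_child then receives a Vector{Float64} where it expects a Float64. Writing the objectives with explicit broadcasting keeps them scalar-valued:
F(x, y) = sum(x .^ 2 .+ y .^ 2)   # scalar-valued
f(x, y) = sum((x .- y) .^ 2)      # scalar-valued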
Implement:
# objective function
f(x) = sum(x.^2)
bounds = [-10 -10 -10; 10 10 10.0]
# optimization method
method = ECA()
# main optimization loop
for iteration in 1:10
optimize!(f, bounds, method)
end
result = get_result(method)
Another example:
method = ECA()
# main optimization loop
while !should_stop(method)
optimize!(f, bounds, method)
end
result = get_result(method)
In previous versions of JMcDM, the MCDM functions took a vector of minimum() and maximum() functions as an argument. Most of the MCDM methods were defined as
method(data, weight, fns::Vector{Function})
However, this way of defining functions is not that convenient, because in some cases the type of
[minimum, minimum, minimum, ...]
is not Vector{Function} but Vector{typeof(minimum)}
when all of the function names are the same. This is why we recently implemented and used the makeminmax() function. As of v0.5.6, MCDM functions are defined like
function method(
    decisionMat::DataFrame,
    weights::Array{Float64,1},
    fns::Array{F,1},
)::MethodResult where {F<:Function}
so there is no longer any need for the makeminmax() function, and it has been replaced.
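The typing pitfall can be checked directly in the REPL (standard Julia behavior, a consequence of the invariance of parametric types):
julia> typeof([minimum, minimum])
Vector{typeof(minimum)} (alias for Array{typeof(minimum), 1})

julia> [minimum, minimum] isa Vector{Function}
false

julia> [minimum, minimum] isa Vector{<:Function}
true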
Fyi.
Note: jbytecode/JMcDM@f5b4606#diff-4c97c58141fa4df965438bbc167e0c5bc82e12c5e3e6dacb76f2cb2cd68ee682
Let's suppose we have this scenario:
julia> using Metaheuristics
julia> myf(x) = abs(x[1]-pi) + abs(x[2]-exp(1))
myf (generic function with 1 method)
julia> bounds = [-50.0 -50; 50 50]
2×2 Matrix{Float64}:
-50.0 -50.0
50.0 50.0
julia> result = optimize(myf, bounds, MCCGA(use_local_search=true))
+=========== RESULT ==========+
iteration: 2588
minimum: 0.017063
minimizer: [3.124997854232788, 2.71875]
f calls: 5175
total time: 0.0736 s
stop reason: Other stopping criteria.
+============================+
I tried that example without the package Optim installed. The library is supposed to warn "The package Optim must be installed", but it silently reports a prematurely converged result. When we import Optim, we get a better solution:
julia> import Optim
julia> result = optimize(myf, bounds, MCCGA(use_local_search=true))
+=========== RESULT ==========+
iteration: 2715
minimum: 1.53782e-08
minimizer: [3.1415926408130987, 2.7182818258575887]
f calls: 5540
total time: 1.8034 s
stop reason: Other stopping criteria.
+============================+
The function uses Optim if it is available; otherwise, it silently reports the solution obtained in the previous stages.
New sampling methods such as:
A modified metaheuristic can extend the capability of an implemented algorithm.
The idea is to define custom components (initialization, termination criterion, etc) without overwriting default methods.
For instance:
eca = ECA(N = 100)
eca_modified = modify(eca, termination = RelativeTolerance())
optimize(f, bounds, eca_modified)
Change to return crowding
This is a totally awesome package.
I can install it using Pkg.add(url="...github...") etc.
But I can't install it using:
julia> import Pkg; Pkg.add("Metaheuristics")
in Julia Version 1.5.3 (2020-11-09), which is a bummer, because I am trying to make another package that depends on it.
Was this ever a registered package? Perhaps this is somehow just on my end?
ERROR: The following package names could not be resolved:
* Metaheuristics (not found in project, manifest or registry)
Stacktrace:
[1] pkgerror(::String) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Types.jl:52
[2] ensure_resolved(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}; registry::Bool) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Types.jl:837
[3] add(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}; preserve::Pkg.Types.PreserveLevel, platform::Pkg.BinaryPlatforms.Linux, kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:177
[4] add(::Pkg.Types.Context, ::Array{Pkg.Types.PackageSpec,1}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:139
[5] #add#21 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:67 [inlined]
[6] add at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:67 [inlined]
[7] #add#20 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:66 [inlined]
[8] add at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:66 [inlined]
[9] add(::String; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:65
[10] add(::String) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/API.jl:65
[11] top-level scope at REPL[4]:1
I am going to send a pull request that replaces the Nelder-Mead local search part of the MCCGA algorithm with Hooke-Jeeves.
This PR provides these items:
If it is interrupted, can you help me return to the optimal solution found at that time?
This issue is used to trigger TagBot; feel free to unsubscribe.
If you haven't already, you should update your TagBot.yml to include issue comment triggers. Please see this post on Discourse for instructions and more details.
If you'd like for me to do this for you, comment TagBot fix on this issue. I'll open a PR within a few hours, please be patient!
Implement Flux-inspired algorithms.
Example:
# algorithm
GA = Chain(tournament(), SBX(p=0.9), PM(p=0.1), Environmental(base=:nsga2))
# usage
P = InitialPopulation(N=100)
GA(P) # produces a new population
## optimize
for gen in 1:100
P = GA(P)
end
Hi,
I am working on an optimization problem in 3 dimensions. I have been using Particle Swarm Optimisation and it was executing without any issues. Since GA has a very similar syntax to PSO, I wanted to start using it. On my first attempt I obtained the following error:
MethodError: no method matching (::var"#objective_function#41")(::Vector{Float64}, ::Vector{Float64}, ::Vector{Float64})
Stacktrace:
[1] (::var"#to_optim#42"{Matrix{Float64}, var"#objective_function#41"})(theta::Vector{Float64})
@ Main \ga_optim.jl:135
[2] create_solution(x::Vector{Float64}, problem::Problem{Float64}; e::Float64)
@ Metaheuristics \packages\Metaheuristics\xnKGn\src\solutions\individual.jl:320
[3] (::Metaheuristics.var"#32#33"{Float64, Problem{Float64}, Matrix{Float64}})(i::Int64)
@ Metaheuristics .\none:0
[4] iterate
@ .\generator.jl:47 [inlined]
[5] collect(itr::Base.Generator{UnitRange{Int64}, Metaheuristics.var"#32#33"{Float64, Problem{Float64}, Matrix{Float64}}})
@ Base .\array.jl:724
[6] #generate_population#31
@ \packages\Metaheuristics\xnKGn\src\solutions\individual.jl:260 [inlined]
[7] gen_initial_
Double-checking my work, I ran the code again a number of times. Confusingly, this error arises about 50% of the time when using default GA parameters. I have discovered that using a lower population size reduces that to about 10%, while increasing the population leads to almost 100%. I do not understand the error, as I am providing the right type for each function input. It must be some conversion error within the optimisation sequence. I would really appreciate any help with this.
Kind regards!
This new feature will include a posteriori decision-making.
Possible Usage Example:
julia> res = optimize(f, bounds, NSGA2())
julia> idx = decisionmaking(res, method) # or
julia> idx = decisionmaking(res.population, method) # or
julia> idx = decisionmaking(pareto_front(res), method) # or
julia> idx = method(res) # optional
decisionmaking will return an integer vector containing the indices of solutions in res.population related to the decision suggested by the method. Possible alias for decisionmaking: mcdm.
Consider interfacing https://github.com/jbytecode/JMcDM