niaorg / niapy

Python microframework for building nature-inspired algorithms. Official docs: https://niapy.org

Home Page: https://niapy.org

License: MIT License

Languages: Python 99.66%, TeX 0.34%
Topics: python, microframework, nature-inspired-algorithms, swarm-intelligence, optimization-algorithms, hacktoberfest

niapy's Issues

GWO TypeError: unsupported operand type(s)

The GWO implementation occasionally raises the following TypeError:

algo.run(task=task)
  File ".../lib/python3.6/site-packages/NiaPy/algorithms/algorithm.py", line 348, in run
    r = self.runTask(task)
  File ".../lib/python3.6/site-packages/NiaPy/algorithms/algorithm.py", line 328, in runTask
    xb, fxb = next(algo)
  File ".../lib/python3.6/site-packages/NiaPy/algorithms/algorithm.py", line 308, in runYield
    pop, fpop, dparams = self.runIteration(task, pop, fpop, xb, fxb, **dparams)
  File ".../lib/python3.6/site-packages/NiaPy/algorithms/basic/gwo.py", line 115, in runIteration
    X3 = D - A3 * fabs(C3 * D - w)
TypeError: unsupported operand type(s) for *: 'float' and 'NoneType'

__init__() missing 3 required positional arguments: 'D', 'nFES', and 'benchmark'

Hi,

I tried your Flower Pollination Algorithm example code, but I got an error like this:

     11 for i in range(5):
     12     task = StoppingTask(D=10, nFES=10000, benchmark=Sphere())
---> 13     algo = FlowerPollinationAlgorithm(NP=20, p=0.5)
     14     best = algo.run(task=algo)
     15     print(best)

TypeError: __init__() missing 3 required positional arguments: 'D', 'nFES', and 'benchmark'

Here's the code :

# encoding=utf8
# This is temporary fix to import module from parent folder
# It will be removed when package is published on PyPI
import sys
sys.path.append('../')
# End of fix

import random
from NiaPy.algorithms.basic import FlowerPollinationAlgorithm
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Sphere

#we will run Flower Pollination Algorithm for 5 independent runs
for i in range(5):
    task = StoppingTask(D=10, nFES=10000, benchmark=Sphere())
    algo = FlowerPollinationAlgorithm(NP=20, p=0.5)
    best = algo.run(task=algo)
    print(best)

Hope you can help, thanks!
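
For reference, a corrected sketch of the example as it runs on the 2.0.0 release candidates (on those versions D, nFES, and benchmark belong to StoppingTask rather than the algorithm constructor, and run() must receive the task, not the algorithm):

from NiaPy.algorithms.basic import FlowerPollinationAlgorithm
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Sphere

# run Flower Pollination Algorithm for 5 independent runs
for i in range(5):
    task = StoppingTask(D=10, nFES=10000, benchmark=Sphere())
    algo = FlowerPollinationAlgorithm(NP=20, p=0.5)
    best = algo.run(task=task)  # pass the task here, not the algorithm
    print(best)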

Documentation fix

  • Line 217 is missing a closing parenthesis for a type hint in a documentation string.
  • Lines 53 and 38 should change the type hint from list of Algorithms to List[Algorithms]. The same issue is in lines 54 and 39. If one can specify algorithms and benchmarks with strings, then the type hint should be Union[List[str], List[Algorithms]].

Confusion with GSO

Hi. I am really confused when implementing GSO. Why are nFES and nGEN not the same? Does it mean the fitness function is not called in every iteration? What causes that?
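
One plausible reading (an illustration, not a confirmed description of the GSO internals): nFES counts individual fitness evaluations while nGEN counts generations, and a population-based algorithm evaluates many candidates per generation, so the two counters grow at different rates.

# illustrative arithmetic only, assuming one evaluation per individual per generation
NP = 25           # population size
nGEN = 100        # generations
nFES = NP * nGEN  # 2500 evaluations, plus any spent initializing the population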

[JOSS] Clarify set-up requirements in README and requirements.txt

  1. It would be better if the required Python modules were clarified in your README. As of now, I can only see Python 3.6+ (2.7.14) and pip. But your setup.py seems to be installing:
  • numpy <= 1.14
  • click <= 6.0
  • scipy <= 1.0.0
  • xlsxwriter <= 1.0.2
  2. In addition, having setup.py install from a requirements.txt may be preferable to defining the dependencies individually inside the script (see the sketch after this list). Check this out. Although I believe the Pipfile solves this, so this is optional.

  3. Lastly, it is also beneficial to separate "user requirements" and "dev requirements." The latter may include the testing libraries you've used (pytest), and sphinx for building the documentation. You can just link this to your requirements header.
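
For point 2, a minimal sketch of that pattern (assuming requirements.txt sits next to setup.py; the metadata is illustrative):

# setup.py (sketch)
from pathlib import Path
from setuptools import setup, find_packages

requirements = Path("requirements.txt").read_text().splitlines()

setup(
    name="NiaPy",
    packages=find_packages(),
    # drop blank lines and comments when building install_requires
    install_requires=[r.strip() for r in requirements if r.strip() and not r.strip().startswith("#")],
)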


This is an issue related to openjournals/joss-reviews#613

NiaPy conda dependency problem

Steps to reproduce
Include NiaPy >= 2.0.0rc11 as a dependency of a Python package and deploy it to conda using a GitHub workflow.

Expected behavior
NiaPy dependency should be found.

Actual behavior

[screenshot: conda fails to resolve the NiaPy dependency]

@GregaVrbancic could you please look into it? Is the conda version outdated?
Thank you.

Best regards,
l.

Initial Update

The bot created this issue to inform you that pyup.io has been set up on this repo.
Once you have closed it, the bot will open pull requests for updates as soon as they are available.

New mechanism for stopCond and old best values

Hello,

I've been struggling with two things recently. The first is that I cannot make the optimization stop once the best solution reaches a given minimum/maximum value; the stopping condition only depends on the function evaluation or generation count.

The other thing is that I want to see the error decreasing over iterations. Since we are not storing the best values in the algorithm, we cannot do this currently.

Is there anything I'm not aware of that achieves these? I would like to help add these to the library, but with some guidance, since I'm not sure how big the impact would be.

Thanks as always.
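
A rough sketch of the first request, assuming the rc-era Task interface in which Algorithm.run polls stopCond() and the task keeps the best fitness in x_f (if the attribute is named differently, the idea carries over). The second request (tracking the error over iterations) is sketched under the convergence-plotting issue below.

from NiaPy.task import StoppingTask

class RefValueTask(StoppingTask):
    # StoppingTask that additionally stops once the best fitness reaches refValue
    def __init__(self, refValue=None, **kwargs):
        StoppingTask.__init__(self, **kwargs)
        self.refValue = refValue

    def stopCond(self):
        reached = self.refValue is not None and self.x_f <= self.refValue
        return reached or StoppingTask.stopCond(self)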

xlwt is archived: consider dropping xlwt requirement?

Is your feature request related to a problem? Please describe.
The xlwt package has been archived and will no longer receive updates.
https://github.com/python-excel/xlwt

See also:
https://groups.google.com/g/python-excel/c/P6TjJgFVjMI/m/g8d0eWxTBQAJ

Upstream suggests that one should not use these utilities, and should instead use openpyxl.

Describe the solution you'd like
Drop xlwt based functionality, or migrate to another still maintained library.

Describe alternatives you've considered
At the moment, none; we're using the latest 1.3.0 version, which NiaPy needs, but if there are bugs there, it will be hard to get them fixed now that xlwt is archived.

Additional context
NA
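
For the migration option, the equivalent write path in openpyxl is small. A minimal sketch (the sheet name, file name, and data are illustrative):

from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws.title = "results"
ws.append(["algorithm", "best fitness"])  # header row
ws.append(["GreyWolfOptimizer", 0.0012])  # example data row
wb.save("export_example.xlsx")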

Counting evaluations

The following algorithms may have problems with counting the number of evaluations:

  • Artificial Bee Colony,
  • Differential Evolution,
  • Self Adaptive Differential Evolution.

FSS implementation

Hi @kb2623,

FSS implementation is not working. Please check it out.

I am attaching the traceback:

Traceback (most recent call last):
  File "run_fss.py", line 17, in <module>
    best = algo.run(task=task)
  File "../NiaPy/algorithms/algorithm.py", line 312, in run
    r = self.runTask(task)
  File "../NiaPy/algorithms/algorithm.py", line 292, in runTask
    xb, fxb = next(algo)
  File "../NiaPy/algorithms/algorithm.py", line 268, in runYield
    pop, fpop, dparams = self.initPopulation(task)
  File "../NiaPy/algorithms/basic/fss.py", line 237, in initPopulation
    curr_step_individual, curr_step_volitive, curr_weight_school, prev_weight_school, school = self.init_school(task)
  File "../NiaPy/algorithms/basic/fss.py", line 132, in init_school
    positions = self.generate_uniform_coordinates(task)
  File "../NiaPy/algorithms/basic/fss.py", line 110, in generate_uniform_coordinates
    for i in range(self.NP): x[i] = self.uniform(task.Lower[i], task.Upper[i], task.D)
IndexError: index 10 is out of bounds for axis 0 with size 10
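
From the traceback, generate_uniform_coordinates indexes task.Lower and task.Upper (vectors of length D) with the population index i (which runs up to NP), so any NP > D overflows. A sketch of the likely fix, assuming self.uniform accepts vector bounds:

def generate_uniform_coordinates(self, task):
    import numpy as np
    x = np.zeros((self.NP, task.D))
    for i in range(self.NP):
        # use the whole bound vectors instead of indexing them with the population index
        x[i] = self.uniform(task.Lower, task.Upper, task.D)
    return x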

How to plot the convergence?

Thank you for your work first. Could you tell me, is there a method to plot the convergence, with the x axis being the iteration number and the y axis the fitness function value?
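
There may be built-in logging depending on the version, but a version-agnostic sketch is to record the best-so-far fitness from inside the benchmark (this follows the custom-benchmark pattern used elsewhere in these issues, and assumes the rc-era interface where function() returns an evaluate(D, solution) callable; the choice of algorithm is arbitrary):

import matplotlib.pyplot as plt
from NiaPy.algorithms.basic import FlowerPollinationAlgorithm
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Sphere

history = []  # best fitness seen after each evaluation

class RecordingSphere(Sphere):
    def function(self):
        base = Sphere.function(self)
        def evaluate(D, sol):
            f = base(D, sol)
            history.append(min(f, history[-1]) if history else f)
            return f
        return evaluate

task = StoppingTask(D=10, nFES=10000, benchmark=RecordingSphere())
algo = FlowerPollinationAlgorithm(NP=20)
algo.run(task=task)

plt.plot(history)
plt.xlabel('evaluations')
plt.ylabel('best fitness so far')
plt.show()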

BFOA implementation

The results of the BFOA implementation are not consistent with the results found in related work. @zStupan is now working on the fix.

User defined function

# Assumed imports (not in the original snippet): Keras pieces used below
from keras.models import Sequential, load_model
from keras.layers import Conv2D, Dropout, Flatten, Dense
from keras.callbacks import ReduceLROnPlateau, ModelCheckpoint

# X_train, y_train, X_test, y_test, input_shape and numPCAcomponents are the
# poster's external data and are not defined in the snippet.

scores = []  # referenced below but not defined in the original

activation_cnn_layer = [
    'relu',
    'tanh']

activation_feed_layer = [
    'relu',
    'tanh']

optimizers = ["adam", 'sgd']

# map from real number [0, 1] to integer ranging [1, 3]

def swap_activation_cnn(val):
    if val == 1:
        return 2
    return int(val * 2 + 1)

def swap_activation_feed(val):
    if val == 1:
        return 2
    return int(val * 2 + 1)

def swap_optimzer(val):
    if val == 1:
        return 2
    return int(val * 2 + 1)

def no_of_epoch(val):
    return int(50)

# map from real number [0, 1] to integer ranging [5, 15]

def no_of_batch(val):
    return int(val * 224 + 32)

def no_of_filters(val):
    return int(val * 8 + 1)

def no_neurons_hidden_layer(val):
    return int(val * 64 + 8)

def kernel_size(val):
    return int(val * 1 + 10)

def dropout(val):
    return int(val * 1 + 0.1)  # note: int() truncates this to 0 for val < 0.9

class cnnbenchmark():
    def __init__(self):
        self.Lower = 0
        self.Upper = 1

    def function(self):
        # our definition of fitness function
        def evaluate(D, solution):
            acc_cnn_layer = activation_cnn_layer[swap_activation_cnn(solution[0] - 1)]
            print(acc_cnn_layer)
            acc_feed_layer = activation_feed_layer[swap_activation_feed((solution[1]) - 1)]
            optimizer = optimizers[(swap_optimzer(solution[2]) - 1)]
            epochs = no_of_epoch(solution[3])
            batch = no_of_batch(solution[4])
            filters = no_of_filters(solution[5])
            neurons = no_neurons_hidden_layer(solution[6])
            kernel_s = kernel_size(solution[7])
            drop_neurons = dropout(solution[8])

            accuracy = 1 - model.model_build(acc_cnn_layer, acc_feed_layer, optimizer, epochs, batch, filters, neurons, kernel_s, drop_neurons)
            scores.append([accuracy, acc_cnn_layer, acc_feed_layer, optimizer, epochs, batch, filters, neurons, kernel_s, drop_neurons])
            return accuracy
        return evaluate

class model():
    @staticmethod  # called on the class itself below, so no self parameter
    def model_build(acc_cnn_layer, acc_feed_layer,
                    optimizer, epochs, batch, filters, neurons, kernel_s, drop_neurons):
        print("Activation cnn layer :", acc_cnn_layer)
        print("Activation feed layer :", acc_feed_layer)
        print("optimizer :", optimizer)
        print("batch :", batch)
        print("epochs :", epochs)
        print("dropout :", drop_neurons)
        print("no of filters :", filters)
        model = Sequential()
        model.add(Conv2D(64, (3, 3), activation=acc_cnn_layer, input_shape=input_shape))
        model.add(Conv2D(128, (3, 3), activation=acc_cnn_layer))
        model.add(Dropout(drop_neurons))
        model.add(Flatten())
        model.add(Dense(6 * numPCAcomponents, activation=acc_feed_layer))
        model.add(Dropout(drop_neurons))
        model.add(Dense(16, activation='softmax'))

        # Define optimization and train method
        reduce_lr = ReduceLROnPlateau(monitor='val_acc', factor=0.9, patience=25,
                                      min_lr=0.000001, verbose=1)
        checkpointer = ModelCheckpoint(filepath="checkpoint.hdf5", verbose=1,
                                       save_best_only=False)
        # sgd = SGD(lr=0.001, decay=1e-6, momentum=0.9, nesterov=True)
        model.compile(loss='categorical_crossentropy', optimizer=optimizer,
                      metrics=['accuracy'])
        # Start to train model
        history = model.fit(X_train, y_train,
                            batch_size=batch,
                            epochs=epochs,
                            verbose=1,
                            validation_data=(X_test, y_test),
                            callbacks=[reduce_lr, checkpointer],
                            shuffle=True)

        model.save('HSI_model_epochs100.h5')
        model = load_model('HSI_model_epochs100.h5')
        score = model.evaluate(X_test, y_test, batch_size=32)
        Test_Loss = score[0]
        Test_accuracy = score[1]

        return Test_accuracy

I want to find the optimal hyperparameters of the above deep learning model using the Grey Wolf Optimizer. Can you please help me with this process using the NiaPy library?
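
A sketch of wiring the benchmark above into the Grey Wolf Optimizer, assuming the rc-era API used in the other issues here (D=9 matches the nine solution components read in evaluate; the budget is illustrative, since every evaluation trains a CNN):

from NiaPy.algorithms.basic import GreyWolfOptimizer
from NiaPy.task import StoppingTask

task = StoppingTask(D=9, nFES=50, benchmark=cnnbenchmark())
algo = GreyWolfOptimizer(NP=10)
best = algo.run(task=task)  # assumed to return the best solution and its fitness
print(best)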

Documentation improvements

Documentation improvements (incomplete documentation):

Algorithms basic:

  • Genetic Algorithm
  • EvolutionStrategy1p1
    • Missing reference URL
    • Missing reference paper
  • EvolutionStrategyMp1
    • Missing reference URL
    • Missing reference paper
  • EvolutionStrategyMpL
    • Missing reference URL
    • Missing reference paper
  • EvolutionStrategyML
    • Missing reference URL
    • Missing reference paper
  • KrillHerdV11

Algorithms other:

  • SimulatedAnnealing
    • Missing reference URL
    • Missing reference paper
  • HillClimbAlgorithm
    • Missing reference URL
    • Missing reference paper

New algorithms

The following algorithms are on the wish list for the next release:

  • Harmony search

  • Simulated annealing

  • Gravitational search

  • Krill herd

  • Glowworm swarm optimization

  • Cuckoo search

Hybrid Bat Algorithm coding mistake?

Hello everyone, and thanks for the great library.

So I've been tinkering with the source code and I've come across this:

hba.py Line 58:
if Fitness[i] <= f_new and self.rand() < self.A: Sol[i], Fitness[i] = S, f_new

This leads to accepting worse solutions. I think it should be:
if f_new <= Fitness[i] and self.rand() < self.A: Sol[i], Fitness[i] = S, f_new

The same condition appears on ba.py line 80:
if (Fnew <= Fitness[i]) and (self.rand() < self.A): Sol[i], Fitness[i] = S[i], Fnew

What do you think?

In addition to this, I was checking the HBA paper.
I've noticed that there are also loudness and pulse emission rates for each bat, and they change over iterations. But here I see only one loudness and one pulse emission rate for the population as a whole.

If you think like this as well, I can try to create a PR about it.

Thank you again!
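
For reference, a hedged sketch of the per-bat updates described in Yang's bat algorithm papers: each bat i keeps its own loudness A[i] and pulse rate r[i], updated after an accepted solution (alpha and gamma are constants, often both 0.9; r0 holds the initial pulse rates):

import numpy as np

def update_bat(A, r, r0, i, t, alpha=0.9, gamma=0.9):
    A[i] = alpha * A[i]                      # loudness decays after an accepted solution
    r[i] = r0[i] * (1 - np.exp(-gamma * t))  # pulse rate grows toward its initial value
    return A, r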

jDE runs without stopping

Steps to reproduce

Run jDE using StoppingTask against any benchmark.

Expected behavior

It is expected that the execution of the algorithm will terminate when the stopping condition is reached.

Actual behavior

Runs infinitely.

System configuration

OS: Linux, Mac OS (tested on those, but the issue is probably not dependent on the OS).

Python version:
Any of the supported versions.

module not found error

Actual behavior


ModuleNotFoundError Traceback (most recent call last)
<ipython-input> in <module>()
1 import random
2 from NiaPy.algorithms.basic import FlowerPollinationAlgorithm
----> 3 from NiaPy.task.task import StoppingTask, OptimizationType
4 from NiaPy.benchmarks import Sphere
5

ModuleNotFoundError: No module named 'NiaPy.task'

JOSS paper

The following are tasks which should be done before submitting the paper:

  • Auto generate API docs from code comments
  • Publish auto generated docs
  • Update benchmarks' comments to suit Sphinx auto documentation generation
  • Refactor benchmarks' utility class
  • Documentation - add a list of used dependencies
  • Documentation - Contribution guidelines; add a section on how to seek support
  • Example usage in documentation
  • Core API tests (for Algorithms)
  • Core API documentation (for Algorithms)
  • GitHub and PyPI version release
  • Write paper (see example paper)

[JOSS] Comment on existing libraries/frameworks

There are already implementations of nature-inspired algorithms/frameworks in Python. To name a few, we have DEAP, Evolopy, and Inspyred. It may be worth acknowledging some of these frameworks.

Furthermore, it may also be necessary to expound upon how NiaPy differs from other libraries: does it solve a different problem or use a different software architecture? You can leverage the algorithms you've implemented that don't exist yet in other works, and comment on NiaPy in the context of a "white-box" or "black-box" approach to software implementation (ref1, ref2).


This is an issue related to openjournals/joss-reviews#613

BAT

Hello. I tried to use the BAT Optimization Algorithm and I expected to get approximately 0 with benchmark functions such as Rosenbrock and Ackley and 66000 iterations, but the result I get is much higher. I wanted to know: is it because of the code, or the algorithm itself?

Testing the algorithms

Most of the algorithms were written from scratch. For that reason, it would be very important that we perform more tests and code checks. More eyes can see more problems.

No module named 'NiaPy.task'

Hi,

I tried to install this package in Google Colab, but when I try a specific algorithm like FPA, I get an error like this:

No module named 'NiaPy.task'

Here is the code that I'm trying:

from NiaPy.algorithms.basic import FlowerPollinationAlgorithm
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Sphere

Hope you can help, thanks!
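
A likely cause, hedged: the stable PyPI release at the time did not ship the NiaPy.task module, which arrived with the 2.0.0 release candidates (the conda issue above already pins NiaPy >= 2.0.0rc11). Installing a pre-release in a Colab cell should make the import work:

!pip install "NiaPy>=2.0.0rc11"  # any pre-release that ships NiaPy.task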

Is it possible to manually initialize the population?

Is there a possibility to set an initial population manually or define a function which generates an individual from scratch?

I guess it would mean modifying NiaPy.algorithms.Individual.generateSolution, but it should only be replaced for the initial solution candidates. Has anyone tried this?

My problem is binary, but feasible candidates have a high 0:1 ratio; I suspect random initialization generates roughly equal amounts of 0s and 1s (rounded from floats).
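
One workaround sketch, instead of patching Individual.generateSolution globally: subclass the algorithm and override initPopulation. The (pop, fpop, dparams) return shape is taken from the tracebacks in these issues; whether dparams stays consistent with the replaced population depends on the algorithm, so treat this as a starting point only.

import numpy as np
from NiaPy.algorithms.basic import GreyWolfOptimizer

class SeededGWO(GreyWolfOptimizer):
    def initPopulation(self, task):
        # keep whatever algorithm-specific state the parent sets up
        pop, fpop, dparams = GreyWolfOptimizer.initPopulation(self, task)
        # replace the random start with a sparse binary population (~10% ones)
        pop = (np.random.rand(self.NP, task.D) < 0.1).astype(float)
        fpop = np.apply_along_axis(task.eval, 1, pop)
        return pop, fpop, dparams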

Add an example/guide showing how to solve a real-world problem

This is a feature request.
Include an example or a guide showing how to do e.g. feature selection using NiaPy. Not all potential users are familiar with the underlying algorithms and implementation details, but they would still want to use this library to solve a problem (such as feature selection) using a more high-level API.
Here's a guide for feature selection from PySwarms

An example/guide could also target something else - I would just like to get an idea of how to solve a real-world problem using NiaPy, without going into details.
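
A rough sketch of what such a guide could start from, following the custom-benchmark pattern used elsewhere in these issues (the classifier, the data X/y, and the 0.5 threshold for selecting a feature are illustrative choices):

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

class FeatureSelectionBenchmark:
    def __init__(self, X, y):
        self.Lower, self.Upper = 0, 1
        self.X, self.y = X, y

    def function(self):
        def evaluate(D, solution):
            mask = np.asarray(solution) > 0.5  # a gene above 0.5 selects its feature
            if not mask.any():
                return 1.0  # worst fitness when no feature is selected
            acc = cross_val_score(KNeighborsClassifier(), self.X[:, mask], self.y, cv=3).mean()
            return 1 - acc  # minimize classification error
        return evaluate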

Algorithms checklist

Algorithms Checklist

In order to release the first stable version, NiaPy v2.0.0, we have to go through all of the implemented algorithms to check for any inconsistencies or bugs.

The following checklist will help us monitor the progress of reviews of algorithms.

For each reviewed algorithm, it would be best to prepare a separate pull request, for easier review of the submitted code, with the following checklist included in the PR description. Please copy the following snippet, replace "<algorithm_name>" with the name of the reviewed algorithm, and check every bullet point in the checklist. Also, please refer to this issue in the description of the PR.

# <algorithm_name> review

- [ ] Check parameters list (based on the reference implementation/paper)
- [ ] Check naming of the parameters (parameter names must be consistent throughout the framework)
- [ ] Check documentation (are all the parameters described, is general documentation of the paper complete)
- [ ] Check function naming (is it consistent)
- [ ] Algorithm does not include any TODOs
- [ ] Check support for manual initialization of the population
- [ ] Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
- [ ] Check if a simple example of running the algorithm is present (For the structure of the examples, please follow [this one](https://github.com/NiaOrg/NiaPy/blob/master/examples/run_de.py).)

Overall Algorithms Checklist

Here is the list of all algorithm reviews:

Bat Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Firefly Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Crowding Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Aging Np Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Strategy Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Multi Strategy Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Mutations review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Aging Np Multi Mutation Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Flower Pollination Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Grey Wolf Optimizer review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Genetic Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Artificial Bee Colony Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Particle Swarm Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Bare Bones Fireworks Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Camel Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Monkey King Evolution V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Monkey King Evolution V2 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Monkey King Evolution V3 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Evolution Strategy1p1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Evolution Strategy Mp1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Evolution Strategy Mp L review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Evolution Strategy M L review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Covariance Matrix Adaption Evolution Strategy review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Sine Cosine Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Glowworm Swarm Optimization review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Glowworm Swarm Optimization V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Glowworm Swarm Optimization V2 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Glowworm Swarm Optimization V3 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Harmony Search review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Harmony Search V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V2 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V3 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V4 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V11 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Fireworks Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Enhanced Fireworks Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dynamic Fireworks Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dynamic Fireworks Algorithm Gauss review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Gravitational Search Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Moth Flame Optimizer review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Fish School Search review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Cuckoo Search review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Coral Reefs Optimization review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Forest Optimization Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Hybrid Bat Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Differential Evolution M T S review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Differential Evolution M T Sv1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Differential Evolution M T S review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Differential Evolution M T Sv1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Strategy Differential Evolution M T S review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Strategy Differential Evolution M T Sv1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Multi Strategy Differential Evolution M T S review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Multi Strategy Differential Evolution M T Sv1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Self Adaptive Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Self Adaptive Differential Evolution Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Strategy Self Adaptive Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Multi Strategy Self Adaptive Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Nelder Mead Method review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Hill Climb Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Simulated Annealing review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multiple Trajectory Search review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multiple Trajectory Search V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

MTS_LS1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

MTS_LS2 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

MTS_LS3 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

MTS_LS1v1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

MTS_LS3v1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Anarchic Society Optimization review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected? see the sketch after this list)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)
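
For reference, below is a minimal sketch of the Sphere performance check these reviews call for, reusing the StoppingTask pattern from the bug reports further down; GreyWolfOptimizer merely stands in for the algorithm under review:

# encoding=utf8
# Minimal sketch of a Sphere sanity check for a reviewed algorithm.
# GreyWolfOptimizer is only a placeholder for the algorithm under review.
from NiaPy.algorithms.basic import GreyWolfOptimizer
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Sphere

# run a few independent runs; the best fitness should approach 0 on Sphere
for i in range(5):
    task = StoppingTask(D=10, nFES=10000, benchmark=Sphere())
    algo = GreyWolfOptimizer(NP=40)
    best = algo.run(task)
    print(best)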

OptimizationType.MAXIMIZATION does not work with GWO

Hi!

OptimizationType.MAXIMIZATION does not work when used. I tried it with GWO, but it also seems to be buggy with other algorithms.

Steps to reproduce

requirements.txt

NiaPy==2.0.0rc5

Code - taken from here; only the optimization type was changed

# encoding=utf8
# This is temporary fix to import module from parent folder
# It will be removed when package is published on PyPI
import sys
sys.path.append('../')
# End of fix

from NiaPy.algorithms.basic import GreyWolfOptimizer
from NiaPy.task import StoppingTask, OptimizationType
from NiaPy.benchmarks import Sphere

# we will run Grey Wolf Optimizer for 5 independent runs
for i in range(5):
    task = StoppingTask(D=10, nFES=10000, optType=OptimizationType.MAXIMIZATION, benchmark=Sphere())
    algo = GreyWolfOptimizer(NP=40)
    best = algo.run(task)
    print(best)

Expected behavior

It should not throw any error.

Actual behavior

TypeError                                 Traceback (most recent call last)
<ipython-input-11-bed1e1b95f2d> in <module>
     14     task = StoppingTask(D=10, nFES=10000, optType=OptimizationType.MAXIMIZATION, benchmark=Sphere())
     15     algo = GreyWolfOptimizer(NP=40)
---> 16     best = algo.run(task)
     17     print(best)

/opt/conda/lib/python3.7/site-packages/NiaPy/algorithms/algorithm.py in run(self, task)
    346         try:
    347             # task.start()
--> 348             r = self.runTask(task)
    349             return r[0], r[1] * task.optType.value
    350         except (FesException, GenException, TimeException, RefException):

/opt/conda/lib/python3.7/site-packages/NiaPy/algorithms/algorithm.py in runTask(self, task)
    326         algo, xb, fxb = self.runYield(task), None, inf
    327         while not task.stopCond():
--> 328             xb, fxb = next(algo)
    329             task.nextIter()
    330         return xb, fxb

/opt/conda/lib/python3.7/site-packages/NiaPy/algorithms/algorithm.py in runYield(self, task)
    306         yield xb, fxb
    307         while True:
--> 308             pop, fpop, dparams = self.runIteration(task, pop, fpop, xb, fxb, **dparams)
    309             xb, fxb = self.getBest(pop, fpop, xb, fxb)
    310             yield xb, fxb

/opt/conda/lib/python3.7/site-packages/NiaPy/algorithms/basic/gwo.py in runIteration(self, task, pop, fpop, xb, fxb, A, A_f, B, B_f, D, D_f, **dparams)
    109                 for i, w in enumerate(pop):
    110                         A1, C1 = 2 * a * self.rand(task.D) - a, 2 * self.rand(task.D)
--> 111                         X1 = A - A1 * fabs(C1 * A - w)
    112                         A2, C2 = 2 * a * self.rand(task.D) - a, 2 * self.rand(task.D)
    113                         X2 = B - A2 * fabs(C2 * B - w)

TypeError: unsupported operand type(s) for *: 'float' and 'NoneType'

System configuration

Using docker jupyter/scipy-notebook.
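
Until the sign handling is fixed, a possible workaround is to keep the default MINIMIZATION type and negate the objective yourself. Below is a sketch assuming the custom Benchmark pattern from the NiaPy docs; NegatedSphere is a hypothetical helper, not part of the library:

# encoding=utf8
# Workaround sketch: maximizing f(x) is equivalent to minimizing -f(x).
from NiaPy.algorithms.basic import GreyWolfOptimizer
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Benchmark

class NegatedSphere(Benchmark):
    # hypothetical helper: evaluates -f(x) so the (working) minimization
    # path effectively maximizes f(x)
    def __init__(self):
        Benchmark.__init__(self, -5.12, 5.12)

    def function(self):
        def evaluate(D, sol):
            return -sum(sol[i] ** 2 for i in range(D))
        return evaluate

task = StoppingTask(D=10, nFES=10000, benchmark=NegatedSphere())
algo = GreyWolfOptimizer(NP=40)
x, fx = algo.run(task)
print(x, -fx)  # negate the fitness back to obtain the maximized value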

Possible issue with unit test

Steps to reproduce

The following error occurs randomly when running the tests on GitHub Actions:

================================== FAILURES ===================================
______________________ CSTestCase.test_custom_works_fine ______________________

self = <NiaPy.tests.test_cs.CSTestCase testMethod=test_custom_works_fine>

    def test_custom_works_fine(self):
    	cs_custom = self.algo(NP=20, seed=self.seed)
    	cs_customc = self.algo(NP=20, seed=self.seed)
>   	AlgorithmTestCase.test_algorithm_run(self, cs_custom, cs_customc, MyBenchmark())

NiaPy\tests\test_cs.py:13: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
NiaPy\tests\test_algorithm.py:325: in test_algorithm_run
    self.assertAlmostEqual(task1.x_f, x[1], msg='While running the algorithm, algorithm got better individual with fitness: %s' % task1.x_f)
E   AssertionError: While running the algorithm, algorithm got better individual with fitness: 233.3773173118443

@kb2623 since you are the most familiar with the tests, could you take a look at it?
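
For context, the failing assertion checks that the best fitness returned by run() matches the best fitness the task recorded during that same run (task.x_f); the random failures mean the algorithm occasionally evaluates a better individual than the one it finally returns. A minimal sketch of that property, using CuckooSearch as in test_cs.py:

# encoding=utf8
# Sketch of the consistency property behind the flaky assertion.
from NiaPy.algorithms.basic import CuckooSearch
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Sphere

task = StoppingTask(D=10, nFES=1000, benchmark=Sphere())
algo = CuckooSearch(NP=20, seed=1234)
x = algo.run(task)  # x = (best solution, best fitness)
# the task-recorded best must equal the fitness run() returned
assert abs(task.x_f - x[1]) < 1e-7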

No module named 'NiaPy.algorithms'

Steps to reproduce

I wanted to implement ABC in Jupyter. When I ran the following command:
'from NiaPy.algorithms.basic import ArtificialBeeColonyAlgorithm'
I got a ModuleNotFoundError: No module named 'NiaPy.algorithms'
Do you have any suggestions for how to fix it?

System configuration

OS: Windows

Python version: 3.7
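
A common cause of this on Windows/Jupyter setups is that NiaPy was installed into a different Python interpreter than the one the notebook kernel runs. A sketch of how to check (the commented pip line is meant for a notebook cell):

import sys
print(sys.executable)  # the interpreter the Jupyter kernel is actually using

# run this in a notebook cell to install into that same interpreter:
# !{sys.executable} -m pip install NiaPy

import NiaPy  # should now succeed without ModuleNotFoundError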

FPA implementation

Hi @kb2623,

it seems that your FPA implementation is not working correctly. Every run returns exactly the global optimum of the Sphere benchmark (all zeros with fitness 0.0), which is suspicious:

(array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]), 0.0)
(array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]), 0.0)
(array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]), 0.0)
(array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]), 0.0)

Please check it out.

Thanks.

Logger

The logger does not work when optType is set to MAXIMIZATION.

It outputs "inf". This is possibly related to the MAXIMIZATION issue above: the best fitness starts out as inf (see the traceback there) and apparently never gets updated.
