niaorg / niapy

Python microframework for building nature-inspired algorithms. Official docs: https://niapy.org

Home Page: https://niapy.org

License: MIT License

Languages: Python 99.66%, TeX 0.34%
Topics: python, microframework, nature-inspired-algorithms, swarm-intelligence, optimization-algorithms, hacktoberfest

niapy's Introduction

NiaPy



Nature-inspired algorithms are a very popular tool for solving optimization problems. Numerous variants of nature-inspired algorithms have been developed (paper 1, paper 2) since the beginning of their era. To prove their versatility, they have been tested in various domains and applications, especially in hybridized, modified or adapted form. However, implementing nature-inspired algorithms can be a difficult, complex and tedious task. To break down this wall, NiaPy is intended for simple and quick use, without spending time implementing algorithms from scratch.

Mission

Our mission is to build a collection of nature-inspired algorithms and create a simple interface for managing the optimization process. NiaPy offers:

  • numerous optimization problem implementations,
  • a simple interface for using various nature-inspired algorithms without struggle and effort,
  • easy comparison between nature-inspired algorithms, and
  • export of results in various formats such as Pandas DataFrame, JSON or even Excel (see the sketch below).
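
As a quick illustration of the last point, per-run results can be gathered and exported with plain pandas. This is a minimal sketch, not NiaPy's own export helper: the Task and algorithm calls mirror the usage examples below, 'sphere' is one of the built-in problems, and the DataFrame handling is ordinary pandas.

import pandas as pd

from niapy.algorithms.basic import ParticleSwarmAlgorithm
from niapy.task import Task

# collect the best fitness of a few independent runs
records = []
for run in range(5):
    task = Task(problem='sphere', dimension=10, max_evals=5000)
    algorithm = ParticleSwarmAlgorithm(population_size=50)
    _, best_fitness = algorithm.run(task)
    records.append({'run': run, 'best_fitness': best_fitness})

df = pd.DataFrame(records)   # Pandas DataFrame
df.to_json('results.json')   # JSON
df.to_excel('results.xlsx')  # Excel (requires an engine such as openpyxl)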

Installation

Install NiaPy with pip:

pip install niapy

To install NiaPy with conda, use:

conda install -c niaorg niapy

To install NiaPy on Fedora, use:

dnf install python3-niapy

To install NiaPy on Arch Linux, please use an AUR helper:

yay -Syyu python-niapy

To install NiaPy on Alpine Linux, please enable the Community repository and use:

apk add py3-niapy

To install NiaPy on NixOS, please use:

nix-env -iA nixos.python310Packages.niapy

To install NiaPy on Void Linux, use:

xbps-install -S python3-niapy

Install from source

In case you want to install directly from the source code, use:

pip install git+https://github.com/NiaOrg/NiaPy.git

Algorithms

Click here for the list of implemented algorithms.

Problems

Click here for the list of implemented test problems.

Usage

After installation, you can import NiaPy as any other Python module:

$ python
>>> import niapy
>>> niapy.__version__

Let's go through a basic and advanced example.

Basic Example

Let's say we want to try out PSO on the Pintér problem. First, we create a new file named, for example, basic_example.py. Then we import the chosen algorithm from NiaPy so we can use it. Afterwards we initialize a ParticleSwarmAlgorithm class instance and run the algorithm. Given below is the complete source code of the basic example.

from niapy.algorithms.basic import ParticleSwarmAlgorithm
from niapy.task import Task

# run 10 independent repetitions of weighted, velocity-clamped PSO on the Pinter problem
for i in range(10):
    task = Task(problem='pinter', dimension=10, max_evals=10000)
    algorithm = ParticleSwarmAlgorithm(population_size=100, w=0.9, c1=0.5, c2=0.3, min_velocity=-1, max_velocity=1)
    best_x, best_fit = algorithm.run(task)
    print(best_fit)

This example can be run with the command python basic_example.py and should give output similar to the following:

0.008773534890863646
0.036616190934621755
186.75116812592546
0.024186452828927896
263.5697469837348
45.420706924365916
0.6946753611091367
7.756100204780568
5.839673314425907
0.06732518679742806
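
Since the algorithm is stochastic, your output will differ between runs. For repeatable results, a seed can be passed to the algorithm constructor; a minimal sketch, assuming the seed parameter exposed by NiaPy's common Algorithm interface:

from niapy.algorithms.basic import ParticleSwarmAlgorithm
from niapy.task import Task

# with a fixed seed, re-running the script yields the same best fitness
task = Task(problem='pinter', dimension=10, max_evals=10000)
algorithm = ParticleSwarmAlgorithm(population_size=100, seed=42)
best_x, best_fit = algorithm.run(task)
print(best_fit)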

Advanced Example

In this example we will show how to implement a custom problem class and use it with any of the implemented algorithms. First, let's create a new file named advanced_example.py. As in the previous example, we will import the algorithm we want to use from the niapy module.

For our custom optimization function, we have to create a new class. Let's name it MyProblem. In the initialization method of the MyProblem class we have to set the dimension and the lower and upper bounds of the problem. Afterwards we have to override the abstract method _evaluate, which takes a parameter x, the solution to be evaluated, and returns the function value. We should now have something similar to the code snippet shown below.

import numpy as np
from niapy.task import Task
from niapy.problems import Problem
from niapy.algorithms.basic import ParticleSwarmAlgorithm


# our custom problem class
class MyProblem(Problem):
    def __init__(self, dimension, lower=-10, upper=10, *args, **kwargs):
        super().__init__(dimension, lower, upper, *args, **kwargs)

    def _evaluate(self, x):
        # sphere function: sum of squared components
        return np.sum(x ** 2)

Now all we have to do is initialize our algorithm as in the previous examples and pass an instance of our MyProblem class as the problem argument.

my_problem = MyProblem(dimension=20)
for i in range(10):
    task = Task(problem=my_problem, max_iters=100)
    algo = ParticleSwarmAlgorithm(population_size=100, w=0.9, c1=0.5, c2=0.3, min_velocity=-1, max_velocity=1)

    # running algorithm returns best found minimum
    best_x, best_fit = algo.run(task)
    # printing best minimum
    print(best_fit)

Now we can run our advanced example with the following command: python advanced_example.py. The results should be similar to those below.

0.002455614050761476
0.000557652972392164
0.0029791325679865413
0.0009443595274525336
0.001012658824492069
0.0006837236892816072
0.0026789725774685495
0.005017746993004601
0.0011654473402322196
0.0019074442166293853

For more usage examples, please look at the examples folder.

More advanced examples can also be found in the NiaPy-examples repository.

Cite us

Are you using NiaPy in your project or research? Please cite us!

Plain format

      Vrbančič, G., Brezočnik, L., Mlakar, U., Fister, D., & Fister Jr., I. (2018).
      NiaPy: Python microframework for building nature-inspired algorithms.
      Journal of Open Source Software, 3(23), 613. https://doi.org/10.21105/joss.00613

Bibtex format

    @article{NiaPyJOSS2018,
        author  = {Vrban{\v{c}}i{\v{c}}, Grega and Brezo{\v{c}}nik, Lucija
                  and Mlakar, Uro{\v{s}} and Fister, Du{\v{s}}an and {Fister Jr.}, Iztok},
        title   = {{NiaPy: Python microframework for building nature-inspired algorithms}},
        journal = {{Journal of Open Source Software}},
        year    = {2018},
        volume  = {3},
        issue   = {23},
        issn    = {2475-9066},
        doi     = {10.21105/joss.00613},
        url     = {https://doi.org/10.21105/joss.00613}
    }

RIS format

    TY  - JOUR
    T1  - NiaPy: Python microframework for building nature-inspired algorithms
    AU  - Vrbančič, Grega
    AU  - Brezočnik, Lucija
    AU  - Mlakar, Uroš
    AU  - Fister, Dušan
    AU  - Fister Jr., Iztok
    PY  - 2018
    JF  - Journal of Open Source Software
    VL  - 3
    IS  - 23
    DO  - 10.21105/joss.00613
    UR  - http://joss.theoj.org/papers/10.21105/joss.00613

Contributors ✨

Thanks goes to these wonderful people (emoji key):


Grega Vrbančič

💻 📖 🐛 💡 🚧 📦 📆 👀

firefly-cpp

💻 📖 🐛 💡 👀 💬 ⚠️ 📦

Lucija Brezočnik

💻 📖 🐛 💡

mlaky88

💻 📖 💡

rhododendrom

💻 📖 💡 🐛 👀

Klemen

💻 📖 💡 🐛 👀

Jan Popič

💻 📖 💡

Luka Pečnik

💻 📖 💡 🐛

Jan Banko

💻 📖 💡

RokPot

💻 📖 💡

mihaelmika

💻 📖 💡

Jace Browning

💻

Musa Adamu Wakili

💬

Florian Schaefer

🤔

Jan-Hendrik Menke

💬

brett18618

💬

Timotej Zaťko

🐛

sisco0

💻

zStupan

💻 🐛 📖 💡 ⚠️

Tomáš Hrnčiar

💻

Ikko Ashimine

💻

andrazperson

💻

Oromion

📦

This project follows the all-contributors specification. Contributions of any kind are welcome!

Contributing

We encourage you to contribute to NiaPy! Please check out the Contributing to NiaPy guide for guidelines about how to proceed.

Everyone interacting in NiaPy's codebases, issue trackers, chat rooms and mailing lists is expected to follow the NiaPy code of conduct.

License

This package is distributed under the MIT License. This license can be found online at http://www.opensource.org/licenses/MIT.

Disclaimer

This framework is provided as-is, and there are no guarantees that it fits your purposes or that it is bug-free. Use it at your own risk!


niapy's Issues

Logger

The logger does not work when optType is set to "MAXIMIZATION"; it outputs "inf".
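
For reference, a minimal sketch of the setup this issue describes, assuming the NiaPy 2.x API in which the optimization type and logging are configured on the Task (the issue itself quotes the older optType spelling):

from niapy.algorithms.basic import ParticleSwarmAlgorithm
from niapy.task import Task, OptimizationType

# a maximization task with logging of improvements enabled
task = Task(problem='sphere', dimension=10, max_evals=1000,
            optimization_type=OptimizationType.MAXIMIZATION,
            enable_logging=True)
best_x, best_fit = ParticleSwarmAlgorithm(population_size=40).run(task)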

Algorithms checklist

Algorithms Checklist

In order to release the first stable version of NiaPy (v2.0.0), we have to go through all of the implemented algorithms and check for any inconsistencies or bugs.

The following checklist will help us monitor the progress of reviews of algorithms.

For each reviewed algorithm, it would be best to prepare a separate pull request, for easier reviewing of the submitted code, with the following checklist included in the PR description. Please copy the snippet below, replace "<algorithm_name>" with the name of the reviewed algorithm and check every bullet point in the checklist. Please also refer to this issue in the PR description.

# <algorithm_name> review

- [ ] Check parameters list (based on the reference implementation/paper)
- [ ] Check naming of the parameters (parameter names must be consistent throughout the framework)
- [ ] Check documentation (are all the parameters described, is the general documentation of the paper complete)
- [ ] Check function naming (is it consistent)
- [ ] Algorithm does not include any TODOs
- [ ] Check support for manual initialization of the population
- [ ] Check performance of the algorithm against the Sphere benchmark (does it perform as expected?)
- [ ] Check if a simple example of running the algorithm is present (For the structure of the examples, please follow [this one](https://github.com/NiaOrg/NiaPy/blob/master/examples/run_de.py).)

Overall Algorithms Checklist

Here is the list of all algorithm reviews:

Bat Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Firefly Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Crowding Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Aging Np Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Strategy Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Multi Strategy Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

multi Mutations review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Aging Np Multi Mutation Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Flower Pollination Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Grey Wolf Optimizer review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Genetic Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Artificial Bee Colony Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Particle Swarm Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Bare Bones Fireworks Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Camel Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Monkey King Evolution V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Monkey King Evolution V2 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Monkey King Evolution V3 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Evolution Strategy1p1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Evolution Strategy Mp1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Evolution Strategy Mp L review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Evolution Strategy M L review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Covariance Matrix Adaption Evolution Strategy review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Sine Cosine Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Glowworm Swarm Optimization review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Glowworm Swarm Optimization V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Glowworm Swarm Optimization V2 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Glowworm Swarm Optimization V3 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Harmony Search review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Harmony Search V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V2 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V3 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V4 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Krill Herd V11 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Fireworks Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Enhanced Fireworks Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dynamic Fireworks Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dynamic Fireworks Algorithm Gauss review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Gravitational Search Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Moth Flame Optimizer review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Fish School Search review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Cuckoo Search review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Coral Reefs Optimization review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Forest Optimization Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Hybrid Bat Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Differential Evolution M T S review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Differential Evolution M T Sv1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Differential Evolution M T S review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Differential Evolution M T Sv1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Strategy Differential Evolution M T S review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Strategy Differential Evolution M T Sv1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Multi Strategy Differential Evolution M T S review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Multi Strategy Differential Evolution M T Sv1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Self Adaptive Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Self Adaptive Differential Evolution Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multi Strategy Self Adaptive Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Dyn Np Multi Strategy Self Adaptive Differential Evolution review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Nelder Mead Method review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Hill Climb Algorithm review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Simulated Annealing review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multiple Trajectory Search review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Multiple Trajectory Search V1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

M T S_ L S1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

M T S_ L S2 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include and TODO's
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

MTS_LS3 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

MTS_LS1v1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

MTS_LS3v1 review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

Anarchic Society Optimization review:

  • Check parameters list (based on the reference implementation/paper)
  • Check naming of the parameters (parameter names must be consistent throughout the framework)
  • Check documentation (are all the parameters described, is general documentation of the paper complete)
  • Check function naming (is it consistent)
  • Algorithm does not include any TODOs
  • Check support for manual initialization of the population
  • Check performance of the algorithm against Sphere benchmark (does it perform as expected?)
  • Check if a simple example of running the algorithm is present (For the structure of the examples, please follow this one.)

New algorithms

The following algorithms are on the wish list for the next release:

  • Harmony search

  • Simulated annealing

  • Gravitational search

  • Krill herd

  • Glowworm swarm optimization

  • Cuckoo search

Is it possible to manually initialize the population?

Is there a possibility to set an initial population manually or define a function which generates an individual from scratch?

I guess it would mean modifying NiaPy.algorithms.Individual.generateSolution, but it should only be replaced for the initial solution candidates. Has anyone tried this?

My problem is binary, but feasible candidates have a high 0:1 ratio; I suspect random initialization generates roughly equal amounts of 0s and 1s (rounded from floats).
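
One possible way to do this with the current niapy 2.x API is to pass a custom initialization function to the algorithm. A minimal sketch, assuming the initialization_function keyword accepted by Algorithm subclasses and its (population, fitness) return convention:

import numpy as np

from niapy.algorithms.basic import ParticleSwarmAlgorithm
from niapy.task import Task

def biased_binary_init(task, population_size, rng, **_kwargs):
    # Mostly-zero candidates: each gene is 1 with probability 0.1 (illustrative bias).
    population = (rng.random((population_size, task.dimension)) < 0.1).astype(float)
    fitness = np.apply_along_axis(task.eval, 1, population)
    return population, fitness

task = Task(problem='sphere', dimension=10, max_evals=10000)
algorithm = ParticleSwarmAlgorithm(population_size=50, initialization_function=biased_binary_init)
best_x, best_fit = algorithm.run(task)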

[JOSS] Comment on existing libraries/frameworks

There are already implementations of nature-inspired algorithms/frameworks in Python. To name a few, we have DEAP, Evolopy, and Inspyred. It may be worth acknowledging some of these frameworks.

Furthermore, it may also be necessary to expound upon how NiaPy differs from other libraries: does it solve a different problem or use a different software architecture? You can leverage the algorithms you've implemented that don't exist yet in other works, and comment on NiaPy in the context of a "white-box" or "black-box" approach to software implementation (ref1, ref2).


This is an issue related to openjournals/joss-reviews#613

No module named 'NiaPy.task'

Hi,

I tried to install this package in Google Colab, but when I try a specific algorithm like FPA, I get an error like this:

No module named 'NiaPy.task'

This is the code that I am trying:

from NiaPy.algorithms.basic import FlowerPollinationAlgorithm
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Sphere

Hope you can help, thanks!
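
For reference, in current niapy releases the package and module names are lowercase, StoppingTask has since been merged into Task, and benchmarks were renamed to problems, so the equivalent imports for the 2.x layout would look like this:

from niapy.algorithms.basic import FlowerPollinationAlgorithm
from niapy.task import Task
from niapy.problems import Sphere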

Documentation improvements

Documentation improvements (incomplete documentation):

Algorithms basic:

  • Genetic Algorithm
  • EvolutionStrategy1p1
    • Missing reference URL
    • Missing reference paper
  • EvolutionStrategyMp1
    • Missing reference URL
    • Missing reference paper
  • EvolutionStrategyMpL
    • Missing reference URL
    • Missing reference paper
  • EvolutionStrategyML
    • Missing reference URL
    • Missing reference paper
  • KrillHerdV11

Algorithms other:

  • SimulatedAnnealing
    • Missing reference URL
    • Missing reference paper
  • HillClimbAlgorithm
    • Missing reference URL
    • Missing reference paper

Initial Update

The bot created this issue to inform you that pyup.io has been set up on this repo.
Once you have closed it, the bot will open pull requests for updates as soon as they are available.

Documentation fix

  • Line 217 is missing a closing parenthesis for a type hint in the documentation string.
  • Lines 53 and 38 should change the type hint from list of Algorithms to List[Algorithms]. The same issue is in lines 54 and 39. If one can specify algorithms and benchmarks with strings, then the type hint should be Union[List[str], List[Algorithms]].

JOSS paper

Following are tasks, which should be done before submitting a paper:

  • Auto generate API docs from code comments
  • Publish auto generated docs
  • Update benchmark's comments to suit Sphinx auto documentation generation
  • Refactor benchmark's utility class
  • Documentation - add a list of used dependencies
  • Documentation - Contribution guidelines; add a section on how to seek support
  • Example usage in documentation
  • Core API tests (for Algorithms)
  • Core API documentation (for Algorithms)
  • GitHub and PyPI version release
  • Write paper (see example paper)

OptimizationType.MAXIMIZATION does not work with GWO

Hi!

OptimizationType.MAXIMIZATION does not work. I tried it with GWO, but it seems to be buggy with other algorithms as well.

Steps to reproduce

requirements.txt

NiaPy==2.0.0rc5

Code - taken from here, changed only the optimization type

# encoding=utf8
# This is temporary fix to import module from parent folder
# It will be removed when package is published on PyPI
import sys
sys.path.append('../')
# End of fix

from NiaPy.algorithms.basic import GreyWolfOptimizer
from NiaPy.task import StoppingTask, OptimizationType
from NiaPy.benchmarks import Sphere

# we will run Grey Wolf Optimizer for 5 independent runs
for i in range(5):
    task = StoppingTask(D=10, nFES=10000, optType=OptimizationType.MAXIMIZATION, benchmark=Sphere())
    algo = GreyWolfOptimizer(NP=40)
    best = algo.run(task)
    print(best)

Expected behavior

It should not throw any error.

Actual behavior

TypeError                                 Traceback (most recent call last)
<ipython-input-11-bed1e1b95f2d> in <module>
     14     task = StoppingTask(D=10, nFES=10000, optType=OptimizationType.MAXIMIZATION, benchmark=Sphere())
     15     algo = GreyWolfOptimizer(NP=40)
---> 16     best = algo.run(task)
     17     print(best)

/opt/conda/lib/python3.7/site-packages/NiaPy/algorithms/algorithm.py in run(self, task)
    346         try:
    347             # task.start()
--> 348             r = self.runTask(task)
    349             return r[0], r[1] * task.optType.value
    350         except (FesException, GenException, TimeException, RefException):

/opt/conda/lib/python3.7/site-packages/NiaPy/algorithms/algorithm.py in runTask(self, task)
    326         algo, xb, fxb = self.runYield(task), None, inf
    327         while not task.stopCond():
--> 328             xb, fxb = next(algo)
    329             task.nextIter()
    330         return xb, fxb

/opt/conda/lib/python3.7/site-packages/NiaPy/algorithms/algorithm.py in runYield(self, task)
    306         yield xb, fxb
    307         while True:
--> 308             pop, fpop, dparams = self.runIteration(task, pop, fpop, xb, fxb, **dparams)
    309             xb, fxb = self.getBest(pop, fpop, xb, fxb)
    310             yield xb, fxb

/opt/conda/lib/python3.7/site-packages/NiaPy/algorithms/basic/gwo.py in runIteration(self, task, pop, fpop, xb, fxb, A, A_f, B, B_f, D, D_f, **dparams)
    109                 for i, w in enumerate(pop):
    110                         A1, C1 = 2 * a * self.rand(task.D) - a, 2 * self.rand(task.D)
--> 111                         X1 = A - A1 * fabs(C1 * A - w)
    112                         A2, C2 = 2 * a * self.rand(task.D) - a, 2 * self.rand(task.D)
    113                         X2 = B - A2 * fabs(C2 * B - w)

TypeError: unsupported operand type(s) for *: 'float' and 'NoneType'

System configuration

Using docker jupyter/scipy-notebook.

Testing the algorithms

Most of the algorithms were written from scratch. For that reason, it is very important that we perform more tests and code checks. More eyes can see more problems.

How to plot the convergence?

Thank you for your work, first of all. Could you tell me whether there is a method to plot the convergence, with the x axis being the iteration number and the y axis being the fitness function value?
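
A minimal sketch of one way to do this with the niapy 2.x API, assuming the Task convergence helpers (convergence_data/plot_convergence) behave as in recent releases:

import matplotlib.pyplot as plt

from niapy.algorithms.basic import ParticleSwarmAlgorithm
from niapy.task import Task

task = Task(problem='sphere', dimension=10, max_iters=100)
algorithm = ParticleSwarmAlgorithm(population_size=40)
algorithm.run(task)

# The task records the best fitness seen so far; plot it against iterations.
iters, fitness = task.convergence_data(x_axis='iters')
plt.plot(iters, fitness)
plt.xlabel('Iteration')
plt.ylabel('Best fitness')
plt.show()

Alternatively, task.plot_convergence() should produce the same graph in a single call.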

Add an example/guide showing how to solve a real-world problem

This is a feature request.
Include an example or a guide showing how to do, e.g., feature selection using NiaPy. Not all potential users are familiar with the underlying algorithms and implementation details, but they would still want to use this library to solve a problem (such as feature selection) through a more high-level API.
Here's a guide for feature selection from PySwarms

An example/guide could also target something else - I would just like to get an idea of how to solve a real-world problem using NiaPy, without going into details.
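
For illustration, a rough sketch of what such a guide might show, wrapping feature selection in a custom niapy 2.x Problem (the 0.5 threshold and the scikit-learn classifier are illustrative assumptions, not an official recipe):

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

from niapy.algorithms.basic import ParticleSwarmAlgorithm
from niapy.problems import Problem
from niapy.task import Task

class FeatureSelection(Problem):
    def __init__(self, X, y):
        super().__init__(dimension=X.shape[1], lower=0, upper=1)
        self.X, self.y = X, y

    def _evaluate(self, x):
        selected = x > 0.5  # each gene acts as an on/off switch for one feature
        if not selected.any():
            return 1.0  # worst possible score when no feature is selected
        score = cross_val_score(KNeighborsClassifier(), self.X[:, selected], self.y, cv=3).mean()
        return 1.0 - score  # minimize classification error

X, y = load_breast_cancer(return_X_y=True)
task = Task(problem=FeatureSelection(X, y), max_iters=30)
best_x, best_err = ParticleSwarmAlgorithm(population_size=20).run(task)
print('selected features:', np.flatnonzero(best_x > 0.5))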

No module named 'NiaPy.algorithms'

Steps to reproduce

I wanted to implement ABC in Jupyter, but when I ran the following command:
'from NiaPy.algorithms.basic import ArtificialBeeColonyAlgorithm'
I got a ModuleNotFoundError: No module named 'NiaPy.algorithms'
Do you have any suggestions on how to fix it?

System configuration

OS: Windows

Python version: 3.7

NiaPy conda dependency problem

Steps to reproduce
Include NiaPy >= 2.0.0rc11 as a dependency to a Python package and deploy it to conda using GitHub workflow.

Expected behavior
NiaPy dependency should be found.

Actual behavior

(screenshot of the failing build attached)

@GregaVrbancic could you please look into it? Is the conda version outdated?
Thank you.

Best regards,
l.

BAT

Hello. I tried to use the Bat Optimization Algorithm and expected to get approximately 0 with benchmark functions such as Rosenbrock and Ackley and 66000 iterations, but the result I get is much higher. I wanted to know whether this is because of the code or the algorithm itself.

BFOA implementation

The results of BFOA implementation are not consistent with the results found in related work. @zStupan is now working on the fix.

xlwt is archived: consider dropping xlwt requirement?

Is your feature request related to a problem? Please describe.
The xlwt library has been archived and will no longer receive updates.
https://github.com/python-excel/xlwt

See also:
https://groups.google.com/g/python-excel/c/P6TjJgFVjMI/m/g8d0eWxTBQAJ

Upstream suggests that one should not use these utils, and instead use openpyxl.

Describe the solution you'd like
Drop the xlwt-based functionality, or migrate to another, still-maintained library.

Describe alternatives you've considered
At the moment, none. We're using the latest 1.3.0 version, which is needed by NiaPy, but if there are bugs there, it will be hard to get them fixed now that xlwt is archived.

Additional context
NA
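
If the migration route is taken, the xlwt-style export could be replaced with openpyxl along these lines (a sketch only; the column layout is an illustrative assumption, not NiaPy's actual export format):

from openpyxl import Workbook

def export_to_xlsx(results, filename='results.xlsx'):
    # results: list of (algorithm_name, best_fitness) pairs -- hypothetical layout.
    wb = Workbook()
    ws = wb.active
    ws.append(['Algorithm', 'Best fitness'])
    for name, fitness in results:
        ws.append([name, fitness])
    wb.save(filename)

export_to_xlsx([('ParticleSwarmAlgorithm', 0.0123)])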

Hybrid Bat Algorithm coding mistake?

Hello everyone, and thanks for the great library.

So I've been tinkering with the source code and I've come across this:

hba.py Line 58:
if Fitness[i] <= f_new and self.rand() < self.A: Sol[i], Fitness[i] = S, f_new

This leads to accepting worse solutions. I think it should be:
if f_new <= Fitness[i] and self.rand() < self.A: Sol[i], Fitness[i] = S, f_new

Same condition on ba.py Line 80:
if (Fnew <= Fitness[i]) and (self.rand() < self.A): Sol[i], Fitness[i] = S[i], Fnew

What do you think?

In addition, I was checking the paper of the HBA.
I've noticed that there are separate loudness and pulse emission rates for each bat, and that they change every iteration. But here I see only one loudness and one pulse emission rate for the population as a whole.

If you think like this as well, I can try to create a PR about it.

Thank you again!

FPA implementation

Hi @kb2623,

it seems that your FPA implementation is not working correctly:

(array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]), 0.0)
(array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]), 0.0)
(array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]), 0.0)
(array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]), 0.0)

Please check it out.

Thanks.

module not found error

Actual behavior


ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input> in <module>()
      1 import random
      2 from NiaPy.algorithms.basic import FlowerPollinationAlgorithm
----> 3 from NiaPy.task.task import StoppingTask, OptimizationType
      4 from NiaPy.benchmarks import Sphere
      5

ModuleNotFoundError: No module named 'NiaPy.task'

Possible issue with unit test

Steps to reproduce

The following error occurs randomly when running tests on GitHub Actions:

================================== FAILURES ===================================
______________________ CSTestCase.test_custom_works_fine ______________________

self = <NiaPy.tests.test_cs.CSTestCase testMethod=test_custom_works_fine>

    def test_custom_works_fine(self):
    	cs_custom = self.algo(NP=20, seed=self.seed)
    	cs_customc = self.algo(NP=20, seed=self.seed)
>   	AlgorithmTestCase.test_algorithm_run(self, cs_custom, cs_customc, MyBenchmark())

NiaPy\tests\test_cs.py:13: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
NiaPy\tests\test_algorithm.py:325: in test_algorithm_run
    self.assertAlmostEqual(task1.x_f, x[1], msg='While running the algorithm, algorithm got better individual with fitness: %s' % task1.x_f)
E   AssertionError: While running the algorithm, algorithm got better individual with fitness: 233.3773173118443

@kb2623 since you are the most familiar with the tests, could you take a look at it?

[JOSS] Clarify set-up requirements in README and requirements.txt

  1. It would be better if the required Python modules were clarified in your README. As of now, I can only see python 3.6+ (2.7.14) and pip. But your setup.py seems to be installing:
  • numpy <= 1.14
  • click <= 6.0
  • scipy <= 1.0.0
  • xlsxwriter <= 1.0.2
  2. In addition, having setup.py install from a requirements.txt may be preferable to defining the dependencies individually inside the script. Check this out. Although I believe the Pipfile solves this, so this is optional.

  3. Lastly, it is also beneficial to separate "user requirements" and "dev requirements." The latter may include the testing libraries you've used (pytest), and sphinx for building the documentation. You can just link this to your requirements header.


This is an issue related to openjournals/joss-reviews#613

Confusion with GSO

Hi. I am really confused when implementing GSO. Why are nFES and nGEN not the same? Does it mean the fitness function is not called in every iteration? What causes that?

FSS implementation

Hi @kb2623,

FSS implementation is not working. Please check it out.

I am attaching Traceback:

Traceback (most recent call last):
  File "run_fss.py", line 17, in <module>
    best = algo.run(task=task)
  File "../NiaPy/algorithms/algorithm.py", line 312, in run
    r = self.runTask(task)
  File "../NiaPy/algorithms/algorithm.py", line 292, in runTask
    xb, fxb = next(algo)
  File "../NiaPy/algorithms/algorithm.py", line 268, in runYield
    pop, fpop, dparams = self.initPopulation(task)
  File "../NiaPy/algorithms/basic/fss.py", line 237, in initPopulation
    curr_step_individual, curr_step_volitive, curr_weight_school, prev_weight_school, school = self.init_school(task)
  File "../NiaPy/algorithms/basic/fss.py", line 132, in init_school
    positions = self.generate_uniform_coordinates(task)
  File "../NiaPy/algorithms/basic/fss.py", line 110, in generate_uniform_coordinates
    for i in range(self.NP): x[i] = self.uniform(task.Lower[i], task.Upper[i], task.D)
IndexError: index 10 is out of bounds for axis 0 with size 10

GWO TypeError: unsupported operand type(s)

The GWO implementation on random occasions raises the following TypeError:

algo.run(task=task)
  File ".../lib/python3.6/site-packages/NiaPy/algorithms/algorithm.py", line 348, in run
    r = self.runTask(task)
  File ".../lib/python3.6/site-packages/NiaPy/algorithms/algorithm.py", line 328, in runTask
    xb, fxb = next(algo)
  File ".../lib/python3.6/site-packages/NiaPy/algorithms/algorithm.py", line 308, in runYield
    pop, fpop, dparams = self.runIteration(task, pop, fpop, xb, fxb, **dparams)
  File ".../lib/python3.6/site-packages/NiaPy/algorithms/basic/gwo.py", line 115, in runIteration
    X3 = D - A3 * fabs(C3 * D - w)
TypeError: unsupported operand type(s) for *: 'float' and 'NoneType'

New mechanism for stopCond and old best values

Hello,

I've been struggling with two things recently. The first is that I cannot make the optimization stop once the best solution reaches a given minimum/maximum value; the stopping condition depends only on the function evaluation or generation count.

The other thing is that I want to see the error decreasing over iterations. Since the best values are not stored in the algorithm, we cannot do this currently.

Is there anything I'm not aware of that achieves these? I would like to help if these might be added to the library, but with some guidance, since I'm not sure how big the impact would be.

Thanks as always.
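
For reference, newer niapy releases appear to address both points: Task accepts a cutoff value as an additional stopping criterion and records the best-so-far fitness history. A minimal sketch, assuming the 2.x cutoff_value parameter and convergence_data helper:

from niapy.algorithms.basic import DifferentialEvolution
from niapy.task import Task

# Stop when max_evals is exhausted OR the best fitness reaches 1e-6 or better.
task = Task(problem='sphere', dimension=10, max_evals=100000, cutoff_value=1e-6)
algorithm = DifferentialEvolution(population_size=50)
best_x, best_fit = algorithm.run(task)

# The recorded history makes it possible to inspect the error over iterations.
iters, fitness = task.convergence_data(x_axis='iters')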

User defined function

# (imports and data such as input_shape, numPCAcomponents, X_train/y_train and
# X_test/y_test are defined elsewhere in my script)
from keras.models import Sequential, load_model
from keras.layers import Conv2D, Dense, Dropout, Flatten
from keras.callbacks import ReduceLROnPlateau, ModelCheckpoint

activation_cnn_layer = ['relu', 'tanh']

activation_feed_layer = ['relu', 'tanh']

optimizers = ['adam', 'sgd']

scores = []

# map from real number [0, 1] to integer ranging [1, 3]
def swap_activation_cnn(val):
    if val == 1:
        return 2
    return int(val * 2 + 1)

def swap_activation_feed(val):
    if val == 1:
        return 2
    return int(val * 2 + 1)

def swap_optimzer(val):
    if val == 1:
        return 2
    return int(val * 2 + 1)

def no_of_epoch(val):
    return int(50)

# map from real number [0, 1] to integer ranging [5, 15]
def no_of_batch(val):
    return int(val * 224 + 32)

def no_of_filters(val):
    return int(val * 8 + 1)

def no_neurons_hidden_layer(val):
    return int(val * 1 + 10) if False else int(val * 64 + 8)

def kernel_size(val):
    return int(val * 1 + 10)

def dropout(val):
    return int(val * 1 + 0.1)

class cnnbenchmark():
    def __init__(self):
        self.Lower = 0
        self.Upper = 1

    def function(self):
        # our definition of fitness function
        def evaluate(D, solution):
            acc_cnn_layer = activation_cnn_layer[swap_activation_cnn(solution[0] - 1)]
            print(acc_cnn_layer)
            acc_feed_layer = activation_feed_layer[swap_activation_feed(solution[1] - 1)]
            optimizer = optimizers[swap_optimzer(solution[2]) - 1]
            epochs = no_of_epoch(solution[3])
            batch = no_of_batch(solution[4])
            filters = no_of_filters(solution[5])
            neurons = no_neurons_hidden_layer(solution[6])
            kernel_s = kernel_size(solution[7])
            drop_neurons = dropout(solution[8])

            accuracy = 1 - model.model_build(acc_cnn_layer, acc_feed_layer, optimizer, epochs,
                                             batch, filters, neurons, kernel_s, drop_neurons)
            scores.append([accuracy, acc_cnn_layer, acc_feed_layer, optimizer, epochs,
                           batch, filters, neurons, kernel_s, drop_neurons])
            return accuracy
        return evaluate

class model():
    def model_build(acc_cnn_layer, acc_feed_layer,
                    optimizer, epochs, batch, filters, neurons, kernel_s, drop_neurons):
        print("Activation cnn layer :", acc_cnn_layer)
        print("Activation feed layer :", acc_feed_layer)
        print("optimizer :", optimizer)
        print("batch :", batch)
        print("epochs :", epochs)
        print("dropout :", drop_neurons)
        print("no of filters :", filters)
        model = Sequential()
        model.add(Conv2D(64, (3, 3), activation=acc_cnn_layer, input_shape=input_shape))
        model.add(Conv2D(128, (3, 3), activation=acc_cnn_layer))
        model.add(Dropout(drop_neurons))
        model.add(Flatten())
        model.add(Dense(6 * numPCAcomponents, activation=acc_feed_layer))
        model.add(Dropout(drop_neurons))
        model.add(Dense(16, activation='softmax'))

        # Define optimization and train method
        reduce_lr = ReduceLROnPlateau(monitor='val_acc', factor=0.9, patience=25,
                                      min_lr=0.000001, verbose=1)
        checkpointer = ModelCheckpoint(filepath="checkpoint.hdf5", verbose=1,
                                       save_best_only=False)
        # sgd = SGD(lr=0.001, decay=1e-6, momentum=0.9, nesterov=True)
        model.compile(loss='categorical_crossentropy', optimizer=optimizer,
                      metrics=['accuracy'])
        # Start to train model
        history = model.fit(X_train, y_train,
                            batch_size=batch,
                            epochs=epochs,
                            verbose=1,
                            validation_data=(X_test, y_test),
                            callbacks=[reduce_lr, checkpointer],
                            shuffle=True)

        model.save('HSI_model_epochs100.h5')
        model = load_model('HSI_model_epochs100.h5')
        score = model.evaluate(X_test, y_test, batch_size=32)
        Test_Loss = score[0]
        Test_accuracy = score[1]

        return Test_accuracy

I want to find the optimal hyperparameters of the above deep learning model using the Grey Wolf Optimizer. Can you please help me with this process using the NiaPy library?
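
For what it's worth, one way to wire such an objective into the current niapy 2.x API is through a custom Problem subclass. A minimal sketch, where train_and_score is a hypothetical stand-in for decoding the solution vector and training the model above:

from niapy.algorithms.basic import GreyWolfOptimizer
from niapy.problems import Problem
from niapy.task import Task

class CNNHyperparameters(Problem):
    def __init__(self):
        # 9 genes in [0, 1], decoded to hyperparameters as in the snippet above.
        super().__init__(dimension=9, lower=0, upper=1)

    def _evaluate(self, x):
        # train_and_score(x) is a placeholder: decode x, build and train the model,
        # and return validation accuracy in [0, 1].
        return 1 - train_and_score(x)  # minimize 1 - accuracy

task = Task(problem=CNNHyperparameters(), max_evals=50)
best_x, best_err = GreyWolfOptimizer(population_size=10).run(task)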

Counting evaluations

The following algorithms may have problems with counting the number of evaluations:

  • Artificial Bee Colony,
  • Differential Evolution,
  • Self Adaptive Differential Evolution.

jDE runs without stopping

Steps to reproduce

Run jDE using StoppingTask against any benchmark.

Expected behavior

It is expected, that the execution of the algorithm will terminate when the stopping condition is reached.

Actual behavior

Runs infinitely.

System configuration

OS: Linux, macOS (tested on those, but the issue is probably not dependent on the OS).

Python version:
Any of the supported versions.

__init__() missing 3 required positional arguments: 'D', 'nFES', and 'benchmark'

Hi,

I tried your example Flower Pollination Algorithm code, but I got an error like this:

     11 for i in range(5):
     12     task = StoppingTask(D=10, nFES=10000, benchmark=Sphere())
---> 13     algo = FlowerPollinationAlgorithm(NP=20, p=0.5)
     14     best = algo.run(task=algo)
     15     print(best)

TypeError: __init__() missing 3 required positional arguments: 'D', 'nFES', and 'benchmark'

Here's the code:

# encoding=utf8
# This is temporary fix to import module from parent folder
# It will be removed when package is published on PyPI
import sys
sys.path.append('../')
# End of fix

import random
from NiaPy.algorithms.basic import FlowerPollinationAlgorithm
from NiaPy.task import StoppingTask
from NiaPy.benchmarks import Sphere

#we will run Flower Pollination Algorithm for 5 independent runs
for i in range(5):
    task = StoppingTask(D=10, nFES=10000, benchmark=Sphere())
    algo = FlowerPollinationAlgorithm(NP=20, p=0.5)
    best = algo.run(task=algo)
    print(best)

Hope you can help, thanks!
