
shkalikovoleh / optalg

Reimplementation of optimization algorithms.

License: MIT License

Python 100.00%
optimization-algorithms university gradient-descent nelder-mead hooke-jeeves conjugate-gradient-descent bfgs dfp clonalg penalty

optalg's Introduction

OptAlg

Set of optimization algorithms.

Usage

from optalg.<subpackage> import <algo-name>

def f(x):
    return <your function value>  # where argument x_i corresponds to x[i]

optimizer = <algo-name>(params...)
res = optimizer.optimize(f, [init_state])  # optimization result

For methods that require gradient and Hessian calculations, use autograd.numpy instead of numpy to define the objective function. For example:

import numpy as np
from autograd.numpy import sin
from optalg.line_search import ArmijoBacktracking
from optalg.unconstrained.descent import GradientDescent
from optalg.stop_criteria import GradientNormCriterion


def f(x):
  return x[0]**2 + sin(x[1]**2)


gnCriterion = GradientNormCriterion(10**-3)
step_opt = ArmijoBacktracking(1, 0.5)
optimizer = GradientDescent(gnCriterion, step_opt)

res = optimizer.optimize(f, np.array([-3, 1]))
res.x #optimum

Unconstrained algorithms

Gradient free

Methods that do not require differentiability of the objective function. Descent directions are computed using other search strategies.

  • Nelder-Mead - reflects, expands, contracts, and shrinks a simplex toward the minimum of the objective function.
  • Hooke-Jeeves - pattern search. The descent direction is the best combination of coordinates of the perturbation vector.
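The pattern-search idea behind Hooke-Jeeves can be sketched in a few lines. This is an illustrative standalone implementation (the function name and parameters are my own, not the package's API):

```python
import numpy as np

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    """Minimal Hooke-Jeeves-style pattern search: try exploratory moves
    along each coordinate, accept the best improving combination,
    otherwise contract the step."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        best, fbest = x.copy(), fx
        for i in range(len(x)):              # exploratory coordinate moves
            for d in (step, -step):
                trial = best.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fbest:
                    best, fbest = trial, ft
                    break
        if fbest < fx:
            x, fx = best, fbest              # improving move accepted
        else:
            step *= shrink                   # no improvement: shrink the step
    return x

x_min = hooke_jeeves(lambda x: (x[0] - 1)**2 + (x[1] + 2)**2, [0.0, 0.0])
# x_min is close to the true minimizer (1, -2)
```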

Gradient

Methods based on descent to minimum by gradient-like direction.

  • Gradient descent - simple gradient descent

  • Conjugate gradient descent - the descent direction is the sum of the gradient at the current point and the weighted direction from the previous iteration. Available variations:

    • Fletcher-Reeves
    • Polak-Ribiere
    • Hestenes-Stiefel
    • Dai-Yuan
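The variations above differ only in how the weight β of the previous direction is computed. A small standalone sketch of two of the classical formulas (not the package's internals):

```python
import numpy as np

def beta_fletcher_reeves(g_new, g_old):
    # beta = ||g_k||^2 / ||g_{k-1}||^2
    return (g_new @ g_new) / (g_old @ g_old)

def beta_polak_ribiere(g_new, g_old):
    # beta = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

g_prev = np.array([2.0, 0.0])
g_curr = np.array([1.0, 1.0])

# conjugate direction: d_k = -g_k + beta * d_{k-1}
d_prev = -g_prev
d = -g_curr + beta_fletcher_reeves(g_curr, g_prev) * d_prev
```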

Newton

Second-order descent algorithms

  • Newton - the descent direction is the product of the inverse Hessian and the gradient.

  • Quasi-Newton - the inverse Hessian is replaced with an approximation:

    • BFGS - Broyden-Fletcher-Goldfarb-Shanno
    • DFP - Davidon-Fletcher-Powell
    • SR1 - Symmetric Rank 1 method
    • Broyden - Broyden's method
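For example, BFGS maintains an approximation H of the inverse Hessian and updates it from the step s and the gradient change y. A sketch of the standard update formula (not necessarily the package's exact code):

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation:
    H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,  rho = 1 / (y^T s)."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# For a quadratic f(x) = 0.5 x^T A x, y = A s is the exact gradient change
A = np.array([[2.0, 0.0], [0.0, 4.0]])
s = np.array([1.0, 1.0])
y = A @ s
H = bfgs_update(np.eye(2), s, y)
# the updated H satisfies the secant condition H @ y == s
```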

Evolutional

Methods based on natural evolution. On each iteration the method selects the "best" individuals from the population, reproduces a new generation, and replaces the previous individuals.

Immune

Artificial immune system

  • ClonAlg - clonal selection algorithm
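A toy version of the clonal-selection loop can illustrate the idea: select the best antibodies, clone and mutate them (better candidates mutate less), and inject random newcomers for diversity. All names and parameters below are illustrative, not the library's defaults:

```python
import numpy as np

rng = np.random.default_rng(0)

def clonalg_min(f, dim, pop=20, n_best=5, clones=4, sigma=0.5, iters=200):
    """Toy ClonAlg for minimization: clone the best antibodies, mutate the
    clones inversely to their rank, keep the fittest, refresh the tail."""
    X = rng.uniform(-5, 5, size=(pop, dim))
    for _ in range(iters):
        X = X[np.argsort([f(x) for x in X])]
        mutants = []
        for rank, x in enumerate(X[:n_best]):
            scale = sigma * (rank + 1) / n_best   # better rank -> smaller mutation
            for _ in range(clones):
                mutants.append(x + rng.normal(0, scale, size=dim))
        candidates = np.vstack([X, np.array(mutants)])
        X = candidates[np.argsort([f(x) for x in candidates])][:pop]
        X[-2:] = rng.uniform(-5, 5, size=(2, dim))  # diversity injection
    return min(X, key=f)

x_best = clonalg_min(lambda x: float(np.sum(x**2)), dim=2)
# x_best lands near the sphere function's minimum at the origin
```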

Constrained algorithms

Penalty

Set of methods for constrained optimization that use a penalty function to represent the constraints:

  • Penalty - exterior-point penalty function
  • Interior - interior-point barrier function
  • Augmented Lagrangian - modified Lagrangian + inequality constraints
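The exterior-point idea can be shown on a one-dimensional example: minimize (x - 3)^2 subject to x <= 1 by adding a quadratic penalty mu * max(0, x - 1)^2 and letting mu grow. This standalone sketch uses a crude gradient descent with a numerical derivative (none of it is the package's API):

```python
def penalized(x, mu):
    # objective (x - 3)^2 with the constraint x <= 1 as a quadratic penalty
    return (x - 3.0)**2 + mu * max(0.0, x - 1.0)**2

def minimize_1d(f, x=0.0, lr=0.001, iters=5000, h=1e-6):
    # crude fixed-step gradient descent with a central-difference derivative
    for _ in range(iters):
        grad = (f(x + h) - f(x - h)) / (2.0 * h)
        x -= lr * grad
    return x

# as mu grows, the unconstrained minimizer approaches the feasible bound x = 1
xs = [minimize_1d(lambda x, mu=mu: penalized(x, mu)) for mu in (1.0, 10.0, 100.0)]
# xs is approximately [2.0, 1.18, 1.02]
```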

Line search

On each step the descent direction is multiplied by a step size. Available step-size calculation methods:

  • FixedStep - constant step size
  • ArmijoBacktracking - the step is repeatedly reduced until the function value at the new point satisfies the Armijo condition.
  • BisectionWolfe - bisection method that either computes a step size satisfying the weak Wolfe conditions or sends the function values to -inf.
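The backtracking rule can be sketched generically; in the usage example above, ArmijoBacktracking(1, 0.5) corresponds to an initial step of 1 halved on failure. The helper below is an illustration, not the library's implementation:

```python
import numpy as np

def armijo_backtracking(f, grad, x, direction, step=1.0, shrink=0.5, c=1e-4):
    """Shrink the step until the Armijo sufficient-decrease condition
    f(x + a d) <= f(x) + c a (grad(x)^T d) holds; d must be a descent
    direction (grad(x)^T d < 0) for the loop to terminate."""
    fx = f(x)
    slope = grad(x) @ direction
    while f(x + step * direction) > fx + c * step * slope:
        step *= shrink
    return step

f = lambda x: x @ x
grad = lambda x: 2.0 * x
x = np.array([2.0, 0.0])
a = armijo_backtracking(f, grad, x, -grad(x))
# a == 0.5: the full step overshoots the minimum, one halving suffices
```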

optalg's People

Contributors: anastorr, mkrooted256, shkalikovoleh

optalg's Issues

Cannot get empty history

Calling get_last_history() on gradient descent methods that have an empty history raises an error.

Add ClonAlg

Add immune ClonAlg optimization method.

  1. Create the immune subpackage (folder and __init__.py).
  2. Add a base immune class (if general parameters can be factored out). This class must inherit from OptimizerWithHistory.
  3. Add the ClonAlg implementation.
  4. Check history correctness.
  5. Add unit tests (in tests/immune).

To run the tests, use
python3 -m unittest discover from the main directory.

Add penalty method

Add a constrained optimization method that, on each iteration, minimizes (via an unconstrained optimizer) the objective function plus a penalty term representing the constraints.
Variations:

  • Interior point
  • Exterior point

Add Lagrange method

Add a modified Lagrangian method that can optimize a function with equality and inequality constraints.
Literature:

  • Panteleev, Optimization Methods (Методы оптимизации), p. 275

Fix stop criteria argument

In gradient descent with step decrease and the fastest-descent stop criterion, the previous-point argument is incorrect because the last element in the history is the current point. We must use index -2 to select the previous point for the stop criterion to work correctly.

Add gradient descent with step decrease

Add a gradient descent algorithm with step decrease: divide the learning rate by a fixed value whenever the function value at the new point is greater than at the previous point.

Add Genetic algorithm

  • Add classic genetic algorithm.
  • Move the immune package into genetic and create base classes.

Add PSO

Add a particle swarm optimization algorithm (with abstractions).
