
# fastmath.optimization


Optimization.

The namespace provides various optimization methods.

* Brent (1d functions)
* Bobyqa (2d+ functions)
* Powell
* Nelder-Mead
* Multidirectional simplex
* CMAES
* Gradient
* Bayesian Optimization (see below)

All optimizers require bounds.

## Optimizers

To optimize a function, call one of the following:

* [[minimize]] or [[maximize]] - perform the actual optimization
* [[scan-and-minimize]] or [[scan-and-maximize]] - first find initial points with a brute-force scan, then run optimizations in parallel from the best candidates. The scan uses a jittered low-discrepancy sequence generator.

You can also create an optimizer (a function which performs the optimization) by calling [[minimizer]] or [[maximizer]]. The resulting optimizer accepts an optional initial point.

All of the above accept:

* one of the optimization methods, i.e. `:brent`, `:bobyqa`, `:nelder-mead`, `:multidirectional-simplex`, `:cmaes`, `:gradient`, `:bfgs` or `:lbfgsb`
* the function to optimize
* parameters as a map

For the meaning of the parameters, refer to the [Optim package](https://commons.apache.org/proper/commons-math/javadocs/api-3.6.1/index.html?org/apache/commons/math3/optim/package-summary.html).
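
A minimal sketch of the call shape (assumptions: the objective takes each coordinate as a separate argument, and the result is a `[point value]` pair; the exact return shape may differ between versions):

```clojure
(require '[fastmath.optimization :as o])

;; 1-d minimization with Brent's method; :bounds is obligatory
(o/minimize :brent
            (fn [x] (+ 2.0 (* (- x 1.0) (- x 1.0)))) ; minimum 2.0 at x=1.0
            {:bounds [[-5.0 5.0]]})
;; => approximately [[1.0] 2.0]

;; reusable optimizer created with minimizer
(def nm (o/minimizer :nelder-mead
                     (fn [x y] (+ (* x x) (* y y)))
                     {:bounds [[-5.0 5.0] [-5.0 5.0]]}))

(nm)           ;; start from the default initial point
(nm [1.0 1.0]) ;; assumed: initial point passed as a vector
```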

### Common parameters

* `:bounds` (obligatory) - search range for each dimension, as a sequence of `[low high]` pairs
* `:initial` - initial point (as a vector) other than the midpoint of the bounds
* `:max-evals` - maximum number of function evaluations
* `:max-iters` - maximum number of algorithm iterations
* `:bounded?` - should the optimizer be forced to keep the search within bounds (some algorithms can step outside the desired ranges)
* `:stats?` - return the number of iterations and evaluations along with the result
* `:rel` and `:abs` - accepted relative and absolute errors
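
Continuing the sketch above with some common parameters (values are illustrative):

```clojure
;; bounded run with an evaluation budget; :stats? additionally
;; reports iteration/evaluation counts with the result
(o/minimize :nelder-mead
            (fn [x y] (+ (* x x) (* y y)))
            {:bounds [[-5.0 5.0] [-5.0 5.0]]
             :initial [2.0 2.0]
             :max-evals 2000
             :bounded? true   ; Nelder-Mead can step outside the bounds
             :stats? true})
```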

For the `scan-and-...` functions you can additionally provide:

* `:N` - number of brute-force iterations
* `:n` - fraction of `:N` used as initial points for the parallel optimizations
* `:jitter` - jitter factor for the sequence generator (used for scanning the domain)
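
A scanned run might look like this sketch (parameter values are illustrative, not defaults):

```clojure
;; scan the domain with 5000 jittered low-discrepancy points,
;; then optimize in parallel from the best 1% of them
(o/scan-and-minimize :cmaes
                     (fn [x y] (+ (* x x) (* y y)))
                     {:bounds [[-5.0 5.0] [-5.0 5.0]]
                      :N 5000
                      :n 0.01
                      :jitter 0.5})
```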

### Specific parameters

* BOBYQA - `:number-of-points`, `:initial-radius`, `:stopping-radius`
* Nelder-Mead - `:rho`, `:khi`, `:gamma`, `:sigma`, `:side-length`
* Multidirectional simplex - `:khi`, `:gamma`, `:side-length`
* CMAES - `:check-feasable-count`, `:diagonal-only`, `:stop-fitness`, `:active-cma?`, `:population-size`
* Gradient - `:bracketing-range`, `:formula` (`:polak-ribiere` or `:fletcher-reeves`), `:gradient-h` (finite differentiation step, default: `0.01`) 
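
For example, a gradient run with method-specific parameters (values are illustrative, not defaults):

```clojure
(o/minimize :gradient
            (fn [x y] (+ (* x x) (* y y) (* 0.5 x y)))
            {:bounds [[-5.0 5.0] [-5.0 5.0]]
             :formula :polak-ribiere ; or :fletcher-reeves
             :gradient-h 1.0e-3      ; finite differentiation step
             :max-iters 500})
```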

## Bayesian Optimization

The Bayesian optimizer can be used to optimize expensive-to-evaluate black-box functions. See [this article](http://krasserm.github.io/2018/03/21/bayesian-optimization/) or [this one](https://nextjournal.com/a/LKqpdDdxiggRyHhqDG5FH?token=Ss1Qq3MzHWN8ZyEt9UC1ZZ).

## bayesian-optimization

(bayesian-optimization
  f
  {:keys [warm-up init-points bounds utility-function-type utility-param kernel
          kscale jitter noise optimizer optimizer-params normalize?]
   :or {utility-function-type :ucb
        init-points 3
        jitter 0.25
        noise 1.0E-8
        utility-param (if (#{:ei :poi} utility-function-type) 0.001 2.576)
        warm-up (* (count bounds) 1000)
        normalize? true
        kernel (k/kernel :mattern-52)
        kscale 1.0}})

Bayesian optimizer.

Parameters are:

* `:warm-up` - number of brute force iterations to find maximum of utility function
* `:init-points` - number of initial evaluations before the Bayesian optimization starts. Points are selected using a jittered low-discrepancy sequence generator (see [[jittered-sequence-generator]])
* `:bounds` - bounds for each dimension
* `:utility-function-type` - one of `:ei`, `:poi` or `:ucb`
* `:utility-param` - parameter for utility function (kappa for `ucb` and xi for `ei` and `poi`)
* `:kernel` - kernel, default `:mattern-52`, see [[fastmath.kernel]]
* `:kscale` - scaling factor for kernel
* `:jitter` - jitter factor for sequence generator (used to find initial points)
* `:noise` - noise (lambda) factor for the Gaussian process
* `:optimizer` - name of the optimizer (used to optimize the utility function)
* `:optimizer-params` - optional parameters for the optimizer
* `:normalize?` - normalize data in the Gaussian process?

Returns a lazy sequence of consecutive optimization steps. Each step consists of:

* `:x` - `x` at the current maximum
* `:y` - value at `:x`
* `:xs` - list of all visited `x`s
* `:ys` - list of values for every visited `x`
* `:gp` - current Gaussian process regression instance
* `:util-fn` - current utility function
* `:util-best` - best `x` of the utility function
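
A minimal sketch of a run (assuming, as with the other optimizers, that the objective takes each coordinate as a separate argument; the optimizer maximizes `f`):

```clojure
(def steps
  (o/bayesian-optimization
   (fn [x y] (- (+ (* x x) (* y y)))) ; black-box objective, peak at (0,0)
   {:bounds [[-5.0 5.0] [-5.0 5.0]]
    :utility-function-type :ucb
    :init-points 5}))

;; realizing the nth element runs n optimization steps;
;; inspect the best point found so far
(select-keys (nth steps 20) [:x :y])
;; => {:x [~0.0 ~0.0], :y ~0.0} (approximately)
```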

## maximize

(maximize method f config)

Maximize the given function.

Parameters: optimization method, function and configuration.


## maximizer

(maximizer method f config)

Create an optimizer which maximizes a function.

Returns a function which performs the optimization, optionally from a given initial point.
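
A sketch (the shape of the initial-point argument is assumed):

```clojure
(def find-peak
  (o/maximizer :bobyqa
               (fn [x y] (- 10.0 (* x x) (* y y))) ; peak at (0,0)
               {:bounds [[-3.0 3.0] [-3.0 3.0]]}))

(find-peak)           ;; start from the default initial point
(find-peak [1.0 1.0]) ;; assumed: initial point passed as a vector
```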


## minimize

(minimize method f config)

Minimize the given function.

Parameters: optimization method, function and configuration.


## minimizer

(minimizer method f config)

Create an optimizer which minimizes a function.

Returns a function which performs the optimization, optionally from a given initial point.


## scan-and-maximize

Like [[maximize]], but first scans the domain with a brute-force, jittered low-discrepancy scan and then maximizes in parallel from the best candidates (see the Optimizers section above).

## scan-and-minimize

Like [[minimize]], but first scans the domain with a brute-force, jittered low-discrepancy scan and then minimizes in parallel from the best candidates (see the Optimizers section above).
