numeric-optimization-0.1.1.0: Unified interface to various numerical optimization algorithms
Copyright: (c) Masahiro Sakai 2023
License: BSD-style
Maintainer: masahiro.sakai@gmail.com
Stability: provisional
Portability: non-portable
Safe Haskell: Safe-Inferred
Language: Haskell2010

Numeric.Optimization

Description

This module aims to provide a unified interface to various numerical optimization algorithms, similar to scipy.optimize in Python.

In this module you need to provide the gradient function explicitly, but you can use numeric-optimization-ad or numeric-optimization-backprop to derive it via automatic differentiation.

Synopsis

Main function

minimize Source #

Arguments

:: forall prob. (IsProblem prob, Optionally (HasGrad prob), Optionally (HasHessian prob)) 
=> Method

Numerical optimization algorithm to use

-> Params (Vector Double)

Parameters for optimization algorithms. Use def as a default.

-> prob

Optimization problem to solve

-> Vector Double

Initial value

-> IO (Result (Vector Double)) 

Minimization of a scalar function of one or more variables.

This function is intended to provide functionality similar to Python's scipy.optimize.minimize.

Example:

{-# LANGUAGE OverloadedLists #-}

import Data.Vector.Storable (Vector)
import Numeric.Optimization

main :: IO ()
main = do
  result <- minimize LBFGS def (WithGrad rosenbrock rosenbrock') [-3,-4]
  print (resultSuccess result)  -- True
  print (resultSolution result)  -- [0.999999999009131,0.9999999981094296]
  print (resultValue result)  -- 1.8129771632403013e-18

-- https://en.wikipedia.org/wiki/Rosenbrock_function
rosenbrock :: Vector Double -> Double
rosenbrock [x,y] = sq (1 - x) + 100 * sq (y - sq x)

rosenbrock' :: Vector Double -> Vector Double
rosenbrock' [x,y] =
  [ 2 * (1 - x) * (-1) + 100 * 2 * (y - sq x) * (-2) * x
  , 100 * 2 * (y - sq x)
  ]

sq :: Floating a => a -> a
sq x = x ** 2

Problem specification

Problems are specified by instances of the IsProblem type class.

In the simplest case, Vector Double -> Double is an instance of the IsProblem class. This is enough if your problem has no constraints and the selected algorithm does not require further information (e.g. gradients and Hessians).

You can equip a problem with other information using wrapper types:

  • WithGrad: adds a gradient function
  • WithHessian: adds a Hessian function
  • WithBounds: adds bounds
  • WithConstraints: adds constraints
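For example, to solve a bound-constrained variant of the Rosenbrock problem with LBFGSB, the objective can be wrapped with WithGrad and then WithBounds. The sketch below replaces main in the example above; it assumes the rosenbrock, rosenbrock', and sq definitions and the OverloadedLists pragma from that example are in scope, and the box [-2, 2] × [-2, 2] is arbitrary and only for illustration:

main :: IO ()
main = do
  -- Add the gradient with WithGrad, then the bounds with WithBounds.
  -- The bounds literal relies on OverloadedLists and takes whatever
  -- vector type the bounds method expects.
  let prob = WithBounds (WithGrad rosenbrock rosenbrock') [(-2, 2), (-2, 2)]
  result <- minimize LBFGSB def prob [-1.5, 1.5]
  print (resultSuccess result)
  print (resultSolution result)  -- should be close to [1, 1]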

If you need further flexibility or a more efficient implementation, you can define an instance of IsProblem yourself (a hand-written instance is sketched after hasOptionalDict at the end of this page).

class IsProblem prob where Source #

Optimization problems

Minimal complete definition

func

Methods

func :: prob -> Vector Double -> Double Source #

Objective function

It is called fun in scipy.optimize.minimize.

bounds :: prob -> Maybe (Vector (Double, Double)) Source #

Bounds

constraints :: prob -> [Constraint] Source #

Constraints

Instances

Instances details
IsProblem prob => IsProblem (WithBounds prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => IsProblem (WithConstraints prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => IsProblem (WithGrad prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => IsProblem (WithHessian prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem (Vector Double -> Double) Source # 
Instance details

Defined in Numeric.Optimization

class IsProblem prob => HasGrad prob where Source #

Optimization problem equipped with gradient information

Minimal complete definition

grad | grad' | grad'M

Methods

grad :: prob -> Vector Double -> Vector Double Source #

Gradient of a function computed by func

It is called jac in scipy.optimize.minimize.

grad' :: prob -> Vector Double -> (Double, Vector Double) Source #

Pair of func and grad

grad'M :: PrimMonad m => prob -> Vector Double -> MVector (PrimState m) Double -> m Double Source #

Similar to grad', but the gradient vector is written using destination-passing style.
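As a sketch of this destination-passing style, the instance below fills the supplied mutable vector with the gradient and returns the objective value. The problem type Sphere is hypothetical, and the example assumes that Vector and MVector in the signatures are the storable vector and its mutable counterpart:

import Control.Monad (forM_)
import qualified Data.Vector.Storable as VS
import qualified Data.Vector.Storable.Mutable as VSM
import Numeric.Optimization

-- Hypothetical problem: f(x) = sum of x_i^2.
data Sphere = Sphere

instance IsProblem Sphere where
  func _ v = VS.sum (VS.map (\xi -> xi * xi) v)

instance HasGrad Sphere where
  -- Write each gradient component 2 * x_i into the destination vector
  -- and return the objective value. Defining grad'M alone is enough,
  -- since the minimal complete definition is grad | grad' | grad'M.
  grad'M _ v ret = do
    forM_ [0 .. VS.length v - 1] $ \i ->
      VSM.write ret i (2 * (v VS.! i))
    return (func Sphere v)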

Instances

Instances details
HasGrad prob => HasGrad (WithBounds prob) Source # 
Instance details

Defined in Numeric.Optimization

HasGrad prob => HasGrad (WithConstraints prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => HasGrad (WithGrad prob) Source # 
Instance details

Defined in Numeric.Optimization

HasGrad prob => HasGrad (WithHessian prob) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad prob) => Optionally (HasGrad (WithBounds prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad prob) => Optionally (HasGrad (WithConstraints prob)) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => Optionally (HasGrad (WithGrad prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad prob) => Optionally (HasGrad (WithHessian prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad (Vector Double -> Double)) Source # 
Instance details

Defined in Numeric.Optimization

class IsProblem prob => HasHessian prob where Source #

Optimization problem equipped with Hessian information

Minimal complete definition

hessian

Methods

hessian :: prob -> Vector Double -> Matrix Double Source #

Hessian of a function computed by func

It is called hess in scipy.optimize.minimize.

hessianProduct :: prob -> Vector Double -> Vector Double -> Vector Double Source #

The product of the Hessian H of a function f at x with a vector v.

It is called hessp in scipy.optimize.minimize.

See also https://hackage.haskell.org/package/ad-4.5.4/docs/Numeric-AD.html#v:hessianProduct.

Instances

Instances details
HasHessian prob => HasHessian (WithBounds prob) Source # 
Instance details

Defined in Numeric.Optimization

HasHessian prob => HasHessian (WithConstraints prob) Source # 
Instance details

Defined in Numeric.Optimization

HasHessian prob => HasHessian (WithGrad prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => HasHessian (WithHessian prob) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian prob) => Optionally (HasHessian (WithBounds prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian prob) => Optionally (HasHessian (WithConstraints prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian prob) => Optionally (HasHessian (WithGrad prob)) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => Optionally (HasHessian (WithHessian prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian (Vector Double -> Double)) Source # 
Instance details

Defined in Numeric.Optimization

data Constraint Source #

Type of constraint

Currently, no constraints are supported.

boundsUnconstrained :: Int -> Vector (Double, Double) Source #

Bounds for unconstrained problems, i.e. (-∞,+∞).

isUnconstainedBounds :: Vector (Double, Double) -> Bool Source #

Whether all lower bounds are -∞ and all upper bounds are +∞.

Wrapper types

data WithGrad prob Source #

Wrapper type for adding a gradient function to a problem

Constructors

WithGrad prob (Vector Double -> Vector Double) 

Instances

Instances details
IsProblem prob => HasGrad (WithGrad prob) Source # 
Instance details

Defined in Numeric.Optimization

HasHessian prob => HasHessian (WithGrad prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => IsProblem (WithGrad prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => Optionally (HasGrad (WithGrad prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian prob) => Optionally (HasHessian (WithGrad prob)) Source # 
Instance details

Defined in Numeric.Optimization

data WithHessian prob Source #

Wrapper type for adding a Hessian to a problem

Constructors

WithHessian prob (Vector Double -> Matrix Double) 

Instances

Instances details
HasGrad prob => HasGrad (WithHessian prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => HasHessian (WithHessian prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => IsProblem (WithHessian prob) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad prob) => Optionally (HasGrad (WithHessian prob)) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => Optionally (HasHessian (WithHessian prob)) Source # 
Instance details

Defined in Numeric.Optimization

data WithBounds prob Source #

Wrapper type for adding bounds to a problem

Constructors

WithBounds prob (Vector (Double, Double)) 

Instances

Instances details
HasGrad prob => HasGrad (WithBounds prob) Source # 
Instance details

Defined in Numeric.Optimization

HasHessian prob => HasHessian (WithBounds prob) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => IsProblem (WithBounds prob) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad prob) => Optionally (HasGrad (WithBounds prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian prob) => Optionally (HasHessian (WithBounds prob)) Source # 
Instance details

Defined in Numeric.Optimization

data WithConstraints prob Source #

Wrapper type for adding constraints to a problem

Constructors

WithConstraints prob [Constraint] 

Algorithm selection

data Method Source #

Selection of numerical optimization algorithms

Constructors

CGDescent

Conjugate gradient method based on Hager and Zhang [1].

The implementation is provided by the nonlinear-optimization package [3], which is a binding library for [2].

This method requires the gradient but not the Hessian.

LBFGS

Limited memory BFGS (L-BFGS) algorithm [1]

The implementation is provided by the lbfgs package [2], which is a binding to liblbfgs [3].

This method requires the gradient but not the Hessian.

LBFGSB

Limited memory BFGS algorithm with bound constraints (L-BFGS-B) [1][2][3]

The implementation is provided by the l-bfgs-b package [4], which is a binding to the L-BFGS-B Fortran code [5].

Since: 0.1.1.0

Newton

Naïve implementation of Newton's method in Haskell

This method requires both the gradient and the Hessian.
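For example, the Rosenbrock problem from the top of this page can be solved with Newton by supplying a Hessian via WithHessian in addition to the gradient. This is a sketch: rosenbrockHess is a hand-written helper introduced here, and the example assumes that the Matrix type expected by hessian is hmatrix's Numeric.LinearAlgebra.Matrix.

{-# LANGUAGE OverloadedLists #-}

import Data.Vector.Storable (Vector)
import Numeric.LinearAlgebra (Matrix, (><))
import Numeric.Optimization

main :: IO ()
main = do
  result <- minimize Newton def
              (WithHessian (WithGrad rosenbrock rosenbrock') rosenbrockHess)
              [-3, -4]
  print (resultSuccess result)
  print (resultSolution result)  -- should be close to [1, 1]

rosenbrock :: Vector Double -> Double
rosenbrock [x, y] = sq (1 - x) + 100 * sq (y - sq x)

rosenbrock' :: Vector Double -> Vector Double
rosenbrock' [x, y] =
  [ 2 * (1 - x) * (-1) + 100 * 2 * (y - sq x) * (-2) * x
  , 100 * 2 * (y - sq x)
  ]

-- Hessian of the Rosenbrock function as a row-major 2x2 matrix.
rosenbrockHess :: Vector Double -> Matrix Double
rosenbrockHess [x, y] =
  (2 >< 2)
    [ 2 - 400 * y + 1200 * x * x, -400 * x
    , -400 * x                  , 200
    ]

sq :: Floating a => a -> a
sq x = x ** 2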

Instances

Instances details
Bounded Method Source # 
Instance details

Defined in Numeric.Optimization

Enum Method Source # 
Instance details

Defined in Numeric.Optimization

Show Method Source # 
Instance details

Defined in Numeric.Optimization

Eq Method Source # 
Instance details

Defined in Numeric.Optimization

Methods

(==) :: Method -> Method -> Bool #

(/=) :: Method -> Method -> Bool #

Ord Method Source # 
Instance details

Defined in Numeric.Optimization

isSupportedMethod :: Method -> Bool Source #

Whether a Method is supported in the current environment.
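Since not every Method is necessarily usable in a given environment, a program can enumerate the available algorithms via the Bounded and Enum instances of Method. A small sketch (supportedMethods is just a helper name introduced here):

import Numeric.Optimization

-- All methods usable in the current environment.
supportedMethods :: [Method]
supportedMethods = filter isSupportedMethod [minBound .. maxBound]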

data Params a Source #

Parameters for optimization algorithms

TODO:

  • Better way to pass algorithm-specific parameters?
  • Separate paramsCallback from other, more concrete serializable parameters?

Constructors

Params 

Fields

  • paramsCallback :: Maybe (a -> IO Bool)

    If the callback function returns True, the algorithm execution is terminated.

  • paramsTol :: Maybe Double

    Tolerance for termination. When tol is specified, the selected algorithm sets some relevant solver-specific tolerance(s) equal to tol.

    If specified, this value is used as the default for paramsFTol and paramsGTol.

  • paramsFTol :: Maybe Double

    LBFGS stops iterating when the delta-based convergence test (see paramsPast) is enabled and the following condition is met:

    \[ \left|\frac{f' - f}{f}\right| < \mathrm{ftol}, \]

    where f' is the objective value from paramsPast iterations ago and f is the objective value of the current iteration. The default value is 1e-5.

    LBFGSB stops iterating when the following condition is met:

    \[ \frac{f^k - f^{k+1}}{\mathrm{max}\{|f^k|,|f^{k+1}|,1\}} \le \mathrm{ftol}. \]

    The default value is 1e7 * (epsilon :: Double) = 2.220446049250313e-9.

    Since: 0.1.1.0

  • paramsGTol :: Maybe Double

    LBFGSB stops iterating when \(\mathrm{max}\{|\mathrm{pg}_i| \mid i = 1, \ldots, n\} \le \mathrm{gtol}\), where \(\mathrm{pg}_i\) is the i-th component of the projected gradient.

    Since: 0.1.1.0

  • paramsMaxIters :: Maybe Int

    Maximum number of iterations.

    Currently only LBFGSB, CGDescent, and Newton use this.

    Since: 0.1.1.0

  • paramsPast :: Maybe Int

    Distance for the delta-based convergence test in LBFGS

    This parameter determines the distance, in iterations, to compute the rate of decrease of the objective function. If the value of this parameter is Nothing, the library does not perform the delta-based convergence test. The default value is Nothing.

    Since: 0.1.1.0

  • paramsMaxCorrections :: Maybe Int

    The maximum number of variable metric corrections used in LBFGSB to define the limited memory matrix.

    Since: 0.1.1.0
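A sketch of customizing Params via record update on def: the objective f below is a hypothetical shifted quadratic with gradient g, and the callback merely prints each iterate without stopping the solver. The field names are those documented above.

{-# LANGUAGE OverloadedLists #-}

import Data.Vector.Storable (Vector)
import qualified Data.Vector.Storable as VS
import Numeric.Optimization

main :: IO ()
main = do
  let f :: Vector Double -> Double
      f = VS.sum . VS.map (\xi -> (xi - 3) ** 2)
      g :: Vector Double -> Vector Double
      g = VS.map (\xi -> 2 * (xi - 3))
      params :: Params (Vector Double)
      params = def
        { paramsTol      = Just 1e-9   -- tighter termination tolerance
        , paramsCallback = Just $ \x -> do
            print x                    -- observe each iterate
            return False               -- False means: keep iterating
        }
  result <- minimize LBFGS params (WithGrad f g) [0, 0]
  print (resultSolution result)  -- should be close to [3, 3]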

Instances

Instances details
Contravariant Params Source # 
Instance details

Defined in Numeric.Optimization

Methods

contramap :: (a' -> a) -> Params a -> Params a' #

(>$) :: b -> Params b -> Params a #

Default (Params a) Source # 
Instance details

Defined in Numeric.Optimization

Methods

def :: Params a #

Result

data Result a Source #

Optimization result

Constructors

Result 

Fields

Instances

Instances details
Functor Result Source # 
Instance details

Defined in Numeric.Optimization

Methods

fmap :: (a -> b) -> Result a -> Result b #

(<$) :: a -> Result b -> Result a #

Show a => Show (Result a) Source # 
Instance details

Defined in Numeric.Optimization

Methods

showsPrec :: Int -> Result a -> ShowS #

show :: Result a -> String #

showList :: [Result a] -> ShowS #

data Statistics Source #

Statistics of the optimization process

Constructors

Statistics 

Fields

Instances

Instances details
Show Statistics Source # 
Instance details

Defined in Numeric.Optimization

Utilities and Re-export

class Default a where #

A class for types with a default value.

Minimal complete definition

Nothing

Methods

def :: a #

The default value for this type.

Instances

Instances details
Default All 
Instance details

Defined in Data.Default.Class

Methods

def :: All #

Default Any 
Instance details

Defined in Data.Default.Class

Methods

def :: Any #

Default CClock 
Instance details

Defined in Data.Default.Class

Methods

def :: CClock #

Default CDouble 
Instance details

Defined in Data.Default.Class

Methods

def :: CDouble #

Default CFloat 
Instance details

Defined in Data.Default.Class

Methods

def :: CFloat #

Default CInt 
Instance details

Defined in Data.Default.Class

Methods

def :: CInt #

Default CIntMax 
Instance details

Defined in Data.Default.Class

Methods

def :: CIntMax #

Default CIntPtr 
Instance details

Defined in Data.Default.Class

Methods

def :: CIntPtr #

Default CLLong 
Instance details

Defined in Data.Default.Class

Methods

def :: CLLong #

Default CLong 
Instance details

Defined in Data.Default.Class

Methods

def :: CLong #

Default CPtrdiff 
Instance details

Defined in Data.Default.Class

Methods

def :: CPtrdiff #

Default CSUSeconds 
Instance details

Defined in Data.Default.Class

Methods

def :: CSUSeconds #

Default CShort 
Instance details

Defined in Data.Default.Class

Methods

def :: CShort #

Default CSigAtomic 
Instance details

Defined in Data.Default.Class

Methods

def :: CSigAtomic #

Default CSize 
Instance details

Defined in Data.Default.Class

Methods

def :: CSize #

Default CTime 
Instance details

Defined in Data.Default.Class

Methods

def :: CTime #

Default CUInt 
Instance details

Defined in Data.Default.Class

Methods

def :: CUInt #

Default CUIntMax 
Instance details

Defined in Data.Default.Class

Methods

def :: CUIntMax #

Default CUIntPtr 
Instance details

Defined in Data.Default.Class

Methods

def :: CUIntPtr #

Default CULLong 
Instance details

Defined in Data.Default.Class

Methods

def :: CULLong #

Default CULong 
Instance details

Defined in Data.Default.Class

Methods

def :: CULong #

Default CUSeconds 
Instance details

Defined in Data.Default.Class

Methods

def :: CUSeconds #

Default CUShort 
Instance details

Defined in Data.Default.Class

Methods

def :: CUShort #

Default Int16 
Instance details

Defined in Data.Default.Class

Methods

def :: Int16 #

Default Int32 
Instance details

Defined in Data.Default.Class

Methods

def :: Int32 #

Default Int64 
Instance details

Defined in Data.Default.Class

Methods

def :: Int64 #

Default Int8 
Instance details

Defined in Data.Default.Class

Methods

def :: Int8 #

Default Word16 
Instance details

Defined in Data.Default.Class

Methods

def :: Word16 #

Default Word32 
Instance details

Defined in Data.Default.Class

Methods

def :: Word32 #

Default Word64 
Instance details

Defined in Data.Default.Class

Methods

def :: Word64 #

Default Word8 
Instance details

Defined in Data.Default.Class

Methods

def :: Word8 #

Default Ordering 
Instance details

Defined in Data.Default.Class

Methods

def :: Ordering #

Default Integer 
Instance details

Defined in Data.Default.Class

Methods

def :: Integer #

Default () 
Instance details

Defined in Data.Default.Class

Methods

def :: () #

Default Double 
Instance details

Defined in Data.Default.Class

Methods

def :: Double #

Default Float 
Instance details

Defined in Data.Default.Class

Methods

def :: Float #

Default Int 
Instance details

Defined in Data.Default.Class

Methods

def :: Int #

Default Word 
Instance details

Defined in Data.Default.Class

Methods

def :: Word #

(Default a, RealFloat a) => Default (Complex a) 
Instance details

Defined in Data.Default.Class

Methods

def :: Complex a #

Default (First a) 
Instance details

Defined in Data.Default.Class

Methods

def :: First a #

Default (Last a) 
Instance details

Defined in Data.Default.Class

Methods

def :: Last a #

Default a => Default (Dual a) 
Instance details

Defined in Data.Default.Class

Methods

def :: Dual a #

Default (Endo a) 
Instance details

Defined in Data.Default.Class

Methods

def :: Endo a #

Num a => Default (Product a) 
Instance details

Defined in Data.Default.Class

Methods

def :: Product a #

Num a => Default (Sum a) 
Instance details

Defined in Data.Default.Class

Methods

def :: Sum a #

Integral a => Default (Ratio a) 
Instance details

Defined in Data.Default.Class

Methods

def :: Ratio a #

Default a => Default (IO a) 
Instance details

Defined in Data.Default.Class

Methods

def :: IO a #

Default (Params a) Source # 
Instance details

Defined in Numeric.Optimization

Methods

def :: Params a #

Default (Maybe a) 
Instance details

Defined in Data.Default.Class

Methods

def :: Maybe a #

Default [a] 
Instance details

Defined in Data.Default.Class

Methods

def :: [a] #

Default r => Default (e -> r) 
Instance details

Defined in Data.Default.Class

Methods

def :: e -> r #

(Default a, Default b) => Default (a, b) 
Instance details

Defined in Data.Default.Class

Methods

def :: (a, b) #

(Default a, Default b, Default c) => Default (a, b, c) 
Instance details

Defined in Data.Default.Class

Methods

def :: (a, b, c) #

(Default a, Default b, Default c, Default d) => Default (a, b, c, d) 
Instance details

Defined in Data.Default.Class

Methods

def :: (a, b, c, d) #

(Default a, Default b, Default c, Default d, Default e) => Default (a, b, c, d, e) 
Instance details

Defined in Data.Default.Class

Methods

def :: (a, b, c, d, e) #

(Default a, Default b, Default c, Default d, Default e, Default f) => Default (a, b, c, d, e, f) 
Instance details

Defined in Data.Default.Class

Methods

def :: (a, b, c, d, e, f) #

(Default a, Default b, Default c, Default d, Default e, Default f, Default g) => Default (a, b, c, d, e, f, g) 
Instance details

Defined in Data.Default.Class

Methods

def :: (a, b, c, d, e, f, g) #

class Optionally c where Source #

Optional constraint

Instances

Instances details
Optionally (HasGrad prob) => Optionally (HasGrad (WithBounds prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad prob) => Optionally (HasGrad (WithConstraints prob)) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => Optionally (HasGrad (WithGrad prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad prob) => Optionally (HasGrad (WithHessian prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasGrad (Vector Double -> Double)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian prob) => Optionally (HasHessian (WithBounds prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian prob) => Optionally (HasHessian (WithConstraints prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian prob) => Optionally (HasHessian (WithGrad prob)) Source # 
Instance details

Defined in Numeric.Optimization

IsProblem prob => Optionally (HasHessian (WithHessian prob)) Source # 
Instance details

Defined in Numeric.Optimization

Optionally (HasHessian (Vector Double -> Double)) Source # 
Instance details

Defined in Numeric.Optimization

hasOptionalDict :: c => Maybe (Dict c) Source #

Utility function to define Optionally instances
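As a usage sketch, the hypothetical problem type below implements IsProblem and HasGrad by hand and then provides the two Optionally instances that minimize requires: hasOptionalDict supplies the dictionary for the capability the problem has, and Nothing is used for the one it lacks. Note that the name of the class method, optionalDict, is an assumption here, since the class body is omitted from the listing above.

{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE OverloadedLists #-}

import qualified Data.Vector.Storable as VS
import Numeric.Optimization

-- Hypothetical problem: minimize the sum of (x_i - 1)^2.
data MyProblem = MyProblem

instance IsProblem MyProblem where
  func _ v = VS.sum (VS.map (\xi -> (xi - 1) ** 2) v)

instance HasGrad MyProblem where
  grad _ = VS.map (\xi -> 2 * (xi - 1))

-- Assumption: the single method of Optionally is named optionalDict.
instance Optionally (HasGrad MyProblem) where
  optionalDict = hasOptionalDict   -- a gradient is available

instance Optionally (HasHessian MyProblem) where
  optionalDict = Nothing           -- no Hessian information is provided

main :: IO ()
main = do
  result <- minimize LBFGS def MyProblem [5, -5]
  print (resultSolution result)  -- should be close to [1, 1]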