Safe Haskell: None

Numerically solve convex Lagrange multiplier problems with conjugate gradient descent.

Here is an example from the Wikipedia page on Lagrange multipliers: maximize f(x, y) = x + y subject to the constraint x^2 + y^2 = 1.

`>>> maximize 0.00001 (\[x, y] -> x + y) [(\[x, y] -> x^2 + y^2) <=> 1] 2`

`Right ([0.707,0.707], [-0.707])`

The first element of the result pair holds the arguments of the objective function at the optimum; the second holds the Lagrange multipliers.
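As a sanity check on that result, independent of the solver, we can evaluate the objective and the constraint at the reported optimum x = y = 1/sqrt 2 ≈ 0.707:

```haskell
-- Standalone check of the Wikipedia example (does not use the solver):
-- at the reported optimum x = y = 1 / sqrt 2, the constraint
-- x^2 + y^2 = 1 holds and the objective x + y attains sqrt 2.
main :: IO ()
main = do
  let x = 1 / sqrt 2
      f [a, b] = a + b          -- objective
      g [a, b] = a ^ 2 + b ^ 2  -- constraint left-hand side
  print (g [x, x])  -- ≈ 1.0, so the point is on the circle
  print (f [x, x])  -- ≈ 1.414, i.e. sqrt 2
```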

- type AD2 s r a = AD s (AD r Double)
- (<=>) :: ([a] -> a) -> a -> Constraint a
- type Constraint a = ([a] -> a, a)
- maximize :: Double -> (forall s r. (Mode s, Mode r) => [AD2 s r Double] -> AD2 s r Double) -> (forall s r. (Mode s, Mode r) => [Constraint (AD2 s r Double)]) -> Int -> Either (Result, Statistics) (Vector Double, Vector Double)
- minimize :: Double -> (forall s r. (Mode s, Mode r) => [AD2 s r Double] -> AD2 s r Double) -> (forall s r. (Mode s, Mode r) => [Constraint (AD2 s r Double)]) -> Int -> Either (Result, Statistics) (Vector Double, Vector Double)
- feasible :: (forall s r. (Mode s, Mode r) => [AD2 s r Double] -> AD2 s r Double) -> (forall s r. (Mode s, Mode r) => [Constraint (AD2 s r Double)]) -> [Double] -> Bool

### Helper types

(<=>) :: ([a] -> a) -> a -> Constraint a

This is just a little bit of sugar for (,) to make constraints look like equations.

type Constraint a = ([a] -> a, a)

The type for the constraints. Given a constraint `g(x, y, ...) = c`, we would represent it as `(g, c)`, or with sugar, `g <=> c`.
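Since a `Constraint a` is just a pair, `(<=>)` can be defined as `(,)` itself. A minimal self-contained sketch mirroring the types above (not the package's source):

```haskell
type Constraint a = ([a] -> a, a)

-- (<=>) is just sugar for the pair constructor, so that
-- (\[x, y] -> x^2 + y^2) <=> 1 reads like the equation g(x, y) = 1.
(<=>) :: ([a] -> a) -> a -> Constraint a
(<=>) = (,)

main :: IO ()
main = do
  let (g, c) = (\[x, y] -> x ^ 2 + y ^ 2) <=> (1 :: Double)
  print (g [0.6, 0.8], c)  -- ≈ (1.0, 1.0): the point (0.6, 0.8) satisfies g = 1
```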

## Solver

`maximize`

| Type | Description |
| --- | --- |
| `Double` | |
| `forall s r. (Mode s, Mode r) => [AD2 s r Double] -> AD2 s r Double` | The function to maximize |
| `forall s r. (Mode s, Mode r) => [Constraint (AD2 s r Double)]` | The constraints, as pairs |
| `Int` | The arity of the objective function, which should equal the arity of the constraints |
| `Either (Result, Statistics) (Vector Double, Vector Double)` | Either an explanation of why the gradient descent failed, or a pair containing the arguments at the optimum and the Lagrange multipliers |

Finding the maximum is the same as finding the minimum with the objective function negated.
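That equivalence can be illustrated without the library: over any finite set of candidates, maximizing `f` is the same as minimizing `negate . f`. A toy illustration, not the package's implementation:

```haskell
import Data.List (maximumBy, minimumBy)
import Data.Ord (comparing)

-- Toy illustration: the argmax of f equals the argmin of (negate . f).
argmaxViaMin :: (Double -> Double) -> [Double] -> Double
argmaxViaMin f = minimumBy (comparing (negate . f))

main :: IO ()
main = do
  let xs = [-1, -0.99 .. 1]
      f x = x - x * x  -- maximized near x = 0.5
  print (argmaxViaMin f xs == maximumBy (comparing f) xs)  -- True
```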

`minimize`

| Type | Description |
| --- | --- |
| `Double` | |
| `forall s r. (Mode s, Mode r) => [AD2 s r Double] -> AD2 s r Double` | The function to minimize |
| `forall s r. (Mode s, Mode r) => [Constraint (AD2 s r Double)]` | The constraints, as pairs |
| `Int` | The arity of the objective function, which should equal the arity of the constraints |
| `Either (Result, Statistics) (Vector Double, Vector Double)` | Either an explanation of why the gradient descent failed, or a pair containing the arguments at the minimum and the Lagrange multipliers |

This is the Lagrange multiplier solver. It assumes that the objective function and all of the constraints take the same number of arguments.
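For a hand-checkable instance of the kind of problem `minimize` solves: minimize f(x, y) = x^2 + y^2 subject to x + y = 2. Stationarity of the Lagrangian gives 2x = l and 2y = l, so x = y, and the constraint forces x = y = 1. The check below is worked by hand and does not use the package:

```haskell
-- Worked check of a small constrained minimum, solved by hand:
-- minimize x^2 + y^2 subject to x + y = 2.
-- Stationarity (2x = l, 2y = l) gives x = y; the constraint
-- then forces x = y = 1, with multiplier l = 2.
main :: IO ()
main = do
  let (x, y) = (1, 1) :: (Double, Double)
      l = 2 :: Double
  print (x ^ 2 + y ^ 2)             -- 2.0, the minimum value
  print (x + y)                     -- 2.0, constraint satisfied
  print (2 * x == l && 2 * y == l)  -- True, stationarity holds
```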

### Experimental features

feasible :: (forall s r. (Mode s, Mode r) => [AD2 s r Double] -> AD2 s r Double) -> (forall s r. (Mode s, Mode r) => [Constraint (AD2 s r Double)]) -> [Double] -> Bool

WARNING: Experimental. This is not a true feasibility test for the function; I am not sure exactly how to implement that. It just checks feasibility at a point. If this ever returns False, `solve` can fail.
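A plausible reading of "checks feasibility at a point" is that every constraint g(xs) = c must hold, up to a tolerance, at the given point. A self-contained sketch under that assumption (the name `feasibleAt` and the tolerance parameter are illustrative, not the package's API):

```haskell
type Constraint a = ([a] -> a, a)

-- Hypothetical sketch of a pointwise feasibility check: the point xs is
-- feasible when every constraint (g, c) satisfies g xs = c up to a tolerance.
feasibleAt :: Double -> [Constraint Double] -> [Double] -> Bool
feasibleAt tol cs xs = all holds cs
  where
    holds (g, c) = abs (g xs - c) <= tol

main :: IO ()
main = do
  let circle = (\[x, y] -> x ^ 2 + y ^ 2, 1)
  print (feasibleAt 1e-3 [circle] [0.707, 0.707])  -- True: on the unit circle
  print (feasibleAt 1e-3 [circle] [1, 1])          -- False: x^2 + y^2 = 2
```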