lagrangian - Solve Lagrange multiplier problems

Safe Haskell: None




Numerically solve convex Lagrange multiplier problems with conjugate gradient descent.

Here is an example from the Wikipedia page on Lagrange multipliers: maximize f(x, y) = x + y, subject to the constraint x^2 + y^2 = 1.

>>> solve 0.00001 (\[x, y] -> x + y) [\[x, y] -> x^2 + y^2 <=> 1] 2
Right ([0.707,0.707], [-0.707])

The first element of the result pair holds the arguments of the objective function at the optimum. The second element holds the Lagrange multipliers.
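The reported answer can be checked by hand. Assuming the usual sign convention L(x, y, λ) = f(x, y) + λ (g(x, y) − c) (an assumption about solve's internals, not stated above), the returned point together with the returned multiplier should zero out every partial derivative of the Lagrangian:

```haskell
import Control.Monad (forM_)

-- point and multiplier reported by solve (0.707 ≈ 1/sqrt 2)
x, y, lam :: Double
x   =  1 / sqrt 2
y   =  1 / sqrt 2
lam = -1 / sqrt 2

-- partial derivatives of L(x, y, λ) = (x + y) + λ (x² + y² − 1)
gradL :: [Double]
gradL =
  [ 1 + 2 * lam * x        -- ∂L/∂x
  , 1 + 2 * lam * y        -- ∂L/∂y
  , x * x + y * y - 1      -- ∂L/∂λ
  ]

main :: IO ()
main = forM_ gradL $ \d -> print (abs d < 1e-9)
```

All three partials vanish at ([0.707, 0.707], [-0.707]), which is exactly the stationarity condition a Lagrange multiplier solver looks for.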


Helper types

type AD2 s r a = AD s (AD r Double)

(<=>) :: ([a] -> a) -> a -> Constraint a

This is just a little bit of sugar for (,) to make constraints look like equations.

type Constraint a = ([a] -> a, a)

The type for constraints. Given a constraint g(x, y, ...) = c, we represent it as the pair (g, c), or with sugar as g <=> c.
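Since Constraint and (<=>) are given above, the two spellings of the same constraint can be shown side by side. The definitions below simply mirror the documentation, so the snippet runs without the package itself:

```haskell
-- Mirrors the docs: a constraint g(x, y, ...) = c is the pair (g, c),
-- and (<=>) is sugar for (,).
type Constraint a = ([a] -> a, a)

(<=>) :: ([a] -> a) -> a -> Constraint a
(<=>) = (,)

-- x² + y² = 1, written both ways; the two are identical values
circleSugar, circlePair :: Constraint Double
circleSugar = (\[x, y] -> x^2 + y^2) <=> 1
circlePair  = (\[x, y] -> x^2 + y^2, 1)

main :: IO ()
main = do
  let (g, c) = circleSugar
  -- evaluate the constraint function at a point on the unit circle
  print (abs (g [0.6, 0.8] - c) < 1e-12)
```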




solve

:: Double
-> (forall s r. (Mode s, Mode r) => [AD2 s r Double] -> AD2 s r Double)

The function to minimize

-> (forall s r. (Mode s, Mode r) => [Constraint (AD2 s r Double)])

The constraints as pairs g <=> c which represent equations of the form g(x, y, ...) = c

-> Int

The arity of the objective function, which should equal the arity of each constraint.

-> Either (Result, Statistics) (Vector Double, Vector Double)

Either an explanation of why the gradient descent failed, or a pair containing the arguments at the minimum and the Lagrange multipliers.

This is the Lagrange multiplier solver. It is assumed that the objective function and all of the constraints take the same number of arguments.
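The idea behind the solver can be sketched without the package. Assuming it searches for a stationary point of the Lagrangian L(x, y, λ) = f(x, y) + λ (g(x, y) − c) by driving ‖∇L‖² to zero (the real package uses automatic differentiation and nonlinear conjugate gradient; this sketch uses hand-written partials and plain fixed-step gradient descent), the Wikipedia example looks like:

```haskell
-- ∇L for f = x + y, g = x² + y², c = 1
gradL :: [Double] -> [Double]
gradL [x, y, lam] =
  [ 1 + 2 * lam * x        -- ∂L/∂x
  , 1 + 2 * lam * y        -- ∂L/∂y
  , x * x + y * y - 1      -- ∂L/∂λ
  ]
gradL _ = error "expected exactly [x, y, lambda]"

-- squared gradient norm; any stationary point of L makes this zero
sqNorm :: [Double] -> Double
sqNorm v = sum [d * d | d <- gradL v]

-- numerical gradient of sqNorm by central differences
numGrad :: [Double] -> [Double]
numGrad v =
  [ (sqNorm (bump i h) - sqNorm (bump i (-h))) / (2 * h)
  | i <- [0 .. length v - 1] ]
  where
    h = 1e-6
    bump i d = [ if j == i then vj + d else vj | (j, vj) <- zip [0 ..] v ]

-- fixed-step gradient descent on ‖∇L‖²
descend :: Int -> Double -> [Double] -> [Double]
descend 0 _    v = v
descend n step v =
  descend (n - 1) step (zipWith (\vi gi -> vi - step * gi) v (numGrad v))

-- start near the answer; converges to ([0.707, 0.707], λ = -0.707)
solution :: [Double]
solution = descend 5000 2.0e-2 [0.6, 0.6, -0.6]

main :: IO ()
main = print solution
```

This is only an illustration of the fixed-point condition the solver targets; the package's actual descent strategy, stopping rule, and use of AD differ.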

Experimental features

feasible :: (forall s r. (Mode s, Mode r) => [AD2 s r Double] -> AD2 s r Double) -> (forall s r. (Mode s, Mode r) => [Constraint (AD2 s r Double)]) -> [Double] -> Bool

WARNING: Experimental. This is not a true feasibility test for the problem; I am not sure exactly how to implement that. This just checks feasibility at a point. If this ever returns False, solve can fail.
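A pointwise check of this kind amounts to testing whether every constraint holds at the given point, up to a tolerance. A minimal sketch under that assumption (feasibleAt, its tolerance argument, and the plain-Double representation are illustrative, not the package's actual internals):

```haskell
-- Mirrors the documented constraint representation
type Constraint a = ([a] -> a, a)

-- True when every constraint g(x, ...) = c holds at the point, within tol.
-- This says nothing about whether the problem as a whole is feasible.
feasibleAt :: Double -> [Constraint Double] -> [Double] -> Bool
feasibleAt tol constraints point =
  all (\(g, c) -> abs (g point - c) < tol) constraints

main :: IO ()
main = do
  let circle = [(\[x, y] -> x * x + y * y, 1)]
  print (feasibleAt 1e-3 circle [1 / sqrt 2, 1 / sqrt 2])  -- on the circle
  print (feasibleAt 1e-3 circle [1, 1])                    -- off the circle
```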