eigen-1.0.0: Haskell binding for the Eigen library

Safe Haskell: None




The problem: You have a system of equations that you have written as a single matrix equation

Ax = b

where A and b are matrices (b could be a vector, as a special case). You want to find a solution x.

The solution: You can choose between various decompositions, depending on what your matrix A looks like, and depending on whether you favor speed or accuracy. However, let's start with an example that works in all cases, and is a good compromise:

import Data.Eigen.Matrix
import Data.Eigen.LA

main = do
    let a = fromList [[1,2,3], [4,5,6], [7,8,10]]
        b = fromList [[3],[3],[4]]
        x = solve ColPivHouseholderQR a b
    putStrLn "Here is the matrix A:" >> print a
    putStrLn "Here is the vector b:" >> print b
    putStrLn "The solution is:" >> print x

produces the following output

Here is the matrix A:
Matrix 3x3
1.0 2.0 3.0
4.0 5.0 6.0
7.0 8.0 10.0

Here is the vector b:
Matrix 3x1
3.0
3.0
4.0

The solution is:
Matrix 3x1
-2.0
1.0
1.0

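As an illustration of what solve is doing for this example, here is a minimal Gaussian-elimination sketch in plain Haskell. This is only a sketch: eigen delegates to the C++ Eigen decompositions, and a real solver would pivot for numerical stability.

```haskell
-- Minimal Gaussian elimination without pivoting, on plain lists.
-- Illustration only: eigen delegates to the C++ Eigen decompositions,
-- and a production solver would pivot for numerical stability.
solveGauss :: [[Double]] -> [Double] -> [Double]
solveGauss a b = backSub (triangulate (zipWith (\r y -> r ++ [y]) a b))
  where
    -- forward elimination on the augmented matrix
    triangulate [] = []
    triangulate (r : rs) = r : map (0 :) (triangulate (map reduce rs))
      where
        reduce row =
            let f = head row / head r
            in  zipWith (\p q -> q - f * p) (tail r) (tail row)

    -- back substitution, solving from the last row upwards
    backSub = foldr step []
      where
        step row xs =
            let n     = length row - 1      -- number of unknowns
                k     = n - length xs - 1   -- this row's pivot column
                pivot = row !! k
                dotXs = sum (zipWith (*) (drop (k + 1) (init row)) xs)
            in  (last row - dotXs) / pivot : xs

main :: IO ()
main = print (solveGauss [[1,2,3],[4,5,6],[7,8,10]] [3,3,4])
```

For this well-conditioned 3x3 system the sketch recovers the exact solution x = [-2, 1, 1].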
Checking if a solution really exists: Only you know what error margin you want to allow for a solution to be considered valid.

You can compute the relative error using the formula norm (ax - b) / norm b, or use the relativeError function, which performs the same calculation slightly more efficiently.
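On plain lists, that formula can be sketched as follows (the library's relativeError performs the equivalent computation on Matrix values):

```haskell
-- L2 norm of a vector
norm :: [Double] -> Double
norm = sqrt . sum . map (^ (2 :: Int))

-- relative error:  norm (a x - b) / norm b
relErr :: [[Double]] -> [Double] -> [Double] -> Double
relErr a x b = norm (zipWith (-) (map (`dot` x) a) b) / norm b
  where dot u v = sum (zipWith (*) u v)

main :: IO ()
main = print (relErr [[1,2,3],[4,5,6],[7,8,10]] [-2,1,1] [3,3,4])
```

For the exact solution [-2, 1, 1] of the example system this yields 0.0; a solution is then accepted when the error falls below whatever tolerance you choose.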


Basic linear solving

data Decomposition

Decomposition           Requirements on the matrix          Speed   Accuracy

PartialPivLU            Invertible                          ++      +
FullPivLU               None                                -       +++
HouseholderQR           None                                ++      +
ColPivHouseholderQR     None                                +       ++
FullPivHouseholderQR    None                                -       +++
LLT                     Positive definite                   +++     +
LDLT                    Positive or negative semidefinite   +++     ++
JacobiSVD               None                                -       +++

The best way to do least-squares solving for square matrices is with an SVD decomposition (JacobiSVD).



PartialPivLU
    LU decomposition of a matrix with partial pivoting.

FullPivLU
    LU decomposition of a matrix with complete pivoting.

HouseholderQR
    Householder QR decomposition of a matrix.

ColPivHouseholderQR
    Householder rank-revealing QR decomposition of a matrix with column pivoting.

FullPivHouseholderQR
    Householder rank-revealing QR decomposition of a matrix with full pivoting.

LLT
    Standard Cholesky decomposition (LL^T) of a matrix.

LDLT
    Robust Cholesky decomposition of a matrix with pivoting.

JacobiSVD
    Two-sided Jacobi SVD decomposition of a rectangular matrix.

solve :: Decomposition -> Matrix -> Matrix -> Matrix

x = solve d a b
finds a solution x of the equation ax = b using decomposition d

relativeError :: Matrix -> Matrix -> Matrix -> Double

e = relativeError x a b
computes norm (ax - b) / norm b, where norm is the L2 norm

Multiple linear regression

linearRegression :: [[Double]] -> ([Double], Double)

(coeffs, error) = linearRegression points
computes multiple linear regression y = a1 x1 + a2 x2 + ... + an xn + b using ColPivHouseholderQR decomposition
  • point format is [y, x1..xn]
  • coeffs format is [b, a1..an]
  • error is calculated using relativeError
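To make the point and coefficient layout concrete, here is a pure-Haskell sketch of the single-predictor case. The fitLine helper below is hypothetical and uses the closed-form least-squares formulas rather than ColPivHouseholderQR, but it follows the same [y, x1] point format and [b, a1] coefficient format.

```haskell
-- Hypothetical helper: least-squares fit of y = a1 x1 + b for points
-- given in the documented [y, x1] format, returning [b, a1].
-- Uses the closed-form formulas, unlike linearRegression itself.
fitLine :: [[Double]] -> [Double]
fitLine points = [b, a1]
  where
    ys = map head points            -- y  is the first element
    xs = map (!! 1) points          -- x1 is the second element
    n  = fromIntegral (length points)
    mean vs  = sum vs / n
    (mx, my) = (mean xs, mean ys)
    a1 = sum (zipWith (\x y -> (x - mx) * (y - my)) xs ys)
       / sum (map (\x -> (x - mx) ^ (2 :: Int)) xs)
    b  = my - a1 * mx

main :: IO ()
main = print (fitLine [[3,1],[5,2],[7,3]])   -- points on y = 2 x + 1
```

Here the points lie exactly on y = 2 x + 1, so the coefficients come back as [1.0, 2.0] in the [b, a1] layout.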
import Data.Eigen.LA
main = print $ linearRegression [
    [-4.32, 3.02, 6.89],
    [-3.79, 2.01, 5.39],
    [-4.01, 2.41, 6.01],
    [-3.86, 2.09, 5.55],
    [-4.10, 2.58, 6.32]]

produces the following output