mltool-0.2.0.1: Machine Learning Toolbox

MachineLearning.Regression


# Documentation

class Model a where

Minimal complete definition

Methods

hypothesis :: a -> Matrix -> Vector -> Vector

Hypothesis function, a.k.a. score function (for classification problems). Takes X (m x n) and theta (n x 1), returns y (m x 1).
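For a least-squares model the hypothesis is simply the matrix-vector product X * theta. A minimal sketch, using plain lists in place of the library's hmatrix-backed Matrix and Vector (the list representation is an assumption for illustration only):

```haskell
-- Illustrative stand-ins for the library's Matrix / Vector types.
type Vec = [Double]
type Mat = [[Double]]  -- matrix as a list of rows

-- Least-squares hypothesis: y_hat = X * theta.
hypothesisLS :: Mat -> Vec -> Vec
hypothesisLS x theta = map (dot theta) x
  where dot a b = sum (zipWith (*) a b)

main :: IO ()
main = print (hypothesisLS [[1, 2], [1, 3]] [0.5, 1.0])
```

Each output component is the dot product of one row of X with theta, so the result has one prediction per training example.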

cost :: a -> Regularization -> Matrix -> Vector -> Vector -> R

Cost function J(theta), a.k.a. loss function. It takes the regularization parameter, matrix X (m x n), vector y (m x 1) and vector theta (n x 1).
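For least squares, one common form of this cost is the mean squared error plus an optional L2 penalty. A sketch under that assumption (the exact scaling convention and list-based types are not taken from the library):

```haskell
-- Regularized least-squares cost, one common convention:
--   J(theta) = (1 / (2m)) * ( sum_i (h_i - y_i)^2 + lambda * sum_{j>=1} theta_j^2 )
-- The bias term theta_0 is conventionally excluded from the penalty.
costLS :: Double -> [[Double]] -> [Double] -> [Double] -> Double
costLS lambda x y theta = (sqErr + lambda * pen) / (2 * m)
  where
    m     = fromIntegral (length x)
    h     = map (\row -> sum (zipWith (*) row theta)) x
    sqErr = sum [ (hi - yi) ^ 2 | (hi, yi) <- zip h y ]
    pen   = sum (map (^ 2) (drop 1 theta))  -- skip theta_0

main :: IO ()
main = print (costLS 2 [[1, 0], [1, 1]] [1, 2] [1, 1])
```

With lambda = 0 this reduces to the unregularized mean squared error (halved, the usual convention that makes the gradient clean).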

gradient :: a -> Regularization -> Matrix -> Vector -> Vector -> Vector

Gradient function. It takes the regularization parameter, X (m x n), y (m x 1) and theta (n x 1). Returns the vector of gradients (n x 1).
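The gradient that matches the regularized least-squares cost above can be sketched the same way (again a list-based illustration, with the bias column excluded from the penalty; not the library's implementation):

```haskell
-- Gradient of the regularized least-squares cost:
--   grad_j = (1/m) * ( sum_i (h_i - y_i) * x_ij + lambda * theta_j )
-- with no lambda term for the bias component (j = 0).
gradientLS :: Double -> [[Double]] -> [Double] -> [Double] -> [Double]
gradientLS lambda x y theta =
  [ (err `dot` col j + reg j) / m | j <- [0 .. n - 1] ]
  where
    m       = fromIntegral (length x)
    n       = length theta
    h       = map (`dot` theta) x          -- predictions
    err     = zipWith (-) h y              -- residuals
    col j   = map (!! j) x                 -- j-th column of X
    reg j   = if j == 0 then 0 else lambda * theta !! j
    dot a b = sum (zipWith (*) a b)

main :: IO ()
main = print (gradientLS 0 [[1, 0], [1, 1]] [1, 2] [0, 0])
```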

Instances


Constructors

 LeastSquares

Instances


Constructors

 GradientDescent R
   Gradient descent; takes alpha. Requires feature normalization.
 MinibatchGradientDescent Int Int R
   Minibatch gradient descent; takes seed, batch size and alpha.
 ConjugateGradientFR R R
   Fletcher-Reeves conjugate gradient algorithm; takes the size of the first trial step (0.1 is fine) and tol (0.1 is fine).
 ConjugateGradientPR R R
   Polak-Ribiere conjugate gradient algorithm; takes the size of the first trial step (0.1 is fine) and tol (0.1 is fine).
 BFGS2 R R
   Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm; takes the size of the first trial step (0.1 is fine) and tol (0.1 is fine).

Arguments

 :: Model a
 => MinimizeMethod
 -> a                  model (Least Squares, Logistic Regression etc.)
 -> R                  epsilon, desired precision of the solution
 -> Int                maximum number of iterations allowed
 -> Regularization     regularization parameter
 -> Matrix             X
 -> Vector             y
 -> Vector             initial solution, theta
 -> (Vector, Matrix)   solution vector and optimization path

Returns the solution vector (theta) and the optimization path. Each row of the optimization path has the format: [iteration number, cost function value, theta values...]
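The optimization-path convention above can be illustrated with a plain gradient-descent loop. This is a self-contained sketch, not the library's minimizer; the fixed step size, iteration-count stopping rule, and list-based types are all assumptions for illustration:

```haskell
-- Gradient descent for unregularized least squares that records the
-- optimization path in the row format described above:
--   [iteration number, cost function value, theta values ...]
descend :: Double -> Int -> [[Double]] -> [Double] -> [Double]
        -> ([Double], [[Double]])
descend alpha iters x y theta0 = go 0 theta0 []
  where
    go i theta path
      | i >= iters = (theta, reverse path)
      | otherwise  =
          let g      = grad theta
              theta' = zipWith (\t gj -> t - alpha * gj) theta g
              r      = fromIntegral i : cost theta : theta
          in go (i + 1) theta' (r : path)
    m      = fromIntegral (length x)
    h t    = map (\row -> sum (zipWith (*) row t)) x
    cost t = sum [ (hi - yi) ^ 2 | (hi, yi) <- zip (h t) y ] / (2 * m)
    grad t = [ sum (zipWith (*) (zipWith (-) (h t) y) (map (!! j) x)) / m
             | j <- [0 .. length t - 1] ]

main :: IO ()
main = print (descend 1 2 [[1], [1]] [1, 1] [0])
```

The returned path lets callers plot cost against iteration number to check convergence, which is the main use of the second component of the result.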

Normal equation using the inverse; does not require feature normalization. It takes X and y and returns theta.

Normal equation using the pseudo-inverse; requires feature normalization. It takes X and y and returns theta.
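Both variants solve theta = (X'X)^-1 X'y in closed form. For a two-column design matrix (intercept plus one feature) the 2x2 inverse can be written out explicitly; a self-contained sketch with list-based types assumed for illustration:

```haskell
-- Normal equation theta = (X'X)^-1 X'y, specialized to a 2-column
-- design matrix (intercept + one feature) so the 2x2 inverse is explicit.
normalEq2 :: [[Double]] -> [Double] -> [Double]
normalEq2 x y = [ ( s11 * p0 - s01 * p1) / det
                , (-s01 * p0 + s00 * p1) / det ]
  where
    c0  = map (!! 0) x        -- intercept column
    c1  = map (!! 1) x        -- feature column
    s00 = dot c0 c0           -- entries of X'X
    s01 = dot c0 c1
    s11 = dot c1 c1
    p0  = dot c0 y            -- entries of X'y
    p1  = dot c1 y
    det = s00 * s11 - s01 * s01
    dot a b = sum (zipWith (*) a b)

main :: IO ()
main = print (normalEq2 [[1, 0], [1, 1], [1, 2]] [1, 3, 5])
```

Fitting the points (0,1), (1,3), (2,5), which lie exactly on y = 1 + 2x, recovers theta = [1, 2]. In general the pseudo-inverse variant is the numerically safer choice when X'X is singular or ill-conditioned.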

Regularization

Constructors

 RegNone
   No regularization
 L2 R
   L2 regularization
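The two constructors affect the cost only through the penalty term. A sketch of one way they can be interpreted (the data type is restated locally with R as Double, and the convention of excluding the bias term is an assumption, not taken from the library):

```haskell
-- Local restatement for illustration; R is hmatrix's alias for Double.
data Regularization = RegNone | L2 Double

-- Penalty added to the cost; theta_0 (bias) is conventionally excluded.
penalty :: Regularization -> [Double] -> Double
penalty RegNone     _     = 0
penalty (L2 lambda) theta = lambda * sum (map (^ 2) (drop 1 theta))

main :: IO ()
main = print (penalty (L2 2) [3, 1, 2])
```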