regress-0.1.1: Linear and logistic regression through automatic differentiation

Safe Haskell: None




type Model f a = f a

A model using the given f to store parameters of type a. Throughout this package, it can be thought of as some kind of vector.
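Concretely, instantiating f with Data.Vector's Vector gives a model whose parameters live in a plain vector (a hypothetical example, not part of the library):

```haskell
import qualified Data.Vector as V

-- The library's alias, repeated here so the snippet stands alone.
type Model f a = f a

-- A concrete model: three Double parameters stored in a Vector.
exampleModel :: Model V.Vector Double
exampleModel = V.fromList [0.5, -1.2, 3.0]
```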

compute

:: (Applicative v, Foldable v, Num a)
=> Model v a    -- ^ theta vector, the model's parameters
-> v a          -- ^ x vector, with the observed numbers
-> a            -- ^ predicted y for this observation

Compute the predicted value for the given model on the given observation.
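For a linear model, this prediction amounts to the dot product of the parameter vector with the observation. A sketch on concrete Vectors (computeSketch is a hypothetical stand-in, not the library's actual definition, which works through the Applicative and Foldable constraints):

```haskell
import qualified Data.Vector as V

-- Hypothetical stand-in for compute on concrete Vectors:
-- the dot product of the parameters theta with the observation x.
computeSketch :: Num a => V.Vector a -> V.Vector a -> a
computeSketch theta x = V.sum (V.zipWith (*) theta x)

-- With theta = [1, 2, 3] and x = [1, 1, 1]:
-- 1*1 + 2*1 + 3*1 = 6
dotExample :: Double
dotExample = computeSketch (V.fromList [1, 2, 3]) (V.fromList [1, 1, 1])
```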

regress

:: (Traversable v, Applicative v, Foldable v, Applicative f, Foldable f, Ord a, Floating a)
=> f a          -- ^ expected y value for each observation
-> f (v a)      -- ^ input data for each observation
-> Model v a    -- ^ initial parameters for the model, from which we'll improve
-> [Model v a]  -- ^ a stream of increasingly accurate values for the model's parameters, fitting the observations better and better

Given some observed "predictions" ys, the corresponding input values xs and initial values theta0 for the model's parameters,

regress ys xs theta0

returns a stream of parameter values that fit the data better and better.
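The library derives its gradients by automatic differentiation; as a rough, self-contained sketch of the kind of stream this produces, here is hand-written gradient descent on the mean squared error of a linear model (regressSketch, the squared-error loss, and the learning rate are all assumptions for illustration, not the library's actual code):

```haskell
import qualified Data.Vector as V

-- Hand-written gradient descent on mean squared error for a linear
-- model; a sketch of the kind of stream regress produces. The real
-- library obtains the gradient by automatic differentiation, and the
-- learning rate below is an arbitrary choice.
regressSketch :: V.Vector Double            -- ys: expected outputs
              -> V.Vector (V.Vector Double) -- xs: one input vector per observation
              -> V.Vector Double            -- theta0: initial parameters
              -> [V.Vector Double]          -- stream of improving parameters
regressSketch ys xs = iterate step
  where
    rate = 0.1
    predict theta x = V.sum (V.zipWith (*) theta x)
    step theta =
      let errs   = V.zipWith (\y x -> predict theta x - y) ys xs
          n      = fromIntegral (V.length ys)
          grad j = (2 / n) * V.sum (V.zipWith (\e x -> e * (x V.! j)) errs xs)
      in V.imap (\j t -> t - rate * grad j) theta
```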


import qualified Data.Vector as V

-- the theta we're approximating
realtheta :: Model V.Vector Double
realtheta = V.fromList [1.0, 2.0, 3.0]

-- let's start here and have regress
-- produce values that better fit the input data
theta0 :: Model V.Vector Double
theta0 = V.fromList [0.2, 3.0, 2.23]

-- input data: (output value, vector of values for each input)
ys_ex :: V.Vector Double
xs_ex :: V.Vector (V.Vector Double)
(ys_ex, xs_ex) = V.unzip . V.fromList $
  [ (3, V.fromList [0, 0, 1])
  , (1, V.fromList [1, 0, 0])
  , (2, V.fromList [0, 1, 0])
  , (6, V.fromList [1, 1, 1])
  ]

-- stream of increasingly accurate parameters
thetaApproxs :: [Model V.Vector Double]
thetaApproxs = regress ys_ex xs_ex theta0