# declarative: DIY Markov Chains



Build composite Markov transition operators from existing ones for fun and profit.

A useful strategy is to hedge one's sampling risk by occasionally interleaving a computationally expensive transition (a gradient-based algorithm such as Hamiltonian Monte Carlo or NUTS, say) with cheap Metropolis transitions:

    transition = frequency [
        (9, metropolis 1.0)
      , (1, hamiltonian 0.05 20)
      ]

Alternatively, sample consecutively using the same algorithm, but over a range of different proposal distributions:

    transition = concatAllT [
        slice 0.5
      , slice 1.0
      , slice 2.0
      ]

Or just mix and match transitions and see what happens:

    transition =
      sampleT
        (sampleT (metropolis 0.5) (slice 0.1))
        (sampleT (hamiltonian 0.01 20) (metropolis 2.0))

Check the test suite for example usage.
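For orientation, a full sampling run might look like the following sketch. It composes a transition with `frequency` as above and drives it with the package's `mcmc` function over a toy log-density; the `rosenbrock` target is illustrative, and exact signatures may differ between versions:

    import Numeric.MCMC

    -- An illustrative log-density to sample from: a variant of the
    -- Rosenbrock function, a common MCMC test target.
    rosenbrock :: [Double] -> Double
    rosenbrock [x0, x1] = negate (5 * (x1 - x0 ^ 2) ^ 2 + 0.05 * (1 - x0) ^ 2)
    rosenbrock _        = error "rosenbrock: expected two dimensions"

    -- Interleave cheap Metropolis transitions with an occasional
    -- (expensive) Hamiltonian transition, as described above.
    transition = frequency [
        (9, metropolis 1.0)
      , (1, hamiltonian 0.05 20)
      ]

    -- Trace the composite chain for 10000 iterations from [0, 0],
    -- printing each visited position to stdout.
    main :: IO ()
    main = withSystemRandom . asGenIO $
      mcmc 10000 [0, 0] transition rosenbrock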

Versions: 0.1.0.0, 0.1.0.1, 0.2.1, 0.2.2, 0.2.3, 0.3.3, 0.3.4, 0.4.0, 0.5.0, 0.5.1, 0.5.2, 0.5.3, 0.5.4

Dependencies: base (<5), hasty-hamiltonian (>=1.1.1), lens (==4.*), mcmc-types (>=1.0.1), mighty-metropolis (>=1.0.1), mwc-probability (>=1.0.1), pipes (==4.*), primitive, speedy-slice (>=0.1.2), transformers

License: MIT

Author: Jared Tobin <jared@jtobin.ca>

Category: Math

Home page: http://github.com/jtobin/declarative

Source repository: head: git clone http://github.com/jtobin/declarative.git

Uploaded: by JaredTobin at 2015-10-09T11:27:58Z

Distributions: LTSHaskell:0.5.2, NixOS:0.5.4, Stackage:0.5.4

Downloads: 6603 total (8 in the last 30 days)
