mdp-0.1.1.0: Tools for solving Markov Decision Processes.

Algorithms.MDP.Examples.Ex_3_2

Description

The problem described in Example 3.2 of Bertsekas, p. 210.

Synopsis

mdp :: MDP State Control Double

The MDP representing the problem.
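A minimal usage sketch. Note that the solver module and function below (Algorithms.MDP.ValueIteration, valueIteration) are assumptions for illustration, not confirmed parts of this package's API; check the package's actual solver modules before use.

```haskell
-- Sketch: load the example MDP and hand it to a solver.
import Algorithms.MDP.Examples.Ex_3_2 (mdp)
import Algorithms.MDP.ValueIteration (valueIteration)  -- assumed module/function

main :: IO ()
main = do
  -- Assumed behavior: valueIteration produces a (lazy) sequence of
  -- successively better cost-function approximations for the MDP.
  let iterates = valueIteration mdp
  -- Inspect an early iterate; requires a Show instance on the
  -- solver's result type (also an assumption here).
  print (iterates !! 10)
```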