local-search-0.0.5: A first attempt at generalised local search within Haskell, for applications in combinatorial optimisation.

Safe Haskell: Safe-Inferred

Control.Search.Local

Description

We capture the pattern of meta-heuristics and local search as a process, or stream, of evolving solutions. These combinators provide a way to describe and manipulate these processes quickly. The basic pattern of their use is this:

 loopP (strategy) seed 

The strategy itself is a stream transformer. The transformer becomes a search strategy when its output is fed back into its input, which is the action of the loopP function. For example, the following is not a search strategy, but you could write:

 loopP (map (+1)) 0

This would generate the stream [0,1,2... A real search strategy then looks like:

 loopP iterativeImprover tspSeed

Many search strategies do not always produce improving sequences in the way the iterative improver does. For these we provide a simple modification of scanl, called bestSoFar, which can be applied to any stream. Finally, these streams are usually descriptions of unlimited processes. To make them practical we limit them using standard Haskell combinators such as take and list indexing.

 take 20 . bestSoFar $ loopP searchStrategy seed

Search strategies are constructed via the composition of other functions. This often resembles the composition of an arrow pipeline, and the library could be rewritten in terms of arrows; however, we have found no significant advantage in doing so.

A simple TABU-like search strategy, which has a memory of the recent past of the search process (10 elements) and filters neighbourhoods accordingly, can be created like this:

 searchStrategy xs = map head $ lift filter (flip notElem) (window 10 xs) (neighbourhoods xs)

A common way to improve meta-heuristics is to introduce stochastic elements, such as random decisions from a constrained set of choices, or neighbourhoods which will not generate exactly the same set of options each time a particular solution is visited. Stream transformations allow this because they can thread additional state internally, without exposing the user of the transformation to a great deal of the process. For example, to make a random choice from the constrained set at each point in the strategy above, you would write:

 searchStrategy rs xs = zipWith randomChoice rs $ lift filter (flip notElem) (window 10 xs) (neighbourhoods xs)

The neighbourhood can be similarly modified. We must still provide the starting points for the extra data used by such transformers, in this case a stream of random values, or in other cases a random number generator, but once provided it is hidden, and the transformer can be composed with any other transformation.
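
For instance, a stochastic neighbourhood might sample a few neighbours per step rather than enumerating them all. The following is a hypothetical sketch only, where neighbourhoods is a user-supplied deterministic expansion and rs a stream of random indices (chunk is documented below):

 randomisedNeighbourhoods rs xs = zipWith sample (chunk 3 rs) (neighbourhoods xs)
   where sample is ns = map (\i -> ns !! (i `mod` length ns)) is  -- assumes non-empty neighbourhoods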

Using the same transformation, one which threads an internal state, in several places is harder. It involves merging and dividing streams in sequenced patterns. For example:

 applyToBoth tr as bs = (\xs->(map head xs,map last xs)) . chunk 2 $ tr (concat $ transpose [as,bs])
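
As a hypothetical worked example (assuming chunk groups the merged stream into consecutive, non-overlapping pairs), a transformer that threads a simple counter shows how the internal state is shared between the two streams:

 -- the counter [0..] is threaded across both streams, alternating between them:
 applyToBoth (zipWith (+) [0..]) [10,20,30] [100,200,300]
   == ([10,22,34],[101,203,305])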

Synopsis

Types

type StreamT s = [s] -> [s]

The basic stream transformation type. This converts the elements of a stream into (usually different) elements of the same type.

type ExpandT s = [s] -> [[s]]

Many processes in meta-heuristics will create sets of options (e.g. neighbourhood functions) or collect sets of information about streams (e.g. window). This is the data type for these functions.

type ContraT s = [[s]] -> [s]

Choices and selections from larger sets of elements are modelled as these contractions, for example the selection of an element from a neighbourhood, or rather a stream of neighbourhoods.
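
For orientation, here are some minimal illustrative (non-library) values inhabiting each of these types:

 increment :: StreamT Int
 increment = map (+1)                      -- rewrite each element of the stream

 neighboursOf :: ExpandT Int
 neighboursOf = map (\x -> [x - 1, x + 1]) -- one set of options per element

 pickFirst :: ContraT Int
 pickFirst = map head                      -- one choice per set of options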

Generic Combinators

lift :: (t -> b -> c) -> (a -> t) -> [a] -> [b] -> [c]

lift is a lifting function, originally designed to lift filters and partitions to operate over interrelated streams of data. An example of use is:

 lift filter (<) as bs 
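
One plausible reading of this type (an assumption, not necessarily the library's definition) is a zipWith over the two streams, so that the example above filters each collection in the second stream against the corresponding element of the first:

 liftSketch :: (t -> b -> c) -> (a -> t) -> [a] -> [b] -> [c]
 liftSketch f g = zipWith (f . g)

 -- e.g. keep only the values strictly greater than the corresponding first-stream element:
 -- liftSketch filter (<) [2,5] [[1..6],[1..6]] == [[3,4,5,6],[6]]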

bestSoFar :: Ord s => StreamT s

A transformer that is usually used as a final step in a process, to allow the user to see only the best solution so far at each point and ignore the intermediate values that a strategy may produce.
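
The idea can be sketched as a running-best scan; assuming minimisation (the library may equally maximise), roughly:

 -- a sketch only, not necessarily the library's definition
 bestSoFarSketch :: Ord s => [s] -> [s]
 bestSoFarSketch = scanl1 min

 -- bestSoFarSketch [5,3,7,2,4] == [5,3,3,2,2]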

chunk :: Int -> ExpandT s

Breaks down a stream into a series of chunks. This frequently finds use in preparing sets of random numbers for various functions, but also in the makePop function that is important for genetic algorithms.
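
For example, assuming the obvious grouping into consecutive, non-overlapping chunks:

 chunk 3 [1..9] == [[1,2,3],[4,5,6],[7,8,9]]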

window :: Int -> ExpandT s

Creates a rolling window over a stream of values up to the size parameter. The windows are then produced as a stream of lists.

until_ :: [a] -> [Bool] -> [[a]] -> [a]

A way to link one stream with another at a joining point. The first argument is the initial stream; the second is a stream of boolean triggers; the third is a stream of possible continuation streams, one for each position of the trigger stream. The output follows the initial stream until a trigger fires, at which point it switches to the corresponding continuation stream.

This allows for the creation of cyclical restarting cooling strategies in simulated annealing, using a code snippet like this:

 let triggers = map (==0) . zipWith (-) (tail sols) $ sols
     restart basicS cs = until_ basicS cs $ map (restart basicS) (tails cs)
     coolStrat = restart (geoCooling 80000 0.5) triggers
 in ....

divide :: [Bool] -> ExpandT s

Splits a stream into two parts. The ExpandT result type is a little misleading here: the output is in fact a two-element collection of streams, the first being the part of the original stream that matched False in the boolean stream, the second the part that matched True.

Any method can be used to create the boolean stream, e.g.

 cycle [True,False]
 map (<0.75) (randoms g) 
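
Given the description above (False elements first, then True), one would expect, for example:

 divide (cycle [True,False]) [1..6] == [[2,4,6],[1,3,5]]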

join :: [Bool] -> ContraT s

Integrates a collection of streams. The boolean stream indicates the order of integration: a True in the boolean stream will cause an element to be taken from the second stream, a False will cause one to be taken from the first.
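
For example, assuming the elements are consumed positionally as described:

 join (cycle [False,True]) [[1,3,5],[2,4,6]] == [1,2,3,4,5,6]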

nest :: [Bool] -> StreamT s -> StreamT s

A nesting procedure. This can nest one transformer into another. The boolean stream provides the pattern for where the parametrising transformer should be run, with True being the indicator. It is constructed using divide and join.

A simple example of this in operation is the following: we will have a stream of integers, incrementing by one each time, and every so often we will increment by an additional 2.

 take 20 $ loopP (nest (concat $ repeat [False,False,True,False]) (map (+2)) . map (+1)) 0 

This is primarily used in the genetic algorithm system for creating mutation transformers.

There is also a known problem with the nesting function: if the -O2 flag is not used during compilation, it causes a memory leak, for reasons currently unknown.

nestWithProb :: (Ord r, Floating r) => [r] -> r -> StreamT s -> StreamT s

A specialisation of the nesting routine, which takes a stream of random values and a proportion/probability. These are used to construct the stream of booleans, with approximately that proportion set to True.
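
A usage sketch, assuming the random values should lie between 0 and 1 and using randomRs from System.Random to supply them; the nested transformer then runs on roughly a quarter of the elements:

 nestWithProb (randomRs (0,1) g) 0.25 (map (+2))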

makePop :: Ord s => Int -> ExpandT s

Takes an input stream, breaks it down into chunks, then sorts these chunks and stretches them to give a stream of populations for processing in a genetic algorithm.

Loop Combinators

loopP :: StreamT s -> s -> [s]

A more specific version of loopS, implemented in terms of it. Rather than allowing a number of initial values, this allows only one.
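
Given this description, the expected relationship between the two (an inference from the documentation rather than a stated law) is:

 loopP tr seed == loopS tr [seed]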

loopS :: StreamT s -> StreamT s

The standard function for tying the knot on a stream-described process. This links the outputs of the stream process to the inputs, with an initial set of values, and provides a single stream of values to the user.

Filters & Choices

improvement :: Ord s => ExpandT s -> ExpandT s

A lifted filter over interrelated streams, currently only used as part of the iterative improvers.

varyWindow :: [Int] -> StreamT [s]

A specialist function used as part of a TABU variant called robust taboo search. It is expected that the integer stream provided is an appropriately ranged random stream, so that the TABU list can be limited at each step by a random value. The version in the paper takes a range and a random number generator; to avoid the import of System.Random, this takes the stream of values to vary the window by. To implement the former:

 varyWindow (randomRs range g) 

tabuFilter :: Ord s => [[s]] -> ExpandT s -> ExpandT s

A filter, similar in some ways to an improvement filter but more complex, carrying out the common TABU rules. The first parameter is the stream of TABU lists; the second is the neighbourhood expansion, which is applied to the stream of source solutions to produce the neighbourhoods being filtered.
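
A usage sketch in the style of the TABU strategy from the introduction, where neighbourhoods is again an assumed user-supplied neighbourhood function:

 searchStrategy xs = map head $ tabuFilter (window 10 xs) neighbourhoods xs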

saChoose :: (Floating v, Ord v) => (s -> v) -> v -> v -> s -> s -> s

The traditional choice function used within simulated annealing. The parameters are: a function yielding the quality of a solution, a value between 0 and 1 (expected to be stochastic), a temperature, the old solution and the possible future solution.
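
As a hedged sketch (not the library's prescribed usage), saChoose can be zipped across a stream process using zipWith4 from Data.List; quality, rs and proposedMoves are hypothetical user-supplied names:

 -- quality :: s -> v, rs :: [v] (values in [0,1)) and proposedMoves :: StreamT s are assumed
 saStep rs xs = zipWith4 (saChoose quality) rs (geoCooling 10000 0.99) xs (proposedMoves xs)
 -- used as, for example:  loopP (saStep rs) seed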

gaSelect :: Ord r => Int -> [r] -> [r] -> StreamT [s]

This can be considered the standard genetic algorithm selection process, though it is still quite parametrisable. It takes a size (the number of elements selected for each recombination), a distribution for the selection process to use, and a stream of random numbers to control the selections.

The most basic version would select 2 parents using a geometric selection curve, like this:

 gaSelect 2 (iterate (*1.005) 0.005) rs

However, there is no prescription on the distribution or the number of parents, e.g.

 gaSelect 3 (uniform 0.1) rs

Though I do not provide a uniform function at present, this example is intended to suggest three parents selected using a uniform distribution.

manySelect :: Int -> ContraT s -> StreamT [s]

This was originally created to assist with making multiple selections from a population within a genetic algorithm. More generally, this is a function which operates over streams of collections (lists). It takes a size and a contraction operation, applies the contraction that number of times to each collection, gathers the results and releases a new collection.

If the contraction stream operation has internal state, such as a stochastic factor, this will be used correctly: each collection will not have the same elements selected, nor will the same element be selected repeatedly from a single collection.
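
A plausible composition with streamSelect (documented below), assuming rs is a stream of random values: selecting two elements from each population using a geometric distribution.

 pickParents rs = manySelect 2 (streamSelect (iterate (*1.005) 0.005) rs)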

select :: Ord r => [r] -> r -> [s] -> s

The basic selection function, not in stream form. This takes a distribution, a random number and a collection of elements to select from, and gives back a single value selected from the collection.

streamSelect :: Ord r => [r] -> [r] -> ContraT s

The lifting of select to operate over streams of values, rather than making a single selection. Provides a stream contraction operation. The first parameter is the distribution, the second a stream of random values.

Distributions

logCooling :: Floating b => b -> b -> [b]

A logarithmic cooling strategy intended for use within simulated annealing. Broadly the first value is the starting temperature and the second a value between 0 and 1.

geoCooling :: Floating b => b -> b -> [b]

The most common cooling strategy for simulated annealing, geometric. The first value is the starting temperature, the second a value between 0 and 1, the cooling rate.

linCooling :: Floating b => b -> b -> [b]

Included for completeness, this is a cooling strategy for simulated annealing that is usually not very effective: a linearly changing strategy. The first value is the starting temperature; the second is the value to increase it by at each step. In order to have it reduce at each step, pass a negative value.
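
For illustration, assuming geometric cooling multiplies by the rate at each step and linear cooling adds the step value (the exact first element is the library's choice):

 take 5 (geoCooling 1000 0.5)    -- e.g. [1000.0,500.0,250.0,125.0,62.5]
 take 5 (linCooling 1000 (-10))  -- e.g. [1000.0,990.0,980.0,970.0,960.0]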

Complex GA distributions, experimental

geometricDistribution :: Num n => n -> n -> [n]

distributionSelectWithRemainder :: Ord r => [r] -> r -> [s] -> (s, [s])