tokenify-0.1.2.0: A regex lexer
Text.Tokenify
Description
Tokenify is a module for generating a tokenizer from a regex-based grammar
Synopsis
tokenize :: CharSeq s => Tokenizer s a -> s -> Either String (Seq a)
tokenize transforms a CharSeq into a sequence of tokens, or returns an error message if no rule matches the input
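The Tokenizer construction API is not shown in this synopsis, so the following is only a minimal, self-contained sketch of the driver loop that tokenize implies: try each rule at the front of the input, emit a token, and recurse on the rest. Rule, lexOne, and tokenizeWith are illustrative names, not part of tokenify, and plain String stands in for CharSeq.

```haskell
import Data.Char (isDigit, isSpace)

-- Illustrative stand-in for a tokenizer rule: a character class
-- plus a function that turns the matched lexeme into a token.
data Rule a = Rule (Char -> Bool) (String -> a)

-- Try the rules in order at the front of the input; the first rule
-- whose class matches consumes the longest run of matching characters.
lexOne :: [Rule a] -> String -> Either String (a, String)
lexOne _ [] = Left "unexpected end of input"
lexOne rules s@(c:_) =
  case [ (mk lexeme, rest)
       | Rule p mk <- rules
       , p c
       , let (lexeme, rest) = span p s ] of
    ((tok, rest):_) -> Right (tok, rest)
    []              -> Left ("no rule matches: " ++ take 10 s)

-- The driver loop behind a tokenize-style function: apply lexOne
-- until the input is exhausted, failing with the first error.
tokenizeWith :: [Rule a] -> String -> Either String [a]
tokenizeWith _     [] = Right []
tokenizeWith rules s  = do
  (tok, rest) <- lexOne rules s
  toks        <- tokenizeWith rules rest
  Right (tok : toks)
```

With rules such as `[Rule isDigit id, Rule isSpace (const " ")]`, `tokenizeWith rules "12 34"` yields `Right ["12", " ", "34"]`; the Either String result mirrors tokenize's failure mode when no rule applies.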
matchHead :: CharSeq s => Regex s -> s -> Maybe (s, s, Int)
Attempts to match the front of a CharSeq with a Regex; if successful, it returns a tuple containing the matched prefix, the remaining input, and the length of the match
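The library's Regex type is not shown here, but the (prefix, rest, length) contract of matchHead can be illustrated with a tiny hand-rolled regex over String. Lit, Class, Cat, and Star are hypothetical constructors invented for this sketch; the real Regex s is richer.

```haskell
import Control.Monad (foldM)

-- A tiny illustrative regex over String (not the library's Regex s).
data Regex
  = Lit Char             -- one exact character
  | Class (Char -> Bool) -- one character satisfying a predicate
  | Cat [Regex]          -- concatenation
  | Star Regex           -- zero or more repetitions, greedy

-- Attempt to match the front of the input. On success, return the
-- matched prefix, the remaining input, and the length of the match.
matchHead :: Regex -> String -> Maybe (String, String, Int)
matchHead r s = do
  rest <- go r s
  let n = length s - length rest
  Just (take n s, rest, n)
  where
    go (Lit c)   (x:xs) | x == c = Just xs
    go (Class p) (x:xs) | p x    = Just xs
    go (Cat rs)  inp             = foldM (flip go) inp rs
    go (Star r') inp             =
      case go r' inp of
        -- stop if the inner regex made no progress (empty match)
        Just inp' | length inp' < length inp -> go (Star r') inp'
        _                                    -> Just inp
    go _ _ = Nothing
```

For example, matching `Cat [Star (Class isDigit), Lit 'x']` against `"123xyz"` gives `Just ("123x", "yz", 4)`. This sketch is greedy and does no backtracking, which is enough to illustrate the triple's meaning but not a full regex engine.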