tokenify-0.1.2.0: A regex lexer

Safe Haskell: None
Language: Haskell2010

Text.Tokenify

Description

Tokenify is a module for generating a tokenizer from a regex-based grammar.

Documentation

tokenize :: CharSeq s => Tokenizer s a -> s -> Either String (Seq a)

tokenize transforms a CharSeq into a sequence of tokens, or returns an error message (the Left case) if the input cannot be tokenized.
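
For illustration, here is a minimal, self-contained sketch of the kind of loop tokenize performs, written over plain String rather than the library's CharSeq. The Rule type, the Tok tokens, and tokenizeSketch are hypothetical stand-ins, not part of tokenify's API; only the Either String (Seq a) result shape comes from the signature above.

import qualified Data.Sequence as Seq
import Data.Sequence (Seq, (<|))
import Data.Char (isDigit, isSpace)

-- Hypothetical stand-in for one grammar rule: given the input, either
-- fail or produce a token together with the remaining input.
type Rule a = String -> Maybe (a, String)

data Tok = TNum String | TWord String deriving Show

-- Two toy rules: a run of digits, and a run of non-digit, non-space characters.
rules :: [Rule Tok]
rules =
  [ \s -> case span isDigit s of
            ("", _)   -> Nothing
            (n, rest) -> Just (TNum n, rest)
  , \s -> case span (\c -> not (isDigit c || isSpace c)) s of
            ("", _)   -> Nothing
            (w, rest) -> Just (TWord w, rest)
  ]

-- Sketch of the tokenize loop: try each rule at the head of the input,
-- emit the token the first matching rule produces, recurse on the rest.
tokenizeSketch :: [Rule a] -> String -> Either String (Seq a)
tokenizeSketch _ "" = Right Seq.empty
tokenizeSketch rs input@(c:cs)
  | isSpace c = tokenizeSketch rs cs   -- skip whitespace between tokens
  | otherwise = case firstMatch rs of
      Nothing          -> Left ("no rule matches at: " ++ take 10 input)
      Just (tok, rest) -> (tok <|) <$> tokenizeSketch rs rest
  where
    firstMatch []       = Nothing
    firstMatch (r:more) = maybe (firstMatch more) Just (r input)

For example, tokenizeSketch rules "foo 42 bar" evaluates to Right (fromList [TWord "foo", TNum "42", TWord "bar"]).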

matchHead :: CharSeq s => Regex s -> s -> Maybe (s, s, Int)

Attempts to match the front of a CharSeq against a Regex. If successful, it returns a tuple containing (as illustrated in the sketch after this list):

  • The matched CharSeq
  • The remaining CharSeq
  • The number of characters consumed
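
To make that return shape concrete, the following self-contained sketch reimplements matchHead-style matching over String with a toy regex type. Toy and matchHeadSketch are illustrative inventions, not tokenify's Regex, but the Maybe (matched, remaining, count) result mirrors the contract documented above.

-- A toy regex type, purely for illustrating the return shape;
-- the library's Regex constructors may differ.
data Toy = Lit Char | Star Toy | Then Toy Toy

-- Attempt to match the front of a String, returning the matched prefix,
-- the remaining input, and the number of characters consumed.
matchHeadSketch :: Toy -> String -> Maybe (String, String, Int)
matchHeadSketch r s = do
  n <- go r s                    -- how many characters does r consume?
  let (matched, rest) = splitAt n s
  pure (matched, rest, n)
  where
    go (Lit c) (x:_) | x == c = Just 1
    go (Lit _) _              = Nothing
    go (Star inner) input     = Just (greedy inner input)
    go (Then a b) input       = do
      n1 <- go a input
      n2 <- go b (drop n1 input)
      pure (n1 + n2)
    -- Star matches greedily; the n > 0 guard prevents looping on
    -- zero-width matches.
    greedy inner input = case go inner input of
      Just n | n > 0 -> n + greedy inner (drop n input)
      _              -> 0

For example, matchHeadSketch (Then (Lit 'a') (Star (Lit 'b'))) "abbc" evaluates to Just ("abb", "c", 3).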