fastparser: A fast, but bare bones, bytestring parser combinators library.

[ bsd3, library, parsing ]

Please see README.md


Versions 0.3.0, 0.3.0.1, 0.3.1, 0.3.1.1
Change log CHANGELOG.md
Dependencies base (>=4.7 && <5), bytestring, bytestring-lexing (==0.5.*), containers (==0.5.*), kan-extensions (==5.*), microlens (==0.4.*), semigroups (==0.18.*), thyme (==0.3.*), transformers (>=0.4), vector-space (>=0.10 && <1) [details]
License BSD-3-Clause
Copyright Simon Marechal
Author Simon Marechal
Maintainer bartavelle@gmail.com
Category Parsing
Home page https://github.com/bartavelle/fastparser#readme
Source repo head: git clone https://github.com/bartavelle/fastparser
Uploaded by SimonMarechal at Sun Aug 12 07:02:55 UTC 2018
Distributions NixOS:0.3.1.1
Status Docs available
Last success reported on 2018-08-12



Readme for fastparser-0.3.1.1



A very simple, backtracking, fast parser combinator library.

It is measurably faster than attoparsec (about 36% in this use case), but it only works on strict ByteString, lacks many helper functions, and is not resumable. It should also consume a tiny bit less memory for equivalent operations.

When NOT to use fastparser

  • When performance is not the most pressing concern.
  • When you need to parse anything other than strict ByteString.
  • When you need a battle-tested library: while very simple, and in constant use by its author, this package is still quite experimental.
  • When you need to parse large inputs that are not easily cut into many smaller pieces that can be parsed independently.

How to use fastparser

fastparser works well with small pieces, such as individual log file lines. It is best used together with a streaming library (such as conduit or pipes), so that the input can be consumed incrementally and cut into individual records, each of which is then parsed independently.

One such setup, with the conduit ecosystem, would look like:

```haskell
sourceFile "/tmp/foo" .| Data.Conduit.Binary.lines .| CL.map (parseOnly parser) .| ...
```

Other than that, fastparser is fairly close to any other parser combinators library.
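For instance, defining a parser for a `name:count` log line can use the familiar Applicative combinator style. The sketch below is illustrative only: the module path `ByteString.Parser.Fast` and the `char`, `takeWhile1`, and `decimal` combinators are assumptions based on attoparsec-style naming (only `parseOnly` appears in this README), so check the package's Haddock documentation for the exact names.

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Illustrative sketch only: the module path and the char / takeWhile1 /
-- decimal combinators are assumed (attoparsec-style names); consult the
-- fastparser documentation for the real API.
import ByteString.Parser.Fast (Parser, parseOnly, char, takeWhile1, decimal)
import qualified Data.ByteString.Char8 as BS

-- A record parsed from a log line such as "requests:42".
data Counter = Counter
  { counterName  :: BS.ByteString
  , counterValue :: Int
  } deriving Show

-- Parse "name:count" with the usual Applicative combinators.
counter :: Parser Counter
counter = Counter <$> takeWhile1 (/= ':') <* char ':' <*> decimal

main :: IO ()
main = print (parseOnly counter "requests:42")
```

Because fastparser is backtracking and operates on a fully loaded strict ByteString, each such record should stay small; a streaming pipeline like the conduit example above keeps the overall input incremental.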