The csv-conduit package

[Tags: bsd3, library]

CSV files are the de facto standard in many situations involving data transfer, particularly when dealing with enterprise applications or disparate database systems.

While there are a number of CSV libraries in Haskell, at the time of this project's start in 2010 none of them covered all of the needs this library set out to address.

This library is an attempt to close those gaps. Please note that this library started its life based on the enumerator package and has recently been ported to work with conduit instead. In the process, it has been greatly simplified thanks to the modular nature of the conduit library.

Following the port to conduits, the library has also gained the ability to parameterize on the stream type and to work with both ByteString and Text.
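As a rough illustration of that flexibility (a minimal sketch; the function names below are invented for the example), the same intoCSV conduit can be instantiated at either stream type just by picking a type signature:

import           Control.Monad.Trans.Resource (ResourceT)
import qualified Data.ByteString as B
import           Data.Conduit
import           Data.CSV.Conduit
import           Data.Text (Text)

-- Parse a raw ByteString stream into rows of ByteStrings.
parseBytes :: Conduit B.ByteString (ResourceT IO) (Row B.ByteString)
parseBytes = intoCSV defCSVSettings

-- Parse an already-decoded Text stream into rows of Text.
parseText :: Conduit Text (ResourceT IO) (Row Text)
parseText = intoCSV defCSVSettings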

For more documentation and examples, check out the README at:

http://github.com/ozataman/csv-conduit



Properties

Versions: 0.1, 0.2, 0.2.1.1, 0.3, 0.3.0.1, 0.3.0.2, 0.3.0.3, 0.4.1, 0.5.0, 0.5.1, 0.6.2, 0.6.2.1, 0.6.3, 0.6.5, 0.6.6
Change log: None available
Dependencies: array, attoparsec (>=0.10), base (==4.*), blaze-builder, bytestring, conduit (>=1.0 && <2.0), conduit-extra, containers (>=0.3), csv-conduit, data-default, directory, ghc-prim (>=0.2 && <0.5), mmorph, monad-control, mtl, primitive, resourcet (>=1.1.2.1), text, transformers, unordered-containers, vector
License: BSD3
Author: Ozgun Ataman
Maintainer: Ozgun Ataman <ozataman@gmail.com>
Category: Data, Conduit, CSV, Text
Home page: http://github.com/ozataman/csv-conduit
Executables: bench
Uploaded: Sun Mar 22 16:41:09 UTC 2015 by OzgunAtaman
Distributions: Debian:0.6.6, LTSHaskell:0.6.6, NixOS:0.6.6, Stackage:0.6.6
Downloads: 3660 total (207 in last 30 days)
Votes: 0
Status: Docs available
Last success reported on 2015-03-22

Modules


Flags

Name: bench
Description: (none)
Default: Disabled
Type: Manual

Use -f <flag> to enable a flag, or -f -<flag> to disable that flag.


Readme for csv-conduit-0.6.6


CSV Files and Haskell

CSV files are the de facto standard in many cases of data transfer, particularly when dealing with enterprise applications or disparate database systems.

While there are a number of CSV libraries in Haskell, at the time of this project's start none of them covered all of the needs this library set out to address.

Over time, other capable CSV packages such as cassava have appeared. The major benefit of this library remains its conduit-based streaming interface and the constant-space operation that comes with it.

This package

csv-conduit is a conduit-based CSV parsing library that is easy to use, flexible, and fast. It leverages the conduit infrastructure to provide constant-space operation, which is quite critical in many real-world use cases.

For example, you can use http-conduit to download a CSV file from the internet and plug its Source into intoCSV to stream-convert the download into Row values and process them as the data streams in, that is, without having to download the entire file to disk first.
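A rough sketch of that idea follows. It assumes the http-conduit API of the same era (parseUrl, withManager and http), and the names countRemoteRows and countRows are invented for this example; the row count merely stands in for whatever per-row processing you actually need.

import           Data.Conduit
import qualified Data.Conduit.List as CL
import           Data.CSV.Conduit
import           Data.Text (Text)
import           Network.HTTP.Conduit

-- Stream an HTTP response body straight through the CSV parser and
-- fold the rows away one at a time, so memory use stays constant.
countRemoteRows :: String -> IO Int
countRemoteRows url = do
  req <- parseUrl url
  withManager $ \mgr -> do
    res <- http req mgr
    responseBody res $$+- intoCSV defCSVSettings =$ countRows
  where
    countRows :: Monad m => Sink (Row Text) m Int
    countRows = CL.fold (\n _ -> n + 1) 0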

Author & Contributors

Introduction

Speed

While fast operation is of concern, I have so far cared more about correct operation and a flexible API. Please let me know if you notice any performance regressions or optimization opportunities.

Usage Examples

Example #1: Basics Using Convenience API

{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.Trans.Resource (runResourceT)
import Data.Conduit
import Data.Conduit.Binary
import Data.Conduit.List as CL
import Data.CSV.Conduit
import Data.Text (Text)

-- Just reverse the columns
myProcessor :: Monad m => Conduit (Row Text) m (Row Text)
myProcessor = CL.map reverse

test :: IO ()
test = runResourceT $ 
  transformCSV defCSVSettings 
               (sourceFile "input.csv") 
               myProcessor
               (sinkFile "output.csv")
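Note that transformCSV wires the raw ByteString source, the row-level conduit, and the output sink into a single streaming pipeline, so input.csv is parsed, transformed, and written back out chunk by chunk rather than loaded into memory.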

Example #2: Basics Using Conduit API

{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.Trans.Resource (runResourceT)
import Data.Conduit
import Data.Conduit.Binary
import Data.CSV.Conduit
import Data.Text (Text)

-- A pass-through processor; swap in your own row transformation here.
myProcessor :: Monad m => Conduit (Row Text) m (Row Text)
myProcessor = awaitForever yield

-- Let's simply stream from a file, parse the CSV, reserialize it
-- and push back into another file.
test :: IO ()
test = runResourceT $ 
  sourceFile "test/BigFile.csv" $= 
  intoCSV defCSVSettings $=
  myProcessor $=
  fromCSV defCSVSettings $$
  sinkFile "test/BigFileOut.csv"
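Here $= fuses each stage into the next (file bytes into parsed rows, rows through myProcessor, rows back into bytes) and $$ connects the assembled pipeline to sinkFile, so the whole file streams through in constant space.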