csv-conduit-0.6.5 API documentation

Data.CSV.Conduit.Conversion.Internal (floating point formatting helpers)

FPFormat controls the rendering of floating point numbers:

- Generic: use decimal notation for values between 0.1 and 9,999,999, and scientific notation otherwise.
- Fixed: standard decimal notation.
- Exponent: scientific notation (e.g. 2.3e123).

floatToDigits takes a base and a non-negative RealFloat number, and returns a list of digits and an exponent. In particular, if x >= 0 and

    floatToDigits base x = ([d1, d2, ..., dn], e)

then

    n >= 1
    x = 0.d1d2...dn * (base ** e)
    0 <= di <= base - 1

i2d is an unsafe conversion for decimal digits.

Data.CSV.Conduit.Types

MapRow: a MapRow is a dictionary based on Data.Map where column names are keys and a row's individual cell values are the values of the Map.

Row: a Row is just a list of fields.

CSVSettings: settings for a CSV file. This library is intended to be flexible and offer a way to process the majority of text data files out there.

- csvSep: separator character to be used in between fields.
- csvQuoteChar: quote character that may sometimes be present around fields. If Nothing is given, the library will never expect quotation even if it is present.

defCSVSettings: default settings for a CSV file:

    csvSep       = ','
    csvQuoteChar = Just '"'
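Files that deviate from these defaults can be handled by overriding the two record fields above. A minimal sketch for a tab-separated, unquoted file, assuming the constructor and fields are exported as listed (the tsvSettings name is illustrative):

    import Data.CSV.Conduit.Types (CSVSettings (..), defCSVSettings)

    -- Tab-separated values with no quoting: override the two settings fields.
    tsvSettings :: CSVSettings
    tsvSettings = defCSVSettings { csvSep = '\t', csvQuoteChar = Nothing }

Such a value can then be passed anywhere a CSVSettings is expected, e.g. to intoCSV or readCSVFile below.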
Data.CSV.Conduit.Parser.Text

- parseCSV: try to parse the given string as CSV.
- parseRow: try to parse the given string as a 'Row Text'.
- csv: parse CSV.
- row: parse a CSV row.

Data.CSV.Conduit.Parser.ByteString

- parseCSV: try to parse the given string as CSV.
- parseRow: try to parse the given string as a 'Row ByteString'.
- csv: parse CSV.
- row: parse a CSV row.

Data.CSV.Conduit.Conversion
Copyright: Ozgun Ataman, Johan Tibell
License: BSD3
Maintainer: Ozgun Ataman <ozataman@gmail.com>
Stability: experimental

Parser: conversion of a field to a value might fail, e.g. if the field is malformed. This possibility is captured by the Parser type, which lets you compose several field conversions together in such a way that if any of them fail, the whole record conversion fails. (Internally, unParser takes a success continuation and a failure continuation.)

ToField: a type that can be converted to a single CSV field.

Example type and instance:

    {-# LANGUAGE OverloadedStrings #-}

    data Color = Red | Green | Blue

    instance ToField Color where
        toField Red   = "R"
        toField Green = "G"
        toField Blue  = "B"

FromField: a type that can be converted from a single CSV field, with the possibility of failure. When writing an instance, use empty, mzero, or fail to make a conversion fail, e.g. if a Field can't be converted to the given type.

Example type and instance:

    {-# LANGUAGE OverloadedStrings #-}

    data Color = Red | Green | Blue

    instance FromField Color where
        parseField s
            | s == "R"  = pure Red
            | s == "G"  = pure Green
            | s == "B"  = pure Blue
            | otherwise = mzero

ToNamedRecord: a type that can be converted to a single CSV record.

An example type and instance:

    data Person = Person { name :: !Text, age :: !Int }

    instance ToNamedRecord Person where
        toNamedRecord (Person name age) =
            namedRecord ["name" .= name, "age" .= age]

FromNamedRecord: a type that can be converted from a single CSV record, with the possibility of failure. When writing an instance, use empty, mzero, or fail to make a conversion fail, e.g. if a NamedRecord has the wrong number of columns.

Given this example data:

    name,age
    John,56
    Jane,55

here's an example type and instance:

    {-# LANGUAGE OverloadedStrings #-}

    data Person = Person { name :: !Text, age :: !Int }

    instance FromNamedRecord Person where
        parseNamedRecord m = Person <$> m .: "name" <*> m .: "age"

Note the use of the OverloadedStrings language extension, which enables ByteString values to be written as string literals.

ToRecord: a type that can be converted to a single CSV record.

An example type and instance:

    data Person = Person { name :: !Text, age :: !Int }

    instance ToRecord Person where
        toRecord (Person name age) = record [toField name, toField age]

Outputs data of this form:

    John,56
    Jane,55

Only: Haskell lacks a single-element tuple type, so if you have CSV data with just one column you can use the Only type to represent a single-column result.

FromRecord: a type that can be converted from a single CSV record, with the possibility of failure. When writing an instance, use empty, mzero, or fail to make a conversion fail, e.g. if a Record has the wrong number of columns.

Given this example data:

    John,56
    Jane,55

here's an example type and instance:

    data Person = Person { name :: !Text, age :: !Int }

    instance FromRecord Person where
        parseRecord v
            | length v == 2 = Person <$> v .! 0 <*> v .! 1
            | otherwise     = mzero

Field: a single field within a record.

Record: a record corresponds to a single line in a CSV file.

Named: a wrapper around custom Haskell types that can directly be converted/parsed from an incoming CSV stream. We define this wrapper to stop GHC from complaining about overlapping instances. Just use getNamed to get your object out of the wrapper.

NamedRecord: a shorthand for the ByteString case of MapRow.

index: retrieve the n-th field in the given record. The result is empty if the value cannot be converted to the desired type. Raises an exception if the index is out of bounds. index is a simple convenience function that is equivalent to parseField (v ! idx). If you're certain that the index is not out of bounds, using unsafeIndex is somewhat faster.

(.!): alias for index.

unsafeIndex: like index but without bounds checking.

lookup: retrieve a field in the given record by name. The result is empty if the field is missing or if the value cannot be converted to the desired type.

(.:): alias for lookup.

namedField: construct a pair from a name and a value. For use with namedRecord.

(.=): alias for namedField.

record: construct a record from a list of Fields. Use toField to convert values into Fields for use with record.

namedRecord: construct a named record from a list of name-value pairs. Use .= to construct such a pair from a name and a value.

runParser: run a Parser, returning either Left errMsg or Right result. Forces the value in the Left or Right constructor to weak head normal form. You most likely won't need to use this function directly, but it's included for completeness.
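To see how these pieces fit together, here is a minimal sketch that runs a record conversion by hand, reusing the Person type and FromRecord instance from the example above. The sample values are illustrative, and the Either String result type is assumed from the runParser description above:

    {-# LANGUAGE OverloadedStrings #-}

    import Data.CSV.Conduit.Conversion (parseRecord, record, runParser)

    -- Assumes the Person type (with its FromRecord instance) from the example above.
    demo :: IO ()
    demo = do
        -- A well-formed two-column record converts successfully.
        case (runParser (parseRecord (record ["John", "56"])) :: Either String Person) of
            Left err -> putStrLn ("conversion failed: " ++ err)
            Right p  -> print (age p)            -- prints 56
        -- A record with the wrong number of columns makes the whole conversion fail.
        case (runParser (parseRecord (record ["John"])) :: Either String Person) of
            Left err -> putStrLn ("conversion failed: " ++ err)
            Right _  -> putStrLn "unexpected success"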
Instance notes for the provided ToField/FromField instances:

- Char, String, Text, and lazy Text: encoding uses UTF-8; parsing assumes UTF-8 encoding and fails on invalid byte sequences.
- Word, Word8, Word16, Word32, Word64: uses decimal encoding; accepts an unsigned decimal number.
- Int, Int8, Int16, Int32, Int64, Integer: uses decimal encoding with optional sign; accepts a signed decimal number.
- Float, Double: uses decimal notation or scientific notation, depending on the number; accepts the same syntax as a rational.
- (): parsing ignores the Field and always succeeds.
- Maybe a: Nothing is encoded as an empty field; parsing yields Nothing if the Field is empty, and Just otherwise.

Data.CSV.Conduit

CSV: represents types r that are CSV-like and can be converted to/from an underlying stream of type s. There is nothing scary about the type:

- s represents stream types that can be converted to/from CSV rows. Examples are ByteString, Text and String.
- r represents the target CSV row representations that this library can work with. Examples are the Row types, the Record type and the MapRow family of types. We can also convert directly to complex Haskell types using the Data.CSV.Conduit.Conversion module that was borrowed from the cassava package, which was itself inspired by the aeson package.

Example #1: Basics using the convenience API

    import Data.Conduit
    import Data.Conduit.Binary
    import Data.Conduit.List as CL
    import Data.CSV.Conduit

    myProcessor :: Conduit (Row Text) m (Row Text)
    myProcessor = CL.map reverse

    test = runResourceT $
      transformCSV defCSVSettings
                   (sourceFile "input.csv")
                   myProcessor
                   (sinkFile "output.csv")

Example #2: Basics using the conduit API

    import Data.Conduit
    import Data.Conduit.Binary
    import Data.CSV.Conduit

    myProcessor :: Conduit (MapRow Text) m (MapRow Text)
    myProcessor = undefined

    test = runResourceT $
      sourceFile "test/BigFile.csv" $=
      intoCSV defCSVSettings $=
      myProcessor $=
      (writeHeaders defCSVSettings >> fromCSV defCSVSettings) $$
      sinkFile "test/BigFileOut.csv"

rowToStr: convert a CSV row into its strict ByteString equivalent.

intoCSV: turn a stream of s into a stream of the CSV row type. An example would be parsing a ByteString stream as rows of MapRow Text.

fromCSV: turn a stream of the CSV row type back into a stream of s. An example would be rendering a stream of Row ByteString rows as Text.

writeHeaders: write the headers AND the row into the output stream, once. If you don't call this while using the MapRow family of row types, your resulting output will NOT have any headers in it. Usage: just chain this using the Monad instance in your pipeline:

    ... =$= writeHeaders settings >> fromCSV settings $$ sinkFile "..."

readCSVFile: read the entire contents of a CSV file into memory.

    readCSVFile :: (GV.Vector v a, CSV ByteString a)
                => CSVSettings -- ^ Settings to use in deciphering stream
                -> FilePath    -- ^ Input file
                -> IO (v a)

decodeCSV: a simple way to decode a CSV string. Don't be alarmed by the polymorphic nature of the signature: s is the type for the string and v is a kind of Vector here. For example, for ByteString

    s <- LB.readFile "my.csv"
    decodeCSV def s :: Vector (Vector ByteString)

will just work.
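As a concrete whole-file variant, reading a headered CSV into memory with readCSVFile might look like the following minimal sketch; the file name is illustrative, and MapRow ByteString is just one row type permitted by the CSV ByteString instances listed below:

    import Data.ByteString (ByteString)
    import Data.CSV.Conduit
    import Data.Vector (Vector)
    import qualified Data.Vector as V

    -- Read every row of "input.csv" into a boxed Vector of header-keyed rows.
    loadRows :: IO (Vector (MapRow ByteString))
    loadRows = readCSVFile defCSVSettings "input.csv"

    main :: IO ()
    main = do
        rows <- loadRows
        putStrLn ("parsed " ++ show (V.length rows) ++ " rows")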
writeCSVFile: write CSV data into a file. As we use a ByteString sink, you'll need to get your data into a ByteString stream type. Arguments: the CSV settings, the target file, write vs. append mode, and the list of rows.

mapCSVFile: map over the rows of a CSV file. Provided for convenience for historical reasons. An easy way to run this function would be runResourceT after feeding it all the arguments. Arguments: the settings to use for both input and output, a mapping function, the input file, and the output file.

transformCSV: like transformCSV' but uses the same settings for both input and output. Arguments: the settings to be used for both input and output, a raw stream data source (e.g. sourceFile inFile), a transforming conduit, and a raw stream data sink (e.g. sinkFile outFile).

transformCSV': general purpose CSV transformer. Apply a list-like processing function from Data.Conduit.List to the rows of a CSV stream. You need to provide a stream data source, a transformer and a stream data sink. An easy way to run this function would be runResourceT after feeding it all the arguments. Arguments: the settings to be used for input, the settings to be used for output, a raw stream data source (e.g. sourceFile inFile), a transforming conduit, and a raw stream data sink (e.g. sinkFile outFile).

Example - map a function over the rows of a CSV file:

    transformCSV' setIn setOut (sourceFile inFile) (C.map f) (sinkFile outFile)

sinkVector: an efficient sink that incrementally grows a vector from the input stream.

Provided CSV instances:

- Conversion of a stream directly to/from a custom complex Haskell type, via the Named wrapper and its FromNamedRecord/ToNamedRecord instances.
- A generic MapRow instance: any stream type with a Row instance automatically gets a MapRow instance.
- Support for parsing rows in the Vector form.
- Row instances for ByteString and Text streams, including convenience instances layered on top of the ByteString operations; please note that these carry lots of unnecessary overhead and are included for convenience.
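For completeness, a minimal writeCSVFile sketch, under the assumption (taken from the argument list above) that it accepts the settings, a target path, a System.IO IOMode for write vs. append, and a list of rows, and can be run directly in IO; the file name and rows are illustrative:

    {-# LANGUAGE OverloadedStrings #-}

    import Data.ByteString (ByteString)
    import Data.CSV.Conduit
    import System.IO (IOMode (WriteMode))

    -- Plain rows (lists of fields); the first row serves as the header line.
    people :: [Row ByteString]
    people = [["name", "age"], ["John", "56"], ["Jane", "55"]]

    main :: IO ()
    main = writeCSVFile defCSVSettings "people.csv" WriteMode people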