csv-conduit
Authors: Ozgun Ataman, Johan Tibell
License: BSD3
Maintainer: Ozgun Ataman
Stability: experimental

Data.CSV.Conduit.Conversion

Parser
  Conversion of a field to a value might fail, e.g. if the field is malformed. This possibility is captured by the Parser type, which lets you compose several field conversions together in such a way that if any of them fail, the whole record conversion fails.

ToField
  A type that can be converted to a single CSV field.

  Example type and instance:

    {-# LANGUAGE OverloadedStrings #-}

    data Color = Red | Green | Blue

    instance ToField Color where
      toField Red   = "R"
      toField Green = "G"
      toField Blue  = "B"

FromField
  A type that can be converted from a single CSV field, with the possibility of failure.

  When writing an instance, use empty, mzero, or fail to make a conversion fail, e.g. if a Field can't be converted to the given type.

  Example type and instance:

    {-# LANGUAGE OverloadedStrings #-}

    data Color = Red | Green | Blue

    instance FromField Color where
      parseField s
        | s == "R"  = pure Red
        | s == "G"  = pure Green
        | s == "B"  = pure Blue
        | otherwise = mzero

ToNamedRecord
  A type that can be converted to a single CSV record.

  An example type and instance:

    data Person = Person { name :: !Text, age :: !Int }

    instance ToNamedRecord Person where
      toNamedRecord (Person name age) =
        namedRecord ["name" .= name, "age" .= age]

FromNamedRecord
  A type that can be converted from a single CSV record, with the possibility of failure.

  When writing an instance, use empty, mzero, or fail to make a conversion fail, e.g. if a record has the wrong number of columns.

  Given this example data:

    name,age
    John,56
    Jane,55

  here's an example type and instance:

    {-# LANGUAGE OverloadedStrings #-}

    data Person = Person { name :: !Text, age :: !Int }

    instance FromNamedRecord Person where
      parseNamedRecord m = Person <$> m .: "name" <*> m .: "age"

  Note the use of the OverloadedStrings language extension, which enables ByteString values to be written as string literals.

ToRecord
  A type that can be converted to a single CSV record.

  An example type and instance:

    data Person = Person { name :: !Text, age :: !Int }

    instance ToRecord Person where
      toRecord (Person name age) = record [toField name, toField age]

  Outputs data of this form:

    John,56
    Jane,55

Only
  Haskell lacks a single-element tuple type, so if your CSV data has just one column you can use the Only type to represent a single-column result.

FromRecord
  A type that can be converted from a single CSV record, with the possibility of failure.

  When writing an instance, use empty, mzero, or fail to make a conversion fail, e.g. if a Record has the wrong number of columns.

  Given this example data:

    John,56
    Jane,55

  here's an example type and instance:

    data Person = Person { name :: !Text, age :: !Int }

    instance FromRecord Person where
      parseRecord v
        | length v == 2 = Person <$> v .! 0 <*> v .! 1
        | otherwise     = mzero

Field
  A single field within a record.

Record
  A record corresponds to a single line in a CSV file.

Named
  A wrapper around custom Haskell types that can directly be converted/parsed from an incoming CSV stream. We define this wrapper to stop GHC from complaining about overlapping instances. Just use getNamed to get your object out of the wrapper.

NamedRecord
  A shorthand for the ByteString case of MapRow.
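As an illustration of how a custom FromField instance composes with FromRecord through the Parser type, here is a minimal sketch. The Color and Pixel types are made up for this example; it relies only on the class methods and the .! operator documented in this module:

    {-# LANGUAGE OverloadedStrings #-}

    import Control.Monad (mzero)
    import Data.CSV.Conduit.Conversion

    data Color = Red | Green | Blue

    -- A field conversion that can fail: anything other than R, G or B
    -- makes the conversion fail via mzero.
    instance FromField Color where
      parseField "R" = pure Red
      parseField "G" = pure Green
      parseField "B" = pure Blue
      parseField _   = mzero

    data Pixel = Pixel { px :: !Int, py :: !Int, colour :: !Color }

    -- Field conversions compose inside the Parser type; if any field
    -- fails, the whole Pixel conversion fails.
    instance FromRecord Pixel where
      parseRecord v
        | length v == 3 = Pixel <$> v .! 0 <*> v .! 1 <*> v .! 2
        | otherwise     = mzero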
index
  Retrieve the n-th field in the given record. The result is empty if the value cannot be converted to the desired type. Raises an exception if the index is out of bounds.

  index is a simple convenience function that is equivalent to parseField (v ! idx). If you're certain that the index is not out of bounds, using unsafeIndex is somewhat faster.

(.!)
  Alias for index.

unsafeIndex
  Like index but without bounds checking.

lookup
  Retrieve a field in the given record by name. The result is empty if the field is missing or if the value cannot be converted to the desired type.

(.:)
  Alias for lookup.

namedField
  Construct a pair from a name and a value. For use with namedRecord.

(.=)
  Alias for namedField.

record
  Construct a record from a list of Fields. Use toField to convert values to Fields for use with record.

namedRecord
  Construct a named record from a list of name-value pairs. Use .= to construct such a pair from a name and a value.

runParser
  Run a Parser, returning either Left errMsg or Right result. Forces the value in the Left or Right constructors to weak head normal form.

  You most likely won't need to use this function directly, but it's included for completeness.

ToField instance notes
  - String and strict/lazy Text use UTF-8 encoding.
  - Unsigned integral types (Word, Word8 through Word64) use decimal encoding.
  - Signed integral types (Int, Int8 through Int64, Integer) use decimal encoding with an optional sign.
  - Float and Double use decimal notation or scientific notation, depending on the number.
  - Char uses UTF-8 encoding.
  - For Maybe, Nothing is encoded as an empty field.

FromField instance notes
  - String and strict/lazy Text assume UTF-8 encoding and fail on invalid byte sequences.
  - Unsigned integral types accept an unsigned decimal number.
  - Signed integral types accept a signed decimal number.
  - Float and Double accept the same syntax as rational.
  - Char assumes UTF-8 encoding.
  - () ignores the field and always succeeds.
  - Maybe parses to Nothing if the field is empty, Just otherwise.
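To tie the helpers above together, here is a small, hypothetical round trip: build a NamedRecord with namedRecord and .=, then convert the fields back out with .: inside runParser. The person value and the field names are made up, and the Either String result type is assumed to match cassava, from which this module is borrowed:

    {-# LANGUAGE OverloadedStrings #-}

    import Data.CSV.Conduit.Conversion
    import Data.Text (Text)

    -- Build a named record (a map from column name to raw field value).
    person :: NamedRecord
    person = namedRecord ["name" .= ("John" :: Text), "age" .= (56 :: Int)]

    -- Convert the fields back out; runParser surfaces failure as Left.
    result :: Either String (Text, Int)
    result = runParser $ (,) <$> person .: "name" <*> person .: "age"
    -- result == Right ("John", 56)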
Data.CSV.Conduit.Types

OrderedMapRow
  An OrderedMapRow is a dictionary based on OMap (Data.Map.Ordered) where column names are keys and a row's individual cell values are the values of the OMap. Unlike MapRow, OrderedMapRow preserves the insertion ordering of columns. MapRow is a reasonable default in most cases.

MapRow
  A MapRow is a dictionary based on Data.Map where column names are keys and a row's individual cell values are the values of the Map.

Row
  A Row is just a list of fields.

CSVSettings
  Settings for a CSV file. This library is intended to be flexible and offer a way to process the majority of text data files out there.

  csvSep: Separator character to be used in between fields.

  csvQuoteChar: Quote character that may sometimes be present around fields. If Nothing is given, the library will never expect quotation even if it is present.

defCSVSettings
  Default settings for a CSV file:

    csvSep       = ','
    csvQuoteChar = Just '"'

Data.CSV.Conduit.Parser.Text

parseCSV
  Try to parse the given string as CSV.

parseRow
  Try to parse the given string as a 'Row Text'.

csv
  Parse CSV.

row
  Parse a CSV row.

Data.CSV.Conduit.Parser.ByteString

parseCSV
  Try to parse the given string as CSV.

parseRow
  Try to parse the given string as a 'Row ByteString'.

csv
  Parse CSV.

row
  Parse a CSV row.

Data.CSV.Conduit

class CSV s r
  Represents types r that are CSV-like and can be converted to/from an underlying stream of type s. There is nothing scary about the type:

  s represents stream types that can be converted to/from CSV rows. Examples are ByteString, Text and String.

  r represents the target CSV row representations that this library can work with. Examples are the Row types, the Record type and the MapRow family of types. We can also convert directly to complex Haskell types using the Data.CSV.Conduit.Conversion module that was borrowed from the cassava package, which was itself inspired by the aeson package.

  Example #1: Basics Using Convenience API

    import Data.Conduit
    import Data.Conduit.Binary
    import Data.Conduit.List as CL
    import Data.CSV.Conduit

    myProcessor :: Conduit (Row Text) m (Row Text)
    myProcessor = CL.map reverse

    test = runResourceT $
      transformCSV defCSVSettings
                   (sourceFile "input.csv")
                   myProcessor
                   (sinkFile "output.csv")

  Example #2: Basics Using Conduit API

    import Data.Conduit
    import Data.Conduit.Binary
    import Data.CSV.Conduit

    myProcessor :: Conduit (MapRow Text) m (MapRow Text)
    myProcessor = undefined

    test = runResourceT $ runConduit $
      sourceFile "test/BigFile.csv" .|
      intoCSV defCSVSettings .|
      myProcessor .|
      (writeHeaders defCSVSettings >> fromCSV defCSVSettings) .|
      sinkFile "test/BigFileOut.csv"

rowToStr
  Convert a CSV row into its strict ByteString equivalent.

intoCSV
  Turn a stream of s into a stream of the CSV row type. An example would be parsing a ByteString stream as rows of MapRow Text.

fromCSV
  Turn a stream of the CSV row type back into a stream of s. An example would be rendering a stream of Row Text rows as Text.

writeHeaders
  Write the headers AND the row into the output stream, once. If you don't call this while using the MapRow family of row types, then your resulting output will NOT have any headers in it.

  Usage: just chain this using the Monad instance in your pipeline:

    runConduit $ ... .| (writeHeaders settings >> fromCSV settings) .| sinkFile "..."

readCSVFile
  Read the entire contents of a CSV file into memory.

decodeCSV
  A simple way to decode a CSV string. Don't be alarmed by the polymorphic nature of the signature. s is the type for the string and v is a kind of Vector here.

  For example, for ByteString:

    s <- LB.readFile "my.csv"
    decodeCSV defCSVSettings s :: Either SomeException (Vector (Vector ByteString))

  will work as long as the data is comma separated.

writeCSVFile
  Write CSV data into a file. As we use a ByteString sink, you'll need to get your data into a ByteString stream type.

mapCSVFile
  Map over the rows of a CSV file. Provided for convenience for historical reasons.

  An easy way to run this function would be runResourceT after feeding it all the arguments.

transformCSV
  Like transformCSV' but uses the same settings for both input and output.

transformCSV'
  General purpose CSV transformer. Apply a list-like processing function from Data.Conduit.List to the rows of a CSV stream. You need to provide a stream data source, a transformer and a stream data sink.

  An easy way to run this function would be runResourceT after feeding it all the arguments.

  Example - map a function over the rows of a CSV file:

    transformCSV' setIn setOut (sourceFile inFile) (C.map f) (sinkFile outFile)
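Building on the settings and transformers documented above, here is a hedged sketch of a concrete transformCSV' call that re-encodes a tab-separated file as comma-separated output. The file names are made up, the argument order follows the argument documentation below, and the tab-separated settings simply override the defCSVSettings fields described earlier:

    {-# LANGUAGE OverloadedStrings #-}

    import Control.Monad.Trans.Resource (runResourceT)
    import Data.ByteString (ByteString)
    import qualified Data.Conduit.Binary as CB
    import qualified Data.Conduit.List as CL
    import Data.CSV.Conduit

    -- Tab-separated input with no quoting, overriding the defaults.
    tsvSettings :: CSVSettings
    tsvSettings = defCSVSettings { csvSep = '\t', csvQuoteChar = Nothing }

    -- Parse with the TSV settings, pass rows through unchanged, and
    -- render with the default (comma-separated) settings.
    main :: IO ()
    main = runResourceT $
      transformCSV' tsvSettings defCSVSettings
        (CB.sourceFile "input.tsv")
        (CL.map (id :: Row ByteString -> Row ByteString))
        (CB.sinkFile "output.csv")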
Instances
  - Conversion of a stream directly to/from a custom complex Haskell type (via the Named wrapper).
  - Generic MapRow instance; any stream type with a Row instance automatically gets a MapRow instance.
  - Support for parsing rows in the Vector form.
  - Row instance using Text based on a ByteString stream. Please note this uses ByteString operations underneath and has lots of unnecessary overhead. Included for convenience.
  - Row instance using String based on a ByteString stream.
  - Row instance using Text.
  - Row instance using ByteString.

Argument documentation

  readCSVFile
    - Settings to use in deciphering the stream.
    - Input file.

  writeCSVFile
    - CSV settings.
    - Target file.
    - Write vs. append mode.
    - List of rows.

  mapCSVFile
    - Settings to use for both input and output.
    - A mapping function.
    - Input file.
    - Output file.

  transformCSV
    - Settings to be used for both input and output.
    - A raw stream data source. Ex: 'sourceFile inFile'.
    - A transforming conduit.
    - A raw stream data sink. Ex: 'sinkFile outFile'.

  transformCSV'
    - Settings to be used for input.
    - Settings to be used for output.
    - A raw stream data source. Ex: 'sourceFile inFile'.
    - A transforming conduit.
    - A raw stream data sink. Ex: 'sinkFile outFile'.
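Following the argument list for mapCSVFile above, here is a hedged sketch of running it under runResourceT, as the documentation suggests. The file names and the row transformation are hypothetical:

    import Control.Monad.Trans.Resource (runResourceT)
    import Data.ByteString (ByteString)
    import Data.CSV.Conduit

    -- Drop the second column of every row; a row may expand to zero or
    -- more output rows, hence the list result.
    dropSecond :: Row ByteString -> [Row ByteString]
    dropSecond (a : _ : rest) = [a : rest]
    dropSecond r              = [r]

    main :: IO ()
    main = runResourceT $
      mapCSVFile defCSVSettings dropSecond "input.csv" "output.csv"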