streaming-utils-0.1.4.4
Modules: Streaming.Zip, Streaming.Network.TCP, Streaming.Pipes, Data.ByteString.Streaming.HTTP, Data.Attoparsec.ByteString.Streaming, Data.ByteString.Streaming.Aeson

Streaming.Zip

decompress :: MonadIO m => ByteString m r -> ByteString m r
  Decompress a streaming bytestring (zlib format).

decompress'
  Decompress a zipped byte stream, returning any leftover input that follows the compressed material; the decompressed stream ends with either the leftovers or the original return value.

compress :: MonadIO m => CompressionLevel -> WindowBits -> ByteString m r -> ByteString m r
  Compress a byte stream. See the Codec.Compression.Zlib module for details about CompressionLevel and WindowBits; WindowBits (along with defaultWindowBits and windowBits) is re-exported from there.

CompressionLevel
  How hard should we try to compress? The available settings are defaultCompression, noCompression, bestSpeed, bestCompression, and compressionLevel, which selects a specific compression level between 0 and 9.

gunzip :: MonadIO m => ByteString m r -> ByteString m r
  Decompress a gzipped byte stream.

gunzip'
  Decompress a gzipped byte stream, returning either a byte stream of the leftover input that follows the compressed stream or the return value of the input stream.

gzip :: MonadIO m => CompressionLevel -> ByteString m r -> ByteString m r
  Compress a byte stream in the gzip format.

(Internally, chunks are produced from a zlib Popper until it is exhausted.)

Streaming.Network.TCP

fromSocket
  Receive bytes from a connected socket with a maximum chunk size, producing a byte stream. The stream ends when the remote peer closes its side of the connection or EOF is received. Its arguments are the connected socket and the maximum number of bytes to receive and send downstream at once; Renzo recommends 4096 if you don't have a special purpose. The implementation is as follows:

    fromSocket sock nbytes = loop where
      loop = do
        bs <- liftIO (NSB.recv sock nbytes)
        if B.null bs
          then return ()
          else Q.chunk bs >> loop

toSocket
  Connect a stream of bytes to the remote end of a connected socket. The implementation is again very simple:

    toSocket sock = loop where
      loop bs = do
        e <- Q.nextChunk bs
        case e of
          Left r          -> return r
          Right (b, rest) -> send sock b >> loop rest
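A minimal end-to-end sketch combining the two modules above (not from the package's own documentation): it assumes fromSocket takes the socket and the chunk size in that order, borrows connect from network-simple (which this module is built on), and uses a placeholder host and port. It connects to a local service presumed to send gzip-compressed data and streams the decompressed bytes to stdout in constant memory.

    import qualified Data.ByteString.Streaming.Char8 as Q
    import           Network.Simple.TCP (connect)
    import           Streaming.Network.TCP (fromSocket)
    import           Streaming.Zip (gunzip)

    -- Connect to a (hypothetical) local service that sends gzipped data,
    -- decompress the bytes as they arrive, and write them to stdout.
    main :: IO ()
    main = connect "127.0.0.1" "8000" $ \(sock, _addr) ->
      Q.stdout (gunzip (fromSocket sock 4096))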
Streaming.Pipes

fromStream
  Construct an ordinary pipes Producer from a Stream of elements.

    >>> runEffect $ fromStream (S.each [1..3]) >-> P.print
    1
    2
    3

toStream
  Construct a Stream of elements from a pipes Producer.

    >>> S.print $ toStream $ P.each [1..3]
    1
    2
    3

toStreamingByteString
  Link the chunks of a producer of bytestrings into a single byte stream.

fromStreamingByteString
  Successively yield the chunks hidden in a byte stream.

span
  Splits a Producer into two Producers; the outer Producer is the longest consecutive group of elements that satisfy the predicate. Its inverse is join.

break
  Splits a Producer into two Producers; the outer Producer is the longest consecutive group of elements that fail the predicate. Its inverse is join.

splitAt
  Divides a Producer into two Producers after a fixed number of elements. Its inverse is join.

groupBy
  Splits a Producer into two Producers; the second Producer begins where we meet an element that is different according to the equality predicate. Its inverse is join.

group
  Like groupBy, where the equality predicate is (==).

groups
  Splits a Producer into a Stream of Producers of equal items. Its inverse is concats.

groupsBy'
  Splits a Producer into a Stream of Producers grouped using the given relation. Its inverse is concats. This differs from groupsBy by comparing successive elements, instead of comparing each element to the first member of the group.

    >>> import Pipes (yield, each)
    >>> import Pipes.Prelude (toList)
    >>> let rel c1 c2 = succ c1 == c2
    >>> (toList . intercalates (yield '|') . groupsBy' rel) (each "12233345")
    "12|23|3|345"
    >>> (toList . intercalates (yield '|') . groupsBy rel) (each "12233345")
    "122|3|3|34|5"

chunksOf
  Splits a Producer into a Stream of Producers of a given length. Its inverse is concats.

    >>> let listN n = L.purely P.folds L.list . P.chunksOf n
    >>> runEffect $ listN 3 P.stdinLn >-> P.take 2 >-> P.map unwords >-> P.print
    1<Enter>
    2<Enter>
    3<Enter>
    "1 2 3"
    4<Enter>
    5<Enter>
    6<Enter>
    "4 5 6"

    >>> let stylish = P.concats . P.maps (<* P.yield "-*-") . P.chunksOf 2
    >>> runEffect $ stylish (P.each $ words "one two three four five six") >-> P.stdoutLn
    one
    two-*-
    three
    four-*-
    five
    six-*-

concats
  Join a stream of Producers into a single Producer.

folds
  Fold each Producer in a Stream of Producers, given a step function, an initial accumulator and an extraction function; with Control.Foldl this amounts to

    purely folds :: Monad m => Fold a b -> Stream (Producer a m) m r -> Producer b m r

foldsM
  Fold each Producer in a Stream of Producers, monadically:

    impurely foldsM :: Monad m => FoldM m a b -> Stream (Producer a m) m r -> Producer b m r

takes'
  (takes' n) only keeps the first n Producers of a linked Stream of Producers. Unlike takes, takes' is not functor-general: it is aware that a Producer can be drained, as functors generally cannot be. Here, then, we drain the unused Producers in order to preserve the return value.
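As a small, self-contained sketch of how these pieces combine (not from the package documentation, and assuming chunksOf and folds have the shapes described above), the following program splits a Producer into blocks of three and folds each block with Control.Foldl:

    import qualified Control.Foldl as L
    import           Pipes (each, runEffect, (>->))
    import qualified Pipes.Prelude as P
    import           Streaming.Pipes (chunksOf, folds)

    -- Split [1..9] into blocks of three, sum each block, and print the
    -- per-block sums: 6, 15 and 24.
    main :: IO ()
    main = runEffect $
      L.purely folds L.sum (chunksOf 3 (each [1 .. 9 :: Int])) >-> P.print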
Data.ByteString.Streaming.HTTP

withHTTP
  Send an HTTP Request and wait for an HTTP Response, which is passed to a handler for the response.

streamN
  Create a RequestBody from a content length and an effectful ByteString.

stream
  Create a RequestBody from an effectful ByteString. stream is more flexible than streamN, but requires the server to support chunked transfer encoding.

simpleHTTP
  This is a quick method - oleg would call it 'unprofessional' - to bring a web page into view. It sparks its own internal manager and closes it down itself. Thus something like this makes sense:

    >>> runResourceT $ Q.putStrLn $ simpleHTTP "http://lpaste.net/raw/12"
    chunk _ [] = [];chunk n xs = let h = take n xs in h : (chunk n (drop n xs))

  but if you try something like

    >>> rest <- runResourceT $ Q.putStrLn $ Q.splitAt 40 $ simpleHTTP "http://lpaste.net/raw/146532"
    import Data.ByteString.Streaming.HTTP

  it will just be good luck if, with

    >>> runResourceT $ Q.putStrLn rest

  you get the rest of the file:

    import qualified Data.ByteString.Streaming.Char8 as Q
    main = runResourceT $ Q.putStrLn $ simpleHTTP "http://lpaste.net/raw/146532"

  rather than

    *** Exception: <socket: 13>: hGetBuf: illegal operation (handle is closed)

  since, of course, the handle was already closed by the first use of runResourceT. The same applies to the more hygienic withHTTP above, which permits one to extract an IO (ByteString IO r) by using splitAt or the like. The reaction of some streaming-io libraries was simply to forbid operations like splitAt; that this paternalism was not viewed as simply outrageous is a consequence of the opacity of the older iteratee-io libraries. It is obvious that I can no more run an effectful bytestring after I have made its effects impossible by using runResourceT (which basically means closeEverythingDown); I might as well try to run it after tossing my machine into the flames. Similarly, it is obvious that I cannot read from a handle after I have applied hClose; there is simply no difference between the two cases.

Data.Attoparsec.ByteString.Streaming

parse
  Run a parser against a streaming bytestring, returning the result of the parse (Either a ([String], String)) together with the unconsumed byte stream.

    >>> :set -XOverloadedStrings  -- the string literal below is a streaming bytestring
    >>> (r,rest1) <- AS.parse (A.scientific <* A.many' A.space) "12.3 4.56 78.3"
    >>> print r
    Left 12.3
    >>> (s,rest2) <- AS.parse (A.scientific <* A.many' A.space) rest1
    >>> print s
    Left 4.56
    >>> (t,rest3) <- AS.parse (A.scientific <* A.many' A.space) rest2
    >>> print t
    Left 78.3
    >>> Q.putStrLn rest3

parsed
  Apply a parser repeatedly to a stream of bytes, streaming the parsed values, but ending when the parser fails or the bytes run out. Its arguments are the attoparsec parser and the raw input.

    >>> S.print $ AS.parsed (A.scientific <* A.many' A.space) "12.3 4.56 78.9"
    12.3
    4.56
    78.9

Data.ByteString.Streaming.Aeson

DecodingError
  An error while decoding a JSON value, either an AttoparsecError (an attoparsec error that happened while parsing the raw JSON string) or a FromJSONError (an aeson error that happened while trying to convert a Data.Aeson.Value to an instance of FromJSON).

The module provides aeson-style decoding. One function parses a single top-level JSON entity from a bytestring, returning any leftover bytes. Another consecutively parses elements from the input with a given parser (such as json or json'), skipping any leading whitespace each time, and so resolves a succession of top-level JSON items into a corresponding stream of Haskell values; it runs until it either runs out of input or a decoding failure occurs, in which case it returns Left with a DecodingError and the leftovers, and the Either return value can be turned into an ErrorT-style monad transformer. A further variant accepts any parser, not just json or json'.

streamParse
  (Experimental) Parse a bytestring with a json-streams parser. The function will read through the whole of a single top-level JSON entity, streaming the valid parses as they arise. (It will thus, for example, parse an infinite JSON bytestring, though these are rare in practice.) If the parser is fitted to recognize only one thing, then zero or one item will be yielded; if it uses combinators like arrayOf, it will stream many values as they arise. See the example at the top of the module, in which values inside a top-level array are emitted as they are parsed; aeson would accumulate the whole bytestring before declaring on the contents of the array. That of course makes sense, since an attempt to parse a JSON array may end with a bad parse, invalidating the JSON as a whole. With json-streams a bad parse will also, of course, emerge in the end, but only after the initial good parses have been streamed. This too makes sense, though in a smaller range of contexts -- for example, where one is folding over the parsed material. This function is closely modelled on parseByteString and parseLazyByteString from Data.JsonStream.Parser.
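For orientation, here is a minimal sketch of withHTTP (not from the package documentation): it assumes the handler receives a Response whose body is a streaming ByteString, takes parseRequest, newManager and tlsManagerSettings from http-client and http-client-tls, and uses a placeholder URL.

    import qualified Data.ByteString.Streaming.Char8 as Q
    import           Data.ByteString.Streaming.HTTP (withHTTP)
    import           Network.HTTP.Client (newManager, parseRequest, responseBody)
    import           Network.HTTP.Client.TLS (tlsManagerSettings)

    -- Stream the response body to stdout chunk by chunk, without ever
    -- accumulating it in memory.
    main :: IO ()
    main = do
      req <- parseRequest "https://example.com"
      man <- newManager tlsManagerSettings
      withHTTP req man $ \resp -> Q.stdout (responseBody resp)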
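The parsed function also works outside ghci; here is a sketch of the same parser run over a file (not from the package documentation; the file name is a placeholder, and readFile comes from streaming-bytestring's ResourceT-based API):

    import           Control.Monad (void)
    import           Control.Monad.Trans.Resource (runResourceT)
    import qualified Data.Attoparsec.ByteString.Char8 as A
    import qualified Data.Attoparsec.ByteString.Streaming as AS
    import qualified Data.ByteString.Streaming.Char8 as Q
    import qualified Streaming.Prelude as S

    -- Print whitespace-separated numbers from "numbers.txt" as they parse,
    -- stopping at the first non-number or at end of input.
    main :: IO ()
    main = runResourceT $ void $
      S.print $ AS.parsed (A.scientific <* A.many' A.space) (Q.readFile "numbers.txt")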