The archiver package

[Tags: bsd3, library, program]

archiver is a daemon which processes a specified text file, each line of which is a URL, and one by one (in random order) requests that the URLs be archived or spidered by public archiving services such as WebCite and the Internet Archive, for future reference. (One may optionally specify an arbitrary sh command, such as wget, to download URLs locally.)
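The processing loop the description implies can be sketched as follows. This is a minimal Python illustration, not the package's Haskell implementation: the function name and file-rewriting details are mine, and the submission to archive services is elided to a comment.

```python
import os, random, subprocess, tempfile

def archive_step(url_file, sh_command=None):
    """Process one URL from the shared text file, mimicking the
    daemon's described behavior: pick a random line, optionally hand
    it to a user-supplied shell command (e.g. wget for a local copy),
    then rewrite the file without that line."""
    with open(url_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    if not urls:
        return None  # nothing left to archive
    url = random.choice(urls)
    if sh_command:
        # e.g. sh_command = "wget" downloads the URL locally
        subprocess.run([sh_command, url], check=False)
    # (here the real daemon would also submit `url` to the
    #  remote archiving services)
    urls.remove(url)
    with open(url_file, "w") as f:
        f.write("\n".join(urls) + ("\n" if urls else ""))
    return url
```

Because the state lives entirely in the text file, the daemon can be stopped and restarted at any point without losing its queue.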

Because the interface is a simple text file, archiver can be combined with other scripts: for example, a script using SQLite to extract visited URLs from Firefox, or a program extracting URLs from Pandoc documents.
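The Firefox example might look like the sketch below. Firefox stores browsing history in an SQLite database (places.sqlite) whose moz_places table holds one row per known URL; the exact schema varies across Firefox versions, and the function name here is illustrative, not part of the package.

```python
import sqlite3

def firefox_visited_urls(places_db):
    """Extract visited URLs from a copy of Firefox's places.sqlite.
    moz_places rows with visit_count > 0 correspond to URLs the user
    actually visited (moz_places also records unvisited bookmarks)."""
    conn = sqlite3.connect(places_db)
    try:
        rows = conn.execute(
            "SELECT url FROM moz_places WHERE visit_count > 0"
        ).fetchall()
    finally:
        conn.close()
    return [r[0] for r in rows]
```

Appending the returned URLs to archiver's text file (one per line) is then all the glue needed; in practice one would query a copy of places.sqlite, since Firefox locks the live database while running.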

For an explanation of the derivation of the code in Network.URL.Archiver, see the author's accompanying writeup on archiving URLs.


Versions: 0.1, 0.2, 0.3, 0.3.1, 0.4, 0.5, 0.5.1, 0.6.0, 0.6.1, 0.6.2
Change log: None available
Dependencies: base (==4.*), bytestring, containers, curl, HTTP, network, process, random
Maintainer: Gwern <>
Category: Documentation, Network
Source repository: head: darcs get
Uploaded: Wed Mar 7 17:16:16 UTC 2012 by GwernBranwen
Downloads: 2033 total (68 in last 30 days)
Status: Docs uploaded by user
Build status: unknown [no reports yet]



