The archiver package

[Tags: bsd3, library, program]

archiver is a daemon which processes a specified text file, each line of which is a URL, and one by one (in random order) requests that the URLs be archived or spidered by several public archiving services for future reference. (One may optionally specify an arbitrary sh command, like wget, to download URLs locally.)
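Since the daemon's exact flags are not shown here, the text-file contract can be sketched in plain sh. The filenames and the `echo` stand-in for wget are illustrative only, and the real daemon picks lines randomly rather than top-down:

```shell
# Append URLs to the file the daemon watches (filename is an example).
echo 'http://example.com' >> urls.txt

# Drain the file one URL at a time, running an arbitrary sh command on
# each, roughly as the daemon does; 'echo' stands in for e.g. wget.
while [ -s urls.txt ]; do
  url=$(head -n 1 urls.txt)
  echo "archiving: $url"              # e.g. wget --page-requisites "$url"
  tail -n +2 urls.txt > urls.tmp && mv urls.tmp urls.txt
done
```

Because the queue is just lines in a file, anything that can append a line can submit a URL for archiving.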

Because the interface is a simple text file, archiver can be combined with other scripts; for example, a script using SQLite to extract visited URLs from Firefox, or a program extracting URLs from Pandoc documents.
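As a concrete sketch of that composition, a crude grep can stand in for a real Pandoc-based extractor, pulling URLs out of a Markdown document and appending them to archiver's input file. The filenames and the regex are illustrative, not part of the package:

```shell
# Write a small Markdown document (a stand-in for real input).
printf '%s\n' 'See [a](http://example.com/a) and <http://example.com/b>.' > doc.md

# Extract anything URL-shaped and append it to the URL list the daemon
# consumes; a real pipeline would walk Pandoc's AST instead of grepping.
grep -Eo 'https?://[^]<>") ]+' doc.md >> urls.txt
```

The same pattern works for the Firefox case: a `sqlite3` query over places.sqlite that selects visited URLs can be redirected onto the end of the same file.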

For an explanation of the derivation of the code in Network.URL.Archiver, see the maintainer's accompanying write-up.


Versions: 0.1, 0.2, 0.3, 0.3.1, 0.4, 0.5, 0.5.1, 0.6.0, 0.6.1, 0.6.2
Dependencies: base (==4.*), bytestring, containers, curl, HTTP, network, process, random
Maintainer: Gwern
Category: Documentation, Network
Source repository: head: darcs get
Upload date: Fri Sep 9 03:00:51 UTC 2011
Uploaded by: GwernBranwen
Downloads: 800 total (61 in last 30 days)



