archiver: Archive supplied URLs in WebCite & Internet Archive

[ bsd3, documentation, library, network, program ]
Versions 0.1, 0.2, 0.3, 0.3.1, 0.4, 0.5, 0.5.1, 0.6.0, 0.6.1, 0.6.2,
Dependencies base (==4.*), bytestring, containers, curl, HTTP, network, process, random [details]
License BSD-3-Clause
Author Gwern
Maintainer Gwern <>
Category Documentation, Network
Source repo head: git clone git://
Uploaded by GwernBranwen at Fri Jan 3 19:44:15 UTC 2014
Distributions NixOS:
Executables archiver
Downloads 4329 total (42 in the last 30 days)
Rating (no votes yet) [estimated by rule of succession]
Status Docs available [build log]
Successful builds reported [all 1 reports]

archiver is a daemon which processes a specified text file, each line of which is a URL, and one by one (in random order) requests that each URL be archived or spidered by WebCite and the Internet Archive for future reference. (One may optionally specify an arbitrary sh command, like wget, to download each URL locally as well.)
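
The core of such a daemon might look like the following minimal Haskell sketch. It is illustrative only: daemonLoop and its archive callback are hypothetical names, not the package's actual API.

    import Control.Concurrent (threadDelay)
    import System.Random (randomRIO)

    -- Minimal sketch of the daemon loop: read the URL file, archive one
    -- randomly chosen URL, write the remainder back, pause, and repeat.
    daemonLoop :: FilePath -> (String -> IO ()) -> IO ()
    daemonLoop file archive = do
      contents <- readFile file
      let urls = lines contents
      length urls `seq` return ()  -- force lazy IO so the file can be rewritten
      if null urls
         then threadDelay (60 * 1000000)  -- queue empty; check again later
         else do i <- randomRIO (0, length urls - 1)
                 archive (urls !! i)
                 writeFile file (unlines (take i urls ++ drop (i + 1) urls))
                 threadDelay (60 * 1000000)  -- be polite to the archive services
      daemonLoop file archive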

Because the interface is a simple text file, archiver can be combined with other scripts: for example, a script using SQLite to extract visited URLs from Firefox, or a program extracting URLs from Pandoc documents.
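
The Firefox case could be little more than shelling out to the sqlite3 CLI. The helper below is a hypothetical sketch, not part of archiver; the moz_places query is an assumption about Firefox's history schema.

    import System.Process (readProcess)

    -- Hypothetical helper: dump visited URLs from a copy of Firefox's
    -- places.sqlite (Firefox locks the live database while running) and
    -- append them to the text file that archiver watches.
    appendFirefoxHistory :: FilePath -> FilePath -> IO ()
    appendFirefoxHistory placesDb queueFile = do
      out <- readProcess "sqlite3" [placesDb, "SELECT url FROM moz_places;"] ""
      appendFile queueFile out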

For an explanation of the derivation of the code in Network.URL.Archiver, see the author's writeup.
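
As a rough guess at the mechanism (not the module's actual code), submitting a URL to the Internet Archive can be as simple as an HTTP GET against its save endpoint, using the HTTP package that archiver already depends on:

    import Network.HTTP (getRequest, getResponseCode, simpleHTTP)

    -- Illustrative only: ask the Internet Archive to snapshot a URL by
    -- requesting its /save/ endpoint, then report the response code.
    archiveURL :: String -> IO ()
    archiveURL url = do
      resp <- simpleHTTP (getRequest ("http://web.archive.org/save/" ++ url))
      code <- getResponseCode resp
      print code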
