archiver: Archive supplied URLs in WebCite & Internet Archive

[ bsd3, documentation, library, network, program ]
Versions 0.1, 0.2, 0.3, 0.3.1, 0.4, 0.5, 0.5.1, 0.6.0, 0.6.1, 0.6.2
Dependencies base (==4.*), bytestring, containers, curl, HTTP, network, process, random
License BSD-3-Clause
Author Gwern
Maintainer Gwern
Category Documentation, Network
Source repo head: darcs get
Uploaded by GwernBranwen at Wed Mar 7 17:16:16 UTC 2012
Distributions NixOS:
Executables archiver
Downloads 4329 total (39 in the last 30 days)
Status Docs uploaded by user
Build status unknown

archiver is a daemon which will process a specified text file, each line of which is a URL, and will (randomly) one by one request that the URLs be archived or spidered by WebCite and the Internet Archive for future reference. (One may optionally specify an arbitrary sh command, such as wget, to download URLs locally.)
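The core loop is simple. Here is a minimal sketch of the idea in Haskell; it is not the package's actual code. It assumes a queue file named urls.txt and a curl binary on the PATH, and it uses only the Internet Archive's public save endpoint (the real daemon also submits to WebCite, takes its filename and options from the command line, and handles errors and rate limits more carefully):

    import Control.Concurrent (threadDelay)
    import Control.Monad (forever, unless)
    import System.Process (rawSystem)
    import System.Random (randomRIO)

    -- Repeatedly pick one random URL from the queue file, ask the
    -- Internet Archive to snapshot it, and rewrite the queue without it.
    main :: IO ()
    main = forever (archiveOnce "urls.txt")  -- queue filename is an assumption

    archiveOnce :: FilePath -> IO ()
    archiveOnce queue = do
      contents <- readFile queue
      let urls = filter (not . null) (lines contents)
      length contents `seq` unless (null urls) $ do  -- force the lazy read so the file can be rewritten
        i <- randomRIO (0, length urls - 1)
        -- one archive request; '--fail' surfaces HTTP errors in curl's exit code
        _ <- rawSystem "curl" ["--silent", "--fail", "--output", "/dev/null",
                               "https://web.archive.org/save/" ++ (urls !! i)]
        writeFile queue (unlines (take i urls ++ drop (i + 1) urls))
      threadDelay (60 * 1000000)  -- pause a minute between requests to be polite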

Because the interface is a simple text file, archiver can be combined with other scripts: for example, a script using SQLite to extract visited URLs from Firefox, or a program extracting URLs from Pandoc documents (see the sketch below).
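As a hedged illustration of the Firefox case, the sketch below shells out to the sqlite3 command-line tool (which must be installed) and appends every visited URL to the queue file; the filenames places.sqlite and urls.txt are assumptions, and since Firefox locks the live database you would point this at a copy of it:

    import System.Process (readProcess)

    -- Pull every visited URL out of (a copy of) Firefox's history database
    -- and append them to the queue file that archiver is watching.
    main :: IO ()
    main = do
      urls <- readProcess "sqlite3"
                ["places.sqlite", "SELECT url FROM moz_places;"] ""
      appendFile "urls.txt" urls

sqlite3 prints one URL per line, which is exactly the one-URL-per-line queue format archiver reads.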

For an explanation of the derivation of the code in Network.URL.Archiver, see the author's writeup.
