The wp-archivebot package
A MediaWiki's RecentChanges or NewPages page links to every new edit or article. This bot polls the corresponding RSS feeds (easier and more reliable than parsing the HTML), follows the links to each new edit or article, and then uses TagSoup to filter out every off-wiki link (e.g. to http://cnn.com).
With this list of external links, the bot then fires off requests to http://webcitation.org/, which makes an on-demand backup (similar to the Internet Archive).
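The two steps above (keep only off-wiki links, then build a WebCite submission URL) can be sketched as follows. This is a minimal illustration, not the package's actual code: it assumes the hrefs have already been pulled out of the page (the real bot uses TagSoup for that), and the `archiveRequest` URL format and URL-encoding are simplified.

```haskell
import Data.List (isInfixOf, isPrefixOf)

-- Keep only off-wiki links: absolute http(s) URLs that do not point
-- back at the wiki's own host. (Sketch: the real bot extracts hrefs
-- from the fetched page with TagSoup first.)
externalLinks :: String -> [String] -> [String]
externalLinks wikiHost = filter isExternal
  where
    isExternal url =
      ("http://" `isPrefixOf` url || "https://" `isPrefixOf` url)
        && not (wikiHost `isInfixOf` url)

-- Build an on-demand archive request for WebCite (hypothetical
-- endpoint shape; real submissions should also URL-encode the
-- query parameters, omitted here for brevity).
archiveRequest :: String -> String -> String
archiveRequest email url =
  "http://www.webcitation.org/archive?url=" ++ url ++ "&email=" ++ email
```

For example, `externalLinks "en.wikipedia.org"` applied to the links found in an article would drop internal `/wiki/...` links and links back to the wiki itself, leaving only the external URLs to submit.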
Example: to archive links from every article in the English Wikipedia's RecentChanges:
wp-archivebot email@example.com 'http://en.wikipedia.org/w/index.php?title=Special:RecentChanges&feed=rss'
Change log: None available
Dependencies: base (==3.*), feed, HTTP, network, parallel, tagsoup
Uploaded: Thu Jun 4 16:31:50 UTC 2009 by GwernBranwen
Downloads: 223 total (5 in last 30 days)
Status: Docs not available; all reported builds failed as of 2015-11-14
- wp-archivebot-0.1.tar.gz (Cabal source package)
- Package description (included in the package)