git-annex can transfer data to and from configured git remotes. Normally those remotes are regular git repositories (bare or non-bare, local or remote) that store the file contents in their own git-annex directory.

But git-annex also extends git's concept of remotes with special types of remotes. These can be used by git-annex just like any normal remote, but they cannot be used by other git commands.

These special remotes can be used to tie git-annex into many cloud services.
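As a quick sketch of how a special remote is set up (the remote name, bucket, and options below are only examples), an S3 special remote is created with `git annex initremote` and then used like any other remote:

```shell
# Create an S3 special remote named "mys3"; credentials are read from
# the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.
git annex initremote mys3 type=S3 encryption=shared bucket=mybucket

# Then content can be sent to and fetched from it like any remote:
git annex copy myfile --to mys3
git annex get myfile --from mys3
```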

Unused content on special remotes

Over time, special remotes can accumulate file content that is no longer referred to by files in git. Normally, unused content in the current repository is found by running git annex unused. To detect unused content on special remotes, instead use git annex unused --from with the name of the remote. For example:

$ git annex unused --from mys3
unused mys3 (checking for unused data...) 
  Some annexed data on mys3 is not used by any files in this repository.
    NUMBER  KEY
    1       WORM-s3-m1301674316--foo
  (To see where data was previously used, try: git log --stat -S'KEY')
  (To remove unwanted data: git-annex dropunused --from mys3 NUMBER)
$ git annex dropunused --from mys3 1
dropunused 1 (from mys3...) ok
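dropunused also accepts number ranges, so a large batch of unused keys can be dropped in one command (the range below is illustrative, and assumes the numbers come from a previous git annex unused --from run):

```shell
# Drop unused keys 1 through 50 from the special remote in one pass.
git annex dropunused --from mys3 1-50
```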

MediaFire offers 50GB of free storage (max size 200MB). It would be great to support it as a new special remote.
Comment by Jon Ander Thu Jan 17 08:17:54 2013
Mediafire does not appear to offer any kind of API for its storage.
Comment by joeyh.name Thu Jan 17 12:44:25 2013
Wouldn't this be enough? http://developers.mediafire.com/index.php/REST_API
Comment by Jon Ander Thu Jan 17 12:53:41 2013

Similar to a JABOD, this would be Just A Bunch Of Files. I already have a NAS with a file structure conducive to serving media to my TV. However, it's not capable (currently) of running git-annex locally. It would be great to be able to tell annex the path to a file there as a remote, much like a web remote from "git annex addurl". That way I can safely drop all the files I took with me on my trip, while annex still verifies and counts the files on the NAS as a location.

There are some interesting things to figure out for this to be efficient. For example, SHAs of the files. Maybe store that in a metadata file in the directory of the files? Or perhaps use the WORM backend by default?

Comment by Andrew Sat Jan 19 04:34:32 2013
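One way to approximate this, assuming the NAS export can be mounted locally, is to point a directory special remote at the mount. The mount point and remote name below are only examples:

```shell
# Create a directory special remote on the mounted NAS share.
git annex initremote mynas type=directory directory=/mnt/nas encryption=none

# Send content there; the NAS then counts as a verified copy,
# so the local copy can be safely dropped.
git annex copy mymovie.avi --to mynas
git annex drop mymovie.avi
```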
The web special remote recently became able to use file:// URLs, so you can just point it at files on some arbitrary storage if you want to.
Comment by joeyh.name Sat Jan 19 12:05:13 2013
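For example (the paths here are illustrative), a file on mounted storage can be registered as another location for an annexed file:

```shell
# Register a file:// URL as a known location for an annexed file.
git annex addurl --file=video.avi file:///mnt/nas/video.avi

# The local copy can then be dropped; the file:// location counts as a copy.
git annex drop video.avi
```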
It'd be awesome to be able to use Rackspace as remote storage as an alternative to S3, I would submit a patch, but know 0 Haskell :D
Comment by Greg Wed Jan 30 07:33:12 2013

Would it be possible to support Rapidshare as a new special remote? They offer unlimited storage for 6-10€ per month. It would be great for larger backups. Their API can be found here: http://images.rapidshare.com/apidoc.txt

Comment by Nico Sat Feb 2 12:49:58 2013

Is there any chance of a special remote that functions like a hybrid of 'web' and 'hook'? At least in theory, it should be relatively simple, since it would only support 'get', and the only meaningful parameters to pass would be the URL and the output file name.

Maybe make it something like git config annex.myprogram-webhook 'myprogram $ANNEX_URL $ANNEX_FILE', and fetching could work by adding a --handler or --type parameter to addurl.

The use case here is anywhere that a simple 'fetch the file over HTTP/FTP/etc' isn't workable: maybe it's on rapidshare and you need to use plowshare to download it; maybe it's a youtube video and you want to use youtube-dl; maybe it's a chapter of a manga and you want to turn it into a CBZ file when you fetch it.

Comment by Alex Sun Feb 24 11:05:27 2013
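The hybrid web/hook idea above, including the URL-matching variant, could be sketched as a small dispatcher script. Everything here is hypothetical and not part of git-annex; the function and tool names are illustrative only:

```shell
# Hypothetical dispatcher: pick a download tool based on the URL.
pick_downloader () {
    case "$1" in
        *youtube.com/*|*youtu.be/*) echo youtube-dl ;;
        *rapidshare.com/*)          echo plowdown ;;
        *)                          echo wget ;;
    esac
}

# The config proposed in the comment above might then point at a
# wrapper that calls pick_downloader and runs the chosen tool:
#   git config annex.myprogram-webhook 'myprogram $ANNEX_URL $ANNEX_FILE'
pick_downloader "https://www.youtube.com/watch?v=abc"   # prints youtube-dl
```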
A ridiculously cool possibility would be to allow them to match against URLs and then handle those (youtube-dl for youtube video URLs, for instance), but that would be additional work on your end and isn't really necessary.
Comment by Alex Sun Feb 24 11:13:16 2013
It'd be really cool to have Rackspace cloud files support. Like the guy above me said, I would submit a patch but not if I have to learn Haskell first :)
Comment by Ashwin Fri Mar 22 04:20:40 2013
@Alex: You might see if the newly-added wishlist: allow configuration of downloader for addurl could be made to do what you need... I've not played around with it yet, but perhaps you could set the downloader to be something that can sort out the various URLs and send them to the correct downloading tool?
Comment by andy Fri Apr 12 04:54:47 2013