git-annex extends git's usual remotes with some special remotes that are not git repositories. This way you can set up a remote using, say, Amazon S3, and use git-annex to transfer files into the cloud.

First, export your Amazon AWS credentials:

# export AWS_ACCESS_KEY_ID="08TJMT99S3511WOZEP91"
# export AWS_SECRET_ACCESS_KEY="s3kr1t"

Now, create a gpg key if you don't already have one. This will be used to encrypt everything stored in S3, for your privacy. Once you have a gpg key, run gpg --list-secret-keys to look up its key id, something like "2512E3C7".
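
For example, a minimal sketch of those two steps (the prompts and the exact output format vary between gpg versions):

# gpg --gen-key
(answer the prompts; the defaults are fine)
# gpg --list-secret-keys
(the key id, e.g. "2512E3C7", appears in the output)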

Next, create the S3 remote, and describe it.

# git annex initremote cloud type=S3 chunk=1MiB keyid=2512E3C7
initremote cloud (encryption setup with gpg key C910D9222512E3C7) (checking bucket) (creating bucket in US) (gpg) ok
# git annex describe cloud "at Amazon's US datacenter"
describe cloud ok

The configuration for the S3 remote is stored in git, so making another repository use the same S3 remote is easy:

# cd /media/usb/annex
# git pull laptop
# git annex initremote cloud
initremote cloud (gpg) (checking bucket) ok

Now the remote can be used like any other remote.

# git annex copy my_cool_big_file --to cloud
copy my_cool_big_file (gpg) (checking cloud...) (to cloud...) ok
# git annex move video/hackity_hack_and_kaxxt.mov --to cloud
move video/hackity_hack_and_kaxxt.mov (checking cloud...) (to cloud...) ok
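
If you later want the content back from the remote, the same syntax works in the other direction (a sketch using the file name from above; output elided):

# git annex get my_cool_big_file --from cloud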

See S3 for details.

The instructions state ANNEX_S3_ACCESS_KEY_ID and ANNEX_SECRET_ACCESS_KEY, but git-annex cannot connect with those variables set. git-annex tells me to set both AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY instead, which works. This is with Xubuntu 12.04.
Thanks, I've fixed that. (You could have too.. this is a wiki ;)
Comment by http://joeyh.name/ Tue May 29 15:10:42 2012
If I revoke old AWS credentials and create new ones, how would I inform git-annex of the change to AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY?
Comment by annexuser Tue Apr 15 17:59:43 2014

You can use git annex enableremote to change an existing remote's configuration. So this should work:

# export AWS_ACCESS_KEY_ID="newRANDOMGOBBLDEYGOOK"
# export AWS_SECRET_ACCESS_KEY="news3kr1t"
# git annex enableremote cloud
Comment by http://joeyh.name/ Thu Apr 17 15:44:55 2014
One use case for git with Amazon S3 is to maintain a web site on S3 that you can easily update from a local machine. In that case you would not want to encrypt. Is encryption optional? This isn't clear from the instructions.

Jack, if you don't want to use encryption you can use encryption=none as documented here.
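
For example, an unencrypted setup might look something like this (the remote name "cloud" and the chunk size are only placeholders):

# git annex initremote cloud type=S3 encryption=none chunk=1MiB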

I'm not sure exactly what you're trying to do, but please note that your files won't be easily available on S3: they will be named as git-annex keys, with long and unreadable names such as "SHA256E-s6311--c7533fdd259d872793b7298cbb56a1912e80c52a845661b0b9ff391c65ee2abc.html" instead of "index.html".

Comment by http://schnouki.net/ Tue Sep 9 08:48:59 2014

I don't know if this is what Jack wanted, but you can upload your files to S3 and let them be accessible through a public URL.

First, go to (or create) the bucket you will use on S3 and add a public GET policy to it:

    {
        "Version": "2008-10-17",
        "Statement": [
            {
                "Sid": "AllowPublicRead",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "*"
                },
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::BUCKETNAME/*"
            }
        ]
    }
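
If you prefer the command line to the S3 web console, the same policy can also be applied with the AWS CLI (assuming it is installed and configured; policy.json is a placeholder for a file containing the JSON above):

# aws s3api put-bucket-policy --bucket BUCKETNAME --policy file://policy.json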

Then set up your special remote with the options encryption=none, bucket=BUCKETNAME, and chunk=0 (plus any others you want).
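
For instance (a sketch; "publicweb" is just a placeholder name for the remote, and BUCKETNAME is your bucket):

# git annex initremote publicweb type=S3 encryption=none bucket=BUCKETNAME chunk=0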

Your files will be accessible at http://BUCKETNAME.s3-website-LOCATION.amazonaws.com/KEY, where LOCATION is the region specified through the datacenter option and KEY is the git-annex key for the file (a long SHA-based name), which you can find by running git annex lookupkey FILEPATH.
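
For example, reusing the illustrative key from the earlier comment (your actual key, bucket name, and location will differ):

# git annex lookupkey index.html
SHA256E-s6311--c7533fdd259d872793b7298cbb56a1912e80c52a845661b0b9ff391c65ee2abc.html

which would then be available at a URL like http://BUCKETNAME.s3-website-LOCATION.amazonaws.com/SHA256E-s6311--c7533fdd259d872793b7298cbb56a1912e80c52a845661b0b9ff391c65ee2abc.html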

This way you can share a link to each file you have at your S3 remote.
