In an effort to replace my home backup server with Amazon's S3, I've been collecting a list of Amazon S3-compatible backup tools to look at. Here's what I've discovered, followed by my requirements.

The List

I've evaluated exactly zero of these so far. That's next.

  • s3sync.rb is a sort of rsync clone written in Ruby, intended to replace the Perl script s3sync, which is now abandonware. Given that I already use rsync for much of my backup system, this is highly appealing.
  • Backup Manager appears to now have S3 support as of version 0.7.3. It's a command-line tool for Linux (and likely other Unix-like systems).
  • s3DAV isn't exactly a backup tool. It provides a WebDAV front-end (or "virtual filesystem") to S3 storage, so you can use many other backup tools with S3. Recent versions of Windows and Mac OS have WebDAV support built in. Java is required for s3DAV.
  • S3 Backup is an Open Source tool for backing up to S3. It's currently available only for Windows. Mac and Linux versions appear to be planned. The UI is built on wxWidgets.
  • duplicity is a free Unix tool that uses S3 and the librsync library. It is written in Python but not considered suitable for backing up important data quite yet.
  • S3 Solutions is a list of other S3 related tools on the Amazon Developer Connection.
  • Brackup is a backup tool written by Brad Fitzpatrick (of LiveJournal, SixApart, memcached, perlbal, etc.). It's written in Perl, fairly new, and doesn't have much in the way of documentation yet.
  • Jungle Disk provides clients for Mac, Windows, and Linux. It also offers a local WebDAV server.
  • DragonDisk has Linux and Windows clients.

For those keeping track, non-S3 options suggested in the comments on my previous post are Carbonite and a DreamHost account.

Are there other S3 tools that I'm missing?

Also, I've found that Amazon's S3 forum is quite helpful. The discussion there is generally of good quality, and the forum software does the job nicely. Perhaps we should do something similar for YDN instead of using Yahoo! Groups?

My Requirements

Most of what I need to back up lives on Linux servers in a few colocation facilities around the country (Bowling Green, Ohio; San Jose, California; San Francisco, California). My laptop and desktop Windows boxes have USB backups and already get automatically synced to a Unix box on a regular basis using the excellent SyncBack SE, so I don't need to re-solve that problem.

I don't really need a fancy GUI. I'm really looking for a standalone tool that's designed to work with S3 and keeps bandwidth usage to a minimum. Alternatively, something that works at a lower level (such as a filesystem driver) to provide a "virtual drive" type of interface might work as well.

Posted by jzawodn at October 06, 2006 03:09 PM

Reader Comments
# Eric said:

I'm using BitBucket in Python. Works for me, and it's been reliable enough in my experience of dumping 1 gig (and smaller) chunks up. I'm using a variation on the basic sync example, but I'm pushing gpg-encrypted tarball snapshots, not individual files.

on October 6, 2006 03:23 PM
# Chris Nokleberg said:

There is this project which aims to treat S3 as an infinitely large disk (Linux only?):

on October 6, 2006 03:38 PM
# Nate Kartchner said:

You ought to try out Mozy as well.

on October 6, 2006 03:45 PM
# Brad Fitzpatrick said:

Latest brackup started to have more docs:

But yeah, a little weak. Has a nice test suite and clear source code, though. :-)

on October 6, 2006 03:56 PM
# David Magda said:

How much do you trust the hosting service? Even if you have "nothing to hide", is it important that your data be encrypted or not? I'm simply mentioning because this is more important to some people than others (especially if you want to backup Quicken files or some such).

on October 6, 2006 04:11 PM
# Craig Hughes said:

I’ve been thinking of using S3 to store backups of various machines (basically all linux/OSX ones), but what’s been holding me back is the inability of S3 to do rsync on the server side. rsync really needs an instance of rsync running “near” where the data is stored in order to do its cleverest compression/do-not-transmit smarts. Rsync is basically a win if you have a high-bandwidth link between the rsync server and the backing store, and a lower bandwidth link between the rsync server and client. With S3, you’d have to run the rsync server side yourself, remote from S3, which kind of defeats the purpose of rsync… But then I had a brainstorm.

Amazon’s EC2 service, which parallels S3, allows you to create a virtual machine and turn it on/off as needed in the Amazon compute cloud. The EC2 instances have high-bandwidth connectivity to S3 storage, and so would be ideal for running an rsync server! You can set up an EC2 instance which serves rsync, and then your backup script can turn the instance on, do the rsync, then shut the instance down when it’s done.

Now all I have to do is actually create the EC2 instance, then create some kind of wrapper around the whole thing which does the startup-backup-shutdown wrapping around the EC2 API, and voila!
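A minimal Python sketch of that wrapper, with the instance start/stop calls and the rsync run passed in as plain callables (the actual Amazon API calls and rsync invocation are left out as assumptions):

```python
def backup_via_instance(start_instance, run_rsync, stop_instance):
    """Start the remote rsync instance, run the backup, then always
    shut the instance down, even if the rsync step fails."""
    start_instance()
    try:
        return run_rsync()
    finally:
        stop_instance()
```

The finally clause matters here: a dropped backup shouldn't leave a billable instance running.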

on October 6, 2006 04:22 PM
# Jeremy Zawodny said:


Brilliant! I really like this idea...

on October 6, 2006 04:28 PM
# Kevin Marsh said:

How much upstream bandwidth do you have to make this satisfactory? My measly 512K upstream makes backing up online sound like a drag (though I do appreciate the reliability and general peace of mind).

on October 6, 2006 05:35 PM
# Jeremy Zawodny said:

As much as I'm willing to pay for. Again, I'm mainly backing up data that's already hosted in a data center somewhere.

on October 6, 2006 05:50 PM
# Jason said:

I'm using s3sync for my newly-minted nightly backups to S3, and I have to say I like it a lot. I had some issues with version 1.0.1, but had a series of nearly real-time emails with the developer, was able to provide some tcpdump records of the problems, and version 1.0.2 came out soon thereafter that hasn't hiccupped once for me. I'm pushing MySQL database dumps up to S3 nightly with it, some of which are on the order of 30-50 MB, and everything works great.

The other tool you don't mention is jets3t:

It's a Java toolkit for using S3, but it also comes with two tools built atop the framework: a command-line tool (Synchronize) that works flawlessly at moving directories of files up to S3 and back down to your computer (including metadata, and some form of checking to make sure files actually need to be pushed over the pipe before using the bandwidth), and a GUI, Cockpit, that lets you upload and download files as well as manage ACLs. I've now run enough tests with it that if I run into any problems with s3sync, I feel totally confident I could move over to it without a problem.

on October 6, 2006 07:04 PM
# Jason said:

Oh, one more thing: I tried to get Backup Manager working on my Linux box, but couldn't get it to actually communicate with S3. Between it needing a bunch of Perl CPAN modules installed (without any documentation to that effect, meaning you have to watch for errors, decode which package is missing, go find it and hope it'll compile against your specific distro, etc.) and there not being any real way for me to debug what was going wrong with its attempt to talk to S3, it was an easy call for me to move on to other tools.

on October 6, 2006 07:08 PM
# Thom Davis said:

What Exactly Is Jeremy Zawodny's Job?
I have no idea. Neither does Jeremy.

on October 6, 2006 08:17 PM
# E. David Zotter said:

you forgot this one.....

E. David Zotter

on October 6, 2006 09:01 PM
# Jeremy Zawodny said:


Huh? What does that have to do with any of this?

Does everything that I write on my web site have to be related to my job?

What are you getting at, anyway? Come on. Out with it!

on October 6, 2006 09:18 PM
# Thom Davis said:

(re: linkblog)

Well, what exactly is your official job title Jeremy? What exactly do you do everyday? I saw in a previous entry that you pretty much just answer emails. Come on. Out with it!

on October 6, 2006 11:08 PM
# Paul Stamatiou said:

JungleDisk is exactly what I was looking for. I went ahead and just signed up for S3. Up until now S3 seemed very developer oriented and wouldn't hold up well for the person looking to backup various files/share them amongst computers. Now all I have to work on is getting a faster upload, as Comcast has something against us with a paltry 100kbit/s upload.

on October 7, 2006 12:06 AM
# Alex Mace said:

Interesting little tidbit on that Jeep on Water video (from the BBC's Top Gear) - its presenter, Richard Hammond, recently crashed a jet car at 300mph and is currently in hospital.
He suffered a serious brain injury and is probably only still with us due to the quick action of the Yorkshire Air Ambulance. Air Ambulances over here in the UK are completely supported by charity donations, and a page was set up after the accident to take donations for the Yorkshire Air Ambulance.

General opinion now is that Richard is going to be OK, but I think you can put that solely down to the Air Ambulance saving his life.

on October 7, 2006 01:17 AM
# aspirin said:

good job.

on October 7, 2006 06:00 AM
# gymbrall said:

For those of you who are interested in trying out S3 without it costing you anything, or if you just want to test your homebrewed S3 backup suite, you might want to check out Park Place. It's an S3 clone that you can run on your own computer.

on October 7, 2006 06:51 AM
# gymbrall said:

And for whatever reason, the URL didn't come through (maybe I shouldn't have tried putting it inside an <a> tag...).
Anyway, here's where you can find out more about Park Place.
Anyway, here's where you can find out more about Park Place.

on October 7, 2006 06:53 AM
# Pete Prodoehl said:

It may not fit your requirements, but I'm evaluating Interarchy (Mac OS X) for use with S3. It treats S3 like any other remote server, and offers some nice automation features.

on October 7, 2006 07:21 AM
# John Eberly said:

I have been thinking about offsite backups for awhile now. I posted a little bit about my thought process on my blog.

For now, I have decided to go with s3sync. I just finished a test yesterday backing up and restoring 2GB of photos. It worked great, and I am now planning to document the process in detail when I implement it on my Linux server to back up 15GB-20GB of my important documents.

I actually upgraded my Comcast internet speed from 6 Mbps/384 Kbps up to 8 Mbps down/768 Kbps up, to double the upload speed ($10 more/month). In my test it took 3hr/GB upload, and about 17 minutes/GB download. Certainly it will take a long time for the initial upload, but it will only be the changed files after that.

I was quickly reminded I haven't enabled QoS on my router yet. My girlfriend called and asked why the "server was so slow" while she was trying to work with files on my ssh server. Of course I could barely hear her with my VoIP phone.

Comcast note: I overheard my barber talking about calling up Comcast after seeing a TV special on pricing, and they updated his existing account with the promotional pricing. I thought that can't be true, so I tried it last month with my internet connection and they gave me the latest special: $33/month.

on October 7, 2006 07:47 AM
# John Eberly said:

Here is a link to my thought process about different offsite backup solutions.
not trying to spam, just thought it might help people.

By the way, is S3 backup really "open source"? I tried the Beta version which has an expiration date.

on October 7, 2006 07:58 AM
# bfly said:

I am currently using a combination of BackupNinja (using the Duplicity module), JungleDisk and DAVFS2. A quite extensive list of programs, but it works perfectly.

BackupNinja makes dumps of the databases, runs the backup using Duplicity, and handles the scheduling of the backups.

JungleDisk provides webdav access to the S3 space, and DAVFS mounts the webdav as a normal directory to which BackupNinja writes the backups.

Works like a charm, much more stable than using Duplicity's S3 access features. Moreover JungleDisk supports caching, so no problem if the internet connection is lost during the backup!

on October 7, 2006 08:18 AM
# Morten said:

Thom, EC2+S3 would be a killer app. If only an S3 account could be mounted as a device. I know that there has been some work in doing that (see the S3 forums) but there's nothing stable yet.

on October 7, 2006 08:26 AM
# Jtwilkins said:

Thanks for the info, great post.

on October 7, 2006 08:42 AM
# Dimitri said:

I use JungleDisk with SuperSync to access and back up multiple music collections. Running SuperSync on an EC2 system would be sweet, but just with the JungleDisk cache, it's pretty sweet.

on October 7, 2006 09:00 AM
# vishi said:
on October 7, 2006 09:37 AM
# supersaurus said:

I couldn't tell if you are a developer or not, but if you are, there is one more route: roll your own using the free library code amazon supplies. I did this as a "get to know ruby" project, and it isn't very difficult with the supplied code as a base.

something to keep in mind: s3 *isn't* a file system, it is a key/blob system where the keys can be almost anything and the blob can be anything as long as the size is less than 5G. want it encrypted? do it before you send it. compression? same. metadata? up to 4k anything you want using 'x-amz-meta-???' in the headers when you put the data. puts are theoretically all or nothing (there is no way to seek in a blob, so there is no way to provide true rsync-like functionality...if the blob changes you have to reput the whole thing). you can look at the md5 sum after a put so you can compare without resending all. there is no key rename and there is nothing equivalent to a hardlink (multiple keys pointing to one blob), so rolling .0, .1, etc style snapshot backups aren't convenient (because re-putting the data costs you both time and money). certain operations may require multiple gets, e.g. listing the contents of a bucket will at most give back 1000 keys per get, so you may need to iterate. there is no equivalent to 'rm -fr *' either, you must delete the contents of a bucket one at a time. the upload speed is limited by your ISP, so don't be shocked if it is 35kbytes/sec (100 hours for 12G).

last, the simple way to put doesn't stream the contents, meaning it's ok for 3M jpegs, but if you want to put a 3G binary you'd better have a *lot* of memory. I think I recall from the forums that the s3sync author has figured out how to make streaming work, but I haven't looked at the code. you can ask him in the forum. when you are evaling tools you'll definitely want to check for streaming.

given what you want I'd look at s3sync first. I wouldn't have written something myself if it had been available at the time. I doubt you'd be happy with the ones that make s3 look like a disk in explorer because you can't automate using them. ruby/perl/python etc solutions are nice because it is at least possible to make them portable.
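For illustration, the 1000-keys-per-GET iteration mentioned above might be sketched like this in Python (fetch_page is a hypothetical stand-in for an actual S3 list request that accepts a marker):

```python
def list_all_keys(fetch_page, page_size=1000):
    """Collect every key in a bucket when each listing request
    returns at most page_size keys, sorted; the last key of each
    page becomes the marker for the next request."""
    keys = []
    marker = None
    while True:
        page = fetch_page(marker, page_size)
        keys.extend(page)
        if len(page) < page_size:  # short page means nothing is left
            return keys
        marker = page[-1]
```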

on October 7, 2006 01:57 PM
# Sergey said:

gymbrall said:
"For those of you who are interested in trying out S3 without it costing you anything..."

Come on! Only if your own time is free. Installing an S3 sim is work. Whatever your hourly rate is, it's not pennies a month like S3.

on October 7, 2006 03:10 PM
# Sergey said:

S3 Backup got upgraded yet again:

on October 7, 2006 04:02 PM
# xxdesmus said:

I just signed up for S3 yesterday after doing some extensive research while I've been looking for an offsite backup solution. I looked at Carbonite and I do like it quite a lot, but S3 has much better prices.

Thank you for this post, this is literally EXACTLY what I was looking for.

on October 7, 2006 08:05 PM
# grumpY! said:

jeremy, you say you store your data on three different hosting sites (ohio, san jose, sf). isn't this your "backup"? ohio goes kaput, copy from sf or sj, etc.

on October 7, 2006 09:58 PM
# Jeremy Zawodny said:

Most of these boxes are 3-4 years old and don't have the space I'd need to do that. The newest of them has ~120GB of RAID, so it can back up the others. But they cannot back it up.

on October 7, 2006 10:55 PM
# Linh said:

I must say, S3 is very intriguing. I'm still not sure on it, mainly because Carbonite and Mozy exist. I'm really hoping a Mac client comes out or else I'm sorta stuck w/ S3.

Pricing-wise, Carbonite wins because I'd back all my media up too, which gets me into the 40-60GB range. Mozy would be next because you can do block-level changes and have a private key. Mozy has a better interface too, IMO. If I were keeping my Windows box, I'd go with them.

JungleDisk is interesting... I was almost sold until I read it stores your key in plain text on your computer. The jets3t client looks interesting as well, and S3 Backup when it goes multiplatform.

I'm also drawn to Carbonite's "unlimited storage", but I know I run risks if I dump 200GB of media on there. I also cannot understand how they plan on staying in business.

on October 8, 2006 05:13 AM
# Denis said:

I started using Duplicity recently - it lets you do incremental (using librsync) encrypted (using gpg) backups (using tar) that can be stored remotely (using ssh/scp, ftp, or supposedly S3 as well).

It is part of Fedora/Debian which I suppose gives it some non-abandonware-ness :)

on October 8, 2006 08:03 AM
# Justin Mason said:

someone in the previous post's comments mentioned Dreamhost offering 200GB / 2TB-per-month for $90 for 2 years... so that's what I've gone with. ;) An excellent deal.

on October 8, 2006 02:59 PM
# Dave said:

Works like a dream. I've linked to you, written it up, and am spreading the word. Thanks.

on October 8, 2006 08:10 PM
# nainish said:

This is a great blog, and admittedly it's aimed at those whose backup needs are stricter than mine.

All I want is a backup system for my key files and photos. Music not so important. I have a Mac and PC.

So, I'm planning to use iDisk of .mac to backup my photos on my Mac and the XDrive on AOL for my other things on the PC.

on October 9, 2006 03:38 AM
# xxdesmus said:


I am in the same exact boat. I like the simplicity of S3, I really do, but I like the automation of Carbonite. Perhaps once the application front-ends for S3 have matured a bit I may make the full switch over, but for right now I think Carbonite is more what I am looking for.

As for the space consideration, I am looking to only backup around 10GB or less...unless I decide to backup my MP3's but then we're talking about needing 80GB of storage which I do not really trust any other company to hold on to.

I still wonder how Carbonite plans to stay in business. If I had to guess I'd say they are using S3 as their backend. There is no real other way they could offer the prices they do. I suppose I will stick with Carbonite for now, if (god forbid) they fail miserably then I will make the full switch over to S3, but I will also keep an eye on the development of some of the S3 backup programs.

This is a great blog, and it has a great group of commentors. Keep up the good work guys!

on October 9, 2006 07:08 AM
# Wayne said:

On the question of how Carbonite will stay in business, just think about it a bit. They surely use S3 or can beat S3's prices internally.

They allow you to upload the first 50GB unthrottled, but after that you are limited to 0.5GB/day ($7.50/month at S3's prices). Plus they reserve the right to kick "abusers" of the network. (They define abuse.) So the maximum loss for power users is pretty limited.

Then consider that most of their users will have only a small amount of data that rarely changes, and it does seem like on average they will make a profit. They just need to be so simple that theirs is the solution you recommend to your mother. (A goal they meet pretty well.)

on October 9, 2006 07:56 AM
# xxdesmus said:

Just FYI Guys,

S3 Backup (beta 6) was released today.

on October 9, 2006 10:29 AM
# Joseph Bruno said:

Whatever backup solution you use, it's vital that you try the following test:

1. Install it.
2. Do a backup.
3. Uninstall it.
4. Restore from the backup without it.

If you can't do this, then you will get instant bitrot when the backup program's manufacturer gets bored or goes bust. Or upgrades the format: all my Backup MyPC backups became suddenly unreadable when a new version was released without backwards compatibility!

on October 9, 2006 02:18 PM
# supersaurus said:

assuming s3 itself survives, the keys to passing JB's test above are the format of the keys (I used the relative path as the key, e.g. "photos/2006/09.06.06/xyz.jpg", which is quite easy to figure out) and whether you can figure out the metadata (e.g. I keep the timestamp of the original file in the metadata as seconds since the epoch). if the key scheme isn't decipherable then you are dead meat without the tool. if you can't figure out the metadata and you need things like timestamp, links, etc, then you are not going to get what you want without the tool. if the tool is a script then you'll have the code, meaning in theory you can figure out what scheme the tool uses. even if you *can* dope it out, don't forget applying a complicated scheme manually to possibly tens of thousands of files isn't going to happen.
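A key scheme along those lines, sketched in Python (the helper and the exact metadata header name are illustrative, not any particular tool's):

```python
import os

def key_and_metadata(base_dir, path):
    """Use the path relative to the backup root as the S3 key, and
    keep the file's mtime (seconds since the epoch) in the metadata
    so a restore can put the timestamp back."""
    key = os.path.relpath(path, base_dir).replace(os.sep, "/")
    mtime = int(os.path.getmtime(path))
    return key, {"x-amz-meta-mtime": str(mtime)}
```

The point is that both halves are trivially decipherable by hand: the key is just the path, and the metadata is one documented number.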

on October 9, 2006 03:28 PM
# Sergey said:

I agree with Joseph Bruno, this is important.

I know JungleDisk uses some really weird naming scheme; I didn't figure it out in the limited time I spent trying.

S3 Backup uses straight keys (the way supersaurus suggested). If the file is compressed it has a meta field, returned as an HTTP header X-amz-meta-compression: zlib (or bz2, depending on the algo used). If encrypted it has a similar header, X-amz-meta-cipher: AES or Blowfish. Figuring out the key to decrypt is not that simple, however. For maximum security the password used for encryption is 'salted' with a file name and a random string, different every time. The salt is stored as X-amz-meta-key-digest-salt, and the SHA hash of the salted string is stored as X-amz-meta-key-digest (we need this to verify the password before downloading the file and trying to decrypt it). Also there's X-amz-meta-decrypted-size, which tells where to truncate the decrypted data (block ciphers need data to be aligned to a block boundary; it's funny how the anonymous guy who develops JungleDisk took the easy way out -- he used the not-so-secure RC4 cipher just because it doesn't require aligning).

I'll make sure to release a console tool compatible with S3 Backup, so you can be sure that your data are your data. BTW, you can make it happen sooner by voting -- see the last post in the blog (too many links to S3 Backup already, so I won't spam with more).
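For the curious, the key-digest check described above might work roughly like this (a guess at the scheme: the exact way the password, file name, and salt are combined is not specified; the header names are the ones Sergey lists):

```python
import hashlib

def make_key_digest(password, filename, salt):
    """SHA-1 of the salted password string -- what would be stored as
    x-amz-meta-key-digest next to x-amz-meta-key-digest-salt."""
    salted = password + filename + salt
    return hashlib.sha1(salted.encode("utf-8")).hexdigest()

def password_matches(password, filename, salt, stored_digest):
    """Verify the password against the stored digest before bothering
    to download and decrypt the object."""
    return make_key_digest(password, filename, salt) == stored_digest
```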

on October 9, 2006 10:36 PM
# Sergey said:

Oh, btw, I already have introduced incompatible changes to S3 Backup (in the hash database format), but I took precautions so both old and new formats can be read; the app just uses the new, better format when writing. That's it -- fully transparent but still evolving.

on October 9, 2006 10:41 PM
# zhesto said:

Seems like lists of backup services are pretty popular these days. Another similar post:

And here is a list of more than 150 similar services:

Any recommendations? Strongspace seems pretty good to me (but a little expensive for my taste):

on October 9, 2006 11:51 PM
# Sergey said:

zhesto, general online file storage is VERY different from backup. Not to say that list is not useful, but those are not "similar services" at all.

on October 10, 2006 01:41 AM
# zhesto said:

Sergey, as you can see, somebody already complained in the comments about mixing different kinds of services. Still, there are several backup solutions in the list (S3-based like JungleDisk, or not, like Carbonite).

on October 10, 2006 02:22 AM
# Eric D. Burdo said:

Jeremy, I just found this link from Scoble.

Might be another one to investigate? They serve up content for Digg, TWiT and others.

on October 10, 2006 06:23 AM
# John Eberly said:

Jeremy, thanks again for starting this topic. It finally prodded me to implement the offsite backups I had been wanting to for awhile.

I think the idea of using EC2 with S3 to perform backups is very interesting, but I decided to go with a simple solution for now.

I posted how I automated my backups to S3 using s3sync here.

I have successfully uploaded and restored over 30000 objects (>4GB) with this method. I plan on backing up over 10GB starting this weekend. Once the initial upload is done, I estimate my monthly bill will be $2 or so.

on October 10, 2006 10:08 AM
# xxdesmus said:

Just another FYI guys,

For anyone who uses Firefox this is a nice extension,

Basically looks/works just like a FTP client :)

on October 10, 2006 08:06 PM
# Omar said:

Jeremy -- first, thanks for an excellent post! It's been keeping me busy. :) I had a quick comment: in the blog post your item for S3 Backup says that it's "an Open Source tool for backing up to S3". I didn't see any info on the S3 Backup site that it's Open Source.

Sergey, can you confirm or deny either way if the project is Open Source?

Thanks very much!


on October 13, 2006 08:43 AM
# Sergey said:

Omar, the app is "partially" open source -- not free in the FSF sense, for sure. I've been quite busy with the app itself, but I do plan to release a lot of the underlying code under the GPL or maybe the MIT license. Another possibility is that I will release the full source code with a restrictive license -- not for use in other projects, only for inspection. I wouldn't mind making the application fully open source if I were compensated for all the time I spent working on it, but I don't think that will happen any way other than my monetizing it myself. BTW, I wonder what Jungle Dave is going to do; JungleDisk has a massive install base already, and as the app is closed source, I guess he has plans similar to mine.

In other news Mac OS X and Linux ports of S3 Backup are coming soon, maybe the next beta will span all platforms.

on October 14, 2006 10:28 AM
# Omar said:

Thanks for the info, Sergey! I'm looking forward to using the next beta version. :-)

on October 14, 2006 11:25 AM
# Sergey said:

Quick update on state of S3 Backup on Mac:

on October 15, 2006 09:12 AM
# Bill said:

A new entry is Openfount's InfoMirror. It has a sophisticated Ajax GUI as well as a part that runs in the background.

on October 15, 2006 11:58 PM
# John Smythe said:

I use SyncBackSE (love it for local backups), but even its backup-to-FTP did not cover my needs (unattended, encrypted, and secure).

Signed up for S3, tried most of the backup clients listed, stuck with JungleDisk, but was very disappointed with the upload speed. I know, it mostly depends on your up-stream, but it was painfully slow.

Came upon Mozy... and never looked back. Good interface, smart features, set it and forget it. No-nonsense people behind the company, which helps a lot. Check it out if you're on a PC.

(Shameless link/plug: i.e. I get free storage if you click this link as opposed to retyping it.)

on October 19, 2006 10:08 AM
# Daryn said:

Not really what you've been looking for, but saw this on CrunchGear today, and it's kinda cool..

on October 23, 2006 10:42 PM
# Giles said:


Thank you - and thanks to your commenters - for this list of tools. You pointed me in the direction of the s3sync Ruby script, which I now have running on my NSLU2 (a $100 NAS device from Linksys) - first successful sync today! - and this post helped a lot.

(Here’s the first of the blog posts I made as I went along, in case you or any of your readers are interested:



on November 14, 2006 04:36 AM
# said:

Just another solution: S3Drive

Basically it looks/works just like a Windows network drive and is therefore accessible in all Windows and DOS applications and all programming languages that support file I/O.

on November 15, 2006 12:27 PM
# Andy said:

I have been using iFolder.

It is designed more for 'file sharing' than pure backup, but I use it for both.

on November 28, 2006 11:03 AM
# Ben Strackany said:

What about s3backup ? Ruby-based executables ...

on November 30, 2006 04:33 PM
# Joe Drumgoole said:


We are about a week away from launching a private beta of PutPlace 1.0. This will initially offer automated backup to S3 and an explorer-style view of your content once it's up there, with the ability to retrieve at will. Registration for the beta is available at

Windows only at the moment with Mac in the works.

on December 14, 2006 12:01 PM
# Shane said:

Hi Jeremy. Here is another tool to add to your list.
It's a very simple bridge between TAR and S3.


on January 11, 2007 07:56 AM
# Andy said:

I tried jungle disk but it wouldn't do a scheduled backup - would do an on-demand one, but not scheduled - therefore pretty useless for me :-(

on January 28, 2007 07:29 PM
# David Pascoe said:

I have been using Super Flexible File Synchronizer for quite a while now to sync documents and photos between my laptop, desktop, and do ad-hoc backups to a second desktop drive and a USB HD.

It really is super flexible (well, maybe too flexible, given the GUI layout in some parts).

The latest version now supports S3 which I'm trying out. The program will run as a windows service, which is the sort of set-and-forget operation that I'd really like. So far so good.

Most tools I've looked at are simple and feel like a beta. SFFS goes beyond just a simple interface. There's a 30-day trial; Windows only.

One little trick I've learnt is that bucket names must be unique across all of S3 - so choose something obscure!

on February 18, 2007 04:54 AM
# Jan said:

You can use a website to manage your Amazon S3 account:

on February 20, 2007 04:46 AM
# Alex said:

Hello, a few months ago i started a similar list (inspired by this one). I'm trying to keep it up to date. If interested go to

on February 27, 2007 01:30 AM
# said:

You can also check out S3-FTP. It implements an FTP server that stores to S3.

on March 13, 2007 10:42 AM
# Jeff B said:

I run s3fuse on a virtual server. However, I use dump/restore and back up on an inode basis. I do a level 0 every 3-4 weeks and do the tower of hanoi method of diffs. This keeps transfers very low, as well as disk usage. rsync is great, but it is just a copy of the data as opposed to incremental backups. Sometimes people (myself included) want to recover something that was intentionally deleted a day or week ago.

I tried duplicity but went back to dump/restore, since almost everything duplicity tried to do was just a complicated way to do something similar to what dump/restore already does, and it didn't even bother doing it on an inode level, only on a file level. duplicity, at least the last time I tried, did not allow scripted gpg encryption. It required the gpg encryption to be "signed", and who in their right mind would leave their secret key around with no password just so duplicity could use it to sign+encrypt? Encrypt alone would have been good enough, but it was not possible.

gpg was no problem with dump/restore because I just gpg -e (ed) the stdout of dump before writing it to its destination. Scriptable. Incremental.
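For reference, the tower of hanoi dump schedule can be generated mechanically; this sketch uses one commonly cited level sequence (a full level 0 first, then a repeating pattern of incrementals):

```python
def dump_levels():
    """Yield dump levels: a full level-0 dump, then the classic
    tower-of-hanoi incremental pattern, repeating indefinitely.
    A level-N dump saves everything changed since the most recent
    dump at a lower level."""
    yield 0
    pattern = [3, 2, 5, 4, 7, 6, 9, 8]
    while True:
        for level in pattern:
            yield level
```

Alternating high and low levels like this keeps each incremental small while bounding how many dumps a full restore has to walk through.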

on March 21, 2007 11:18 PM
# said:

Download using S3Fox doesn't seem to work on OS X. Looks like S3Fox is using backslashes where slashes are needed in destination paths for objects -- resulting in empty folders on the OS X side.

on March 27, 2007 03:31 PM
# Offshore-Outsourcing said:

Thanks for the info, Sergey! I was looking forward to this.

on April 18, 2007 01:28 AM
# said:

Thanks for the info...

on April 18, 2007 01:30 AM
# Greg said:

You might want to check out Quillen. It is a new project that uses a novel approach to minimize data transfer and storage on S3. It also strives to be simple with just a command line interface.

on June 2, 2007 01:28 PM
# Iliana said:

Backup Review:

I found the best solution to back up your computer online. Mozy offers great customer service and wonderful services. You can back up your data for FREE for up to 2GB, or you can purchase an UNLIMITED account for only $4.95 a month. Nate was correct; you should all try:

I personally recommend them AAA+++


on July 10, 2007 12:58 PM
# mrshiney said:

I used mozy for a while and liked it until I switched to ubuntu and realized they don't currently support Linux.

I'm going to investigate some S3 solutions. I'm guessing that if I set up encfs I can back up the encrypted folder(s) for an additional level of security.

Btw, if you do utilize Mozy, they do offer private key encryption, but files are encrypted on a per-file basis. You have to manually decrypt each file after a restore.

on July 15, 2007 06:32 PM
# Saurabh said:

We released a user interface today called Bucket Explorer. Technically, it's probably the best solution out there for transferring files to S3 when you want to be in control, as it has an easy-to-use UI built on top of the robust JetS3t API.

You can use Bucket Explorer as a simple FTP tool or a backup tool for Amazon S3, or you can use it to:

1. Browse buckets and the files stored at Amazon S3.
2. Upload and download files to and from Amazon S3 buckets.
3. Set Server Access Logging (Bucket Logging) for audits.
4. Synchronize data between multiple computers.
5. Upload files in HTML format for web hosting (even when the extension is not .html or .htm).
6. Create public URLs and signed URLs to share files.
7. Access shared buckets and files from someone else's account.
8. Set access control on buckets and files to authorize other Amazon users, or non-authenticated users, with different access rights.

Bucket Explorer works on every OS where Java is supported. It uses Amazon's ETag and its own SHA-1 hash combination to make sure that a file is never transferred to Amazon S3 again if it has not changed, saving bandwidth costs and time.

on August 14, 2007 07:27 PM
# said:

Just an FYI: I've posted "s3fs" (no relation to the original project named "s3fs"), a FUSE-based filesystem backed by Amazon S3, on Google Code.

on September 23, 2007 09:41 AM
# said:

We use S3 at our startup for our embed tool. Thanks for pointing me to all these useful tools.

on October 2, 2007 03:02 PM
# Alex Bell said:

I had no idea how important it was to post content as HTML. I liked your blog; it was fun to read. I look for blogs all the time that can influence me to come up with unique ideas for my blog.


on October 10, 2007 01:25 AM
# Ade Atobatele said:

I've just come across S3 and read most of the blog, but didn't get the answer that I was looking for :)

Wonder if you know?

Are there any products that will run a process on a Linux box to sync the data on the box with Amazon S3, based on a cron job?

on October 20, 2007 11:16 AM
# neurophyre said:

WARNING: The JetS3t toolkit currently defaults to very weak (single 56-bit DES) encryption. Anybody considering using it should look into changing the algorithm it uses to PBEWithMD5AndTripleDES, at least until they hopefully implement my bug report here:

on October 26, 2007 06:12 PM
# Mark Fitzpatrick said:

Looked quickly at Backup Manager and saw that there is no S3 restore capability. Meaning "they will help you check your bits in, but not out".

on November 1, 2007 01:46 PM
# Carsten Cumbrowski said:

Here are a few more tools for S3.

S3Drive plugs into the Windows file system and maps an S3 folder as a local drive. It can be accessed like a network drive (\\servername\folder, where "servername" is the name of the S3 account you specify in the S3Drive config). S3Drive is free. It creates a virtual file system structure in XML format. That means the folder and file structure you see in Windows is not identical to the structure and names used for the resources in your S3 storage area. You can access the XML files via other tools, such as S3Fox, etc.

S3 Plugin for Wordpress

Flickr to S3 Backup Tool


on November 17, 2007 02:25 PM
# slb said:

check out 'Transmit' at so far so good, on the mac platform.

on January 2, 2008 12:34 PM
# GP said:

It might be interesting for you to consider this (free) next-generation backup product (Secobackup, S3SQL). It's basically an "install and forget" type of product.

Secobackup is a personal backup product. It provides continuous data protection: as you make changes and save them, they are automatically backed up to Amazon S3. One nice thing about this is that it uses deduplication internally. So if 3 of your PCs have the same JPEG picture stored on their disks, it will detect that they are duplicates and store it only once.

S3SQL is an automated MySQL backup product utilizing Amazon S3 similarly. You set up scheduled backups of one or more of your MySQL databases, point the backup at your Amazon account, and it will automatically do regular backups, compress them, encrypt them, do differential and deduplication-type optimizations, and store them on Amazon's S3 service. It has an AJAX-based browser UI. Install it on a Windows XP box and you can set up backups of MySQL databases from any number of hosts: Linux, Solaris, Windows, etc.

It's free. Try it; our users are typically up and running in just a few minutes with the backup already going to Amazon S3. It's simple to set up, and gets you immediate peace of mind from disk crashes, a leaky roof, or whatever...


on January 4, 2008 09:00 PM
# Marc Liron said:


Well, our own little Windows and Mac OS X Amazon S3 software is 3 months old today!


Marc Liron
Microsoft MVP

on January 5, 2008 12:38 PM
# The Pageman said:


thanks for the post.
I have a requirement to back up 1 terabyte of data FOREVER; basically, it's an archive of data. Once it's uploaded, it stays the same size indefinitely. Looking at your solutions, I think it would be better for me just to get 2 DreamHost accounts (2 500GB accounts), or maybe Carbonite or S3. I could be wrong. Any suggestions?

on January 17, 2008 09:00 AM
# rmccarley said:

For business use, or if you have multiple computers, servers, or alternative operating systems, you should look at Also good if you need to back up several GBs instead of, like... 5.

on January 17, 2008 02:42 PM
# Alex said:

Cheers, very nice list. Needed some ideas for a couple of sites and found a gem to base on thanks to this list.

on February 5, 2008 05:42 AM
# Shawn Fumo said:

Pageman, please keep in mind that using a DreamHost account as backup is against their TOS. There was some controversy about this a while back when they explicitly said you shouldn't do it.

They probably won't do anything if you just host a couple of files, but they said that if the majority of an account was non-web accessible, that that would be considered abusing the service. I think two 500gb accounts just for storage would get you into trouble for sure!

on February 14, 2008 08:11 PM
# Milan Magudia said:

I'm writing an implementation of S3 in PHP (for fun!). It's not exactly a backup, but if any of the tools above can back up to another S3 URL, it might be useful. I'm writing it because I don't believe in a single point of failure, and it might be useful as a mock / failover source. It's not close to being finished yet, but given a few more weeks it'll be much closer to being usable.

on February 22, 2008 12:16 PM
# OS X Backup said:

WOW, thanks for this list, I was looking for Mac OS X S3 backup solution. Thank you!

on February 28, 2008 11:31 AM
# Timothy Lee Russell said:

I started some C# source awhile ago for accessing S3 from PowerShell.

S3Nas PowerShell Provider:


on February 29, 2008 01:43 PM
# Torley said:

I like S3Fox on Windows and Mac — it works reasonably well, but still, I wish there was an elegant S3 backup utility for Windows in the same vein as Transmit, which looks pretty spiffy! The Windows ones I've tried so far have disappointed me and been very clunky/awkward to use. :(

on March 23, 2008 08:27 AM
# Jason Kester said:

You might want to check out S3stat while you're at it. It's a service that provides web stats for your S3 buckets:

on May 29, 2008 02:57 AM
# Duivesteyn said:

I've also done something similar, which takes both file data and SQL data. It suits cPanel / WHM sites better, too.

on July 6, 2008 01:37 AM
# Tris said:

Hey All,

I have recently developed "River Drive" (yes, I know!), which is a Linux (via Mono) and Windows compatible GUI and command-line interface to S3. It supports drag and drop, and can be used to upload multiple files at the same time. It has a few things to be ironed out, and I plan to add a load more features soon, such as a local FTP / DAV interface.

You can grab a download at

I would appreciate any feedback anyone has.
Leave a comment on my blog at


on July 8, 2008 08:34 AM
# Morag Lethelluian said:

I've been using - currently they use Amazon S3 to host their service, but are opening up the platform to any S3 user in September of '08 - you get to use your files from iGoogle, iPhone, and all sorts of other channels.

on July 14, 2008 11:59 AM
# DJ said:

I've been using a tool called s3-bash, I'm not sure if it's endorsed by Amazon but it's been working pretty well for my needs.

A quick guide if anybody is interested:

on August 3, 2008 08:07 PM
# said:

Take a look at Manent: (or go directly to the project page at

on August 4, 2008 04:15 AM
# Michal said:

One other tool is s3cmd from SF's s3tools project:

It is becoming pretty popular as it quickly approaches version 1.0. It supports rsync-like backup and restore, works great with non-ASCII (e.g. Unicode) filenames, and has plenty of other features. Worth giving it a try.
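For example, a basic backup and restore with s3cmd looks something like this (the bucket name and paths are placeholders):

```shell
# One-time setup: store your AWS access keys in ~/.s3cfg.
s3cmd --configure

# rsync-style sync of a local tree up to a bucket...
s3cmd sync /home/user/documents/ s3://my-backup-bucket/documents/

# ...and back down again for a restore.
s3cmd sync s3://my-backup-bucket/documents/ /home/user/documents/
```

Dropped into a crontab, the sync line also answers the earlier question about cron-driven S3 backups on Linux.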

on September 7, 2008 10:23 PM
# Ken Worton said:

I've been trying the beta of a new service called SMEStorage, available at These guys have a beta called OpenS3 which syncs your S3 information to their platform so that you can access the data from your iPhone, iGoogle, Facebook, their rich web client, etc.

They also enable your data to be integrated with other services such as Zoho, Picnik, Flickr, etc. It's pretty slick.

on September 10, 2008 01:34 AM
# Joel Hewitt said:

Try looking at:
it is in beta, but I have been using it with good success for a while now.

on September 12, 2008 12:13 PM
# WebDrive said:

WebDrive maps a drive letter to S3 and it has a simple backup/sync utility built in. It can also map additional drives to servers using other protocols, like SFTP, WebDAV, etc.

on September 17, 2008 08:08 AM
# S3 Browser said:

Here is the freeware GUI client for Amazon S3:

on October 1, 2008 06:48 AM
# Carl Branning said:

I just checked out SMEStorage, which some of the other commenters here had blogged about. It allows me to use my S3 account with their service through something they call OpenS3. Now there is no shock there, because I have been using Jungle Disk for a while now to dump files to S3. However, I can now use my S3 account through their platform. That means I can now access my S3 files via my iPhone, amongst many other channels. I also get a file explorer for Windows that lets me upload/download and share files easily, as well as plug-ins for Open Office and Microsoft Word. In short, my Amazon S3 experience just became much nicer!

SMEStorage recently started supporting GMail as a cloud storage provider, meaning you can store files on GMail and access them on their platform exactly like I do my S3 files. And because I can export my S3 files to my GMail account via SMEStorage, I have a backup of all my files that is free!

The only issue I have found is that I'm also using Jungle Disk, and it puts some weird information in the file names, so it is hard to figure out what a file is outside of Jungle Disk. I pinged the guys from SMEStorage and they said they would have a look at how they could parse this to make an import easier.

on October 9, 2008 10:00 AM
# Hagen said:

Hi Jeremy, hi all readers

I am the head developer of a new tool named "s3ganize", which comes with an explorer-like user interface for Amazon S3. The URL of the tool is:

It's currently only available for MS Windows systems. But, as a Linux user myself, in the near future I will release a multi-platform application to give Linux users graphical access to S3 as well. It might also be possible for me to support Mac OS X, but I can't promise that right now.

I would be very glad if you would add this program to your list of S3 tools.

Thank you very much. If you do list it, I'll give you updates as needed.

Kind regards,

on October 19, 2008 10:34 AM
# said:

I want to store my company's data on Amazon's S3 for multiple users to access, edit, etc., and I'm looking for an application that would allow me to create a backup of the data stored on Amazon's S3 to an offsite location of my choice, rather than use S3 as the backup storage.

Any ideas?

on November 4, 2008 07:18 AM
# computer backup said:

Wait a sec... you say you want to store your company's info on S3, and then you say you don't want to use S3 as the backup storage; which is it? o_o

on November 5, 2008 01:03 PM
# Mike said:

Because we are located in a major hurricane city on the Gulf Coast, the idea is to no longer use our server as the primary location for our data, but to use S3 as the primary location, which multiple users can access and edit at any time. What my bosses would like is something that will back up the data that is on S3, maybe to a swappable drive that a member of management could take home as an extra sense of security. So essentially the original data is safely stored and used on S3, and we make backups to an external location. It is very sensitive data that basically runs the business, and they want redundancy basically to the point of overkill.

I hope I explained this a little better. Let me know your thoughts.


on November 6, 2008 05:47 AM
# Andy said:

Hi everyone,
Want to try another S3 browser? Check out It is completely FREE! We have released it as a side project from our main S3-based project and we want to share it with the rest of the world! Enjoy!

on November 25, 2008 01:54 AM
# Kevin Bell said:

I found out about here from Morag - nice, they accepted me into their OpenS3 beta and I've been using their web file explorer and their Windows back-up tools - nice service, and free! One interesting thing you can do is register for a GMail storage account (uses GMail as storage) and then import your Amazon S3 files into GMail as a backup - it obviously won't work for large files, but it's kind of reassuring to have a cloud backup of vital documents and stuff. Plus it's easy to use normal GMail tools then to access the files.

on December 6, 2008 10:51 AM
# Scott said:

I've been using Dropbox. It uses Amazon S3 as the storage back end. It is one of the best sync tools I've found. No hassle. It has worked flawlessly for me since I've been using it.

on December 21, 2008 03:01 PM
# sharp said:

I'm using the solution I describe here:

It's just a "reverse" encfs mount with s3cmd's sync command.
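Roughly, that setup looks like this; the mount points and bucket name are placeholders:

```shell
# --reverse makes encfs present an *encrypted* view of an existing
# plaintext directory, so only ciphertext ever leaves the machine.
encfs --reverse /home/user/data /mnt/encrypted-view

# Sync the ciphertext view up to S3 (bucket name is hypothetical).
s3cmd sync /mnt/encrypted-view/ s3://my-backup-bucket/data/
```

To restore, you sync the ciphertext back down and mount it with a normal (forward) encfs mount using the same key.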

on December 23, 2008 08:44 PM
# said:

Hi, allows you to rsync to your S3 bucket.

on January 2, 2009 11:11 PM
# Keith Mander said:

I've written my own solution using PHP and mysqldump to automatically place backups on S3. Read about it here:

on January 6, 2009 09:16 AM
# jfileupload said:

There's JS3Upload, which is dedicated to uploads (with MD5 and resume support):

And JS3Explorer, which allows you to manage the contents of your buckets (delete, upload, download, move, copy, update ACLs, and more):

on February 10, 2009 08:01 AM
# Ralph Corderoy said:

has a good pedigree and looks very promising.

on February 16, 2009 06:08 AM
# Simon_Siber Systems said:

You should also try GoodSync for backup and synchronization to Amazon s3 as well as other types of online storage:

on February 23, 2009 12:32 PM
# Cristiano said:

S3Toolbox is my personal tool I created to move files to Amazon S3. It is free and it works.

on May 4, 2009 04:03 PM
# Offshore Outsourcing said:

Thanks for sharing your nice work.....

on May 11, 2009 04:38 AM
# David Soergel said:

I've written yet another S3 backup script that you may want to check out: It's very easy to use and handles backup rotation, incremental backups, compression, encryption, and MySQL and Subversion dumps. Enjoy!

on June 7, 2009 07:57 PM
# zach said:

I like Gladinet ( Have been using it for over 3 months now. Works well with S3. The bonus point is that I can use Google Docs, Google Picasa, SkyDrive all at the same time side by side with S3 as virtual drive. Periodically I copy my Google Docs files into S3 for backup purposes.

on June 8, 2009 01:32 PM
# Nadya said:

Hi Jeremy, thanks for your post!
Please add to your list CloudBerry Online Backup, a new tool powered by Amazon S3 with a friendly user interface, strong data encryption, and scheduling capabilities. You can sign up for the beta at

on June 15, 2009 01:16 PM
# Alex said:

There's also CrossFTP Pro, an excellent FTP tool to manage your data on Amazon S3 and do backups/synchronizations.

on June 19, 2009 08:18 AM
# Joseph said:

Found something very simple that gets the job done, called As3FileSync, at

on July 2, 2009 03:59 PM
# Alex said:

Another very useful tool: S3fm, a free online Amazon S3 file manager. 100% Ajax, runs directly from Amazon S3, secure and convenient.

on July 7, 2009 10:36 AM
# Lance said:

Does anyone know if there is a way to map an S3 drive/bucket to a Windows machine as a network drive?

Also, it's necessary that any files copied to the S3 bucket be set as public-read so that they can be accessed from the web.

I have tried WebDrive, but all copies end up as private, and they don't know when they will have a version that can inherit ACLs from the parent bucket.


on July 25, 2009 01:39 PM
# Ville Walveranta said:

David Pascoe mentioned Super Flexible File Synchronizer in February of 2007. I'd like to bring it up again. "SFFS" is rapidly evolving and its S3 support has also received many revisions since its introduction.

The program is very flexible indeed; so much so that it might take a while for a new user to figure out all it has to offer. That is to say, it is extremely configurable.

Currently available for Windows and Mac. 30-day free trial available at

(I'm not associated with "SFFS" other than being a satisfied user for several years.)

on August 16, 2009 03:57 PM
# Albert said:

We are building Amazon S3 backup software for home users and enterprises.

It supports block copy, and binary patching is on the way.

on August 20, 2009 06:44 AM
# said:
on August 21, 2009 07:04 PM
# Backazon Online Backups said:

We offer a backup program that does online backups to the Amazon S3 service ( We offer a 30 day free trial, so please check us out.

(P.S. - I read your posting policy, and I think this comment is on topic and not spammy. My apologies if you disagree.)

on August 28, 2009 07:37 AM
# Kevin said:

There is nothing like "the best". You have to find the one that fits your needs.
Some things to think about when you pick one:
1. Is it easy to use?
2. Can you get your files back when you need them? Most online backup services do well backing up your files, but when you lose your files and need them back, you either can't do it or it takes a long time and a lot of effort to download. By then, it's too late to find out.
3. Real cost: don't believe in any "unlimited" myth; it is for PR only. First, you pay for each computer, and your usage is capped by your drive size. Second, your system files are not backed up. You have to have a rough idea of how much you will actually back up, then calculate on a GB-per-month cost basis. You will find out that "unlimited" is not really cheap in most cases.
4. How long do you think the company will last? Most startups can disappear, and your files will disappear too. The better way is to back up to some well-established online storage like Amazon S3.


on September 4, 2009 10:23 AM
# Michael said:

For those looking for an up-to-date, easy-to-use, rather cheap online backup solution, I've recently started using Backblaze: $5/month (per PC) for unlimited storage.

Much cheaper than S3.

on September 27, 2009 10:31 AM
# Archie Cobbs said:

s3backer is an open source user-mode block-oriented S3-backed filesystem for UNIX systems. Details here:

on December 10, 2009 09:30 AM
# Dave Wiebe said:

The upload method sometimes feels like the only reliable backup method. I personally use a hard drive, but then, what about a flood or fire? Am I protected?

on December 15, 2009 05:36 PM
# online backup said:

That's really a nice list and it's good to know about this.

on December 16, 2009 09:46 PM
# John said:

Here's another service offering offsite backup using open standards (rsync, SSH, encfs). No proprietary client required...

The basic plan with 100 GB of storage is only $2.99 a month.

on January 19, 2010 01:44 PM
# Matter Solutions said:

@archie re: s3backer - thanks

We've got multiple machines and technical expertise and that looks ideal :)

Thanks for the post Jeremy


on January 28, 2010 11:38 PM
# shahriar khan said:

Make and take care of backup using backup resources.....

on February 4, 2010 08:53 AM
# Michael Acain said:

Is online storage safe?

on February 5, 2010 02:43 PM
# Cherkasov said:

Really, these backup tools are great. I have used some of them, but at present I am using the Magic Backup online service, and really, it's great. Magic Backup is so easy to use, and so reliable. Unlike other backup products that perform "scheduled" backups in the middle of the night, Magic Backup is always on the lookout for new or changed files that need to be backed up. The minute you're done editing a document (well, 10 minutes after, actually), Magic Backup will silently prepare and transfer a secure copy of that file to your private location on our servers. You never have to worry about complicated configuration settings, marking files for backup, changing backup tapes, burning backup CDs, or any of that old-school backup mumbo-jumbo.

on February 10, 2010 04:14 AM
# DLT Tapes said:

Well, besides all of the comments given above, I'd like to say "good work, Jeremy..."

on February 15, 2010 12:18 AM
# Tony said:

I need feedback on my S3 client (Qt-based):


on February 20, 2010 03:10 AM
# BoulderDash said:

Hi Tony,

I discovered the linux version of jamdisk. I like it very much.


on March 11, 2010 08:21 AM
# daboochmeister said:

Amanda/Zmanda recently added S3 support. A bit more commitment to install than the others, but once in place, very effective.

on March 16, 2010 10:20 AM
# KKP said:

Found a (commercial) command-line tool for S3 with on-the-fly data encryption, multi-threading, diffs, symlinks, ...

We're using it to create nightly backups of several servers (user files, db dumps, config files, ...) to S3. Data is encrypted at S3 (so none of the Amazon guys can get to our users' info ;)). It even stores symlink info (symlink-destination metainfo) and restores symlinks as such (Java 1.7 is required for that, but that's not a problem).

on July 4, 2010 01:45 AM
# Amazon Blog said:

I have no issue with paying money for it, but I really care about how my backup data is processed in the data center.

on July 19, 2010 02:04 PM
# Jacky said:

The cloud has become a lot more powerful with Cloud Storage and Cloud IT Solution 5.0. It is far more than just storage or backup. Not only can you back up files to the cloud, you can also move your entire file server, FTP server, email server, web server, and backup system to the cloud. You can create sub-users and sub-groups; you can set different user roles; you can share different folders with different users with different permissions. For a small business, cloud-based storage, backup, sharing, and a Cloud IT Solution can save you a lot of cost, while offering better, more secure, and more reliable services that can be accessed from anywhere. is one of the first few companies offering such cloud-based services. It is now offering version 5.0 of Cloud Storage and Cloud IT Solution. For more info, please visit: DriveHQ's basic service is also free.

on August 8, 2010 12:58 PM
Disclaimer: The opinions expressed here are mine and mine alone. My current, past, or previous employers are not responsible for what I write here, the comments left by others, or the photos I may share. If you have questions, please contact me. Also, I am not a journalist or reporter. Don't "pitch" me.


Privacy: I do not share or publish the email addresses or IP addresses of anyone posting a comment here without consent. However, I do reserve the right to remove comments that are spammy, off-topic, or otherwise unsuitable based on my comment policy. In a few cases, I may leave spammy comments but remove any URLs they contain.