A few weeks ago when I wrote Cheap On-Line Storage Coming Soon, I suggested that the stars are beginning to align in a way that makes it possible to build companies that offer services like on-line backups.

I was surprised by many of the comments left in response to that post. Many people were of the opinion that it just didn't make sense because US broadband is asymmetrical (read: crippled): the limits imposed on upload bandwidth would make it a useless offering.

Andrew said:

I've been hoping for the same thing, but I don't think it can scale with current broadband once you start taking high-res photos and digital video. I've got about 500GB of data (and probably another 100GB on old Hi-8 tapes I haven't digitized yet), and no matter how cool the online data storage features are, it's almost impossible to upload that much data. If you had a T-1 at 1.5Mbps, you're looking at 31 days. Even a more modest 50GB of data would take 3 days. Am I missing something?

Greg said:

Yeah, Cheap On-Line Storage *isn't* Coming Soon. Where's your head at? Anything more than a couple gigs takes too long to transfer. And it'll be cheaper just to buy your own disks.

Bob said:

Bandwidth is the obvious issue for hundreds of GB, especially considering most broadband is still asymmetrical.

The common alternative a few suggested is roughly "buy a big external USB disk and just back up your own stuff."

But that's a solution that doesn't work for most people. You have to automate your backups or remember to run them, and most people don't: they're lazy, forgetful, or simply too busy. So few people have a remotely sane backup system in place that it's pretty depressing.


[Photo: Another View of the Ladder, originally uploaded by jzawodn]

That leads me to swimming pools.

The annual opening of our swimming pool was a family ritual for years. It still is, but I no longer live anywhere near my parents, so it's hard to participate. We had to get all the gear out of storage, remove the cover and clean it, clean the pool, get the filter running, and fill the pool.

The pool held roughly 10,000 gallons of water and took a long time to fill. While some people had been known to fill theirs using a really big hose from a fire hydrant, we did the simplest thing that could work: drop in the garden hose.

It took many hours to fill the swimming pool with the garden hose doing the work, but we really didn't worry about it that much. Sure, the water pressure in the rest of the house was a bit lower during that time, but it was still livable.

Eventually, the pool was filled and everyone was happy.

I think the same technique works for on-line backups of one's hard disk. Using Quality of Service (QoS) controls at the network level and some intelligent scheduling on the client side, a service could offer the peace of mind that comes from automated and professionally managed backups that don't bog down the computer or the network.
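
To make the garden-hose idea concrete, here's a minimal sketch of what the client side might look like: an upload loop that throttles itself so it never saturates the upstream link. The rate cap and the send_chunk hook are invented for illustration, not any real service's API.

    import time

    CHUNK_SIZE = 64 * 1024        # send 64KB at a time
    MAX_RATE = 56 * 1024 / 8      # cap upstream use near 56kbps (bytes/sec)

    def trickle_upload(path, send_chunk):
        """Stream a file upstream slowly enough to leave the
        connection usable for everything else."""
        with open(path, "rb") as f:
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                start = time.monotonic()
                send_chunk(chunk)  # e.g. an HTTPS PUT of this chunk
                # Sleep so our average rate stays under MAX_RATE.
                floor = len(chunk) / MAX_RATE
                elapsed = time.monotonic() - start
                if elapsed < floor:
                    time.sleep(floor - elapsed)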

Sure, there's the one-time cost of getting that first backup done. But it's not that different from desktop search: the initial index takes a long time to build, and after that, keeping up with changes is barely noticeable.

What am I missing?

Update: Fred Wilson has put his thoughts on-line in Online Backup's Inflection Point, in which he says "I have been using online backup for over ten years in my home. I have even used it over dial-up. You just let the backup go over night."

Yup. :-)

Posted by jzawodn at December 05, 2005 01:38 PM

Reader Comments
# Gary Potter said:

You're missing nothing. Your readers need to take a step back and think about the 'average joe' who does not have a technical background. This type of solution is perfect for the technically challenged. They need something that's easy, non-threatening, and inexpensive. I just can't see Aunt Nellie installing a piece of hardware, but I can see her using something like LiveVault.

on December 5, 2005 02:04 PM
# Jeremy Wright said:

You aren't missing much. Even if you have 500GB of ACTUAL data, how much new data (not counting your internet cache, which wouldn't be backed up) is the average person, or even the techie person, creating?

100MB? 250MB?

That's where differential backups come in. And if it's a streamed, self-aware differential backup, it'll work on any broadband connection, without taking up more than 56K or so of upstream at a time.

I'd missed the original post. Not sure why your technically literate reader base didn't realize differential backups were the only way to go for something like this. Who does full backups anymore?
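
The scanning half of a differential backup is barely any code, for what it's worth. A rough sketch in Python (the last_backup_time bookkeeping is invented for illustration):

    import os

    def changed_since(root, last_backup_time):
        """Yield files under root modified after the last backup ran."""
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    if os.path.getmtime(path) > last_backup_time:
                        yield path
                except OSError:
                    pass  # file disappeared mid-scan; catch it next run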

on December 5, 2005 02:13 PM
# Michael Moncur said:

Yeah. Worst-case scenario, I come back from vacation with 500MB or so of photos. It might take a few hours to back up the new data, but with an intelligent backup system that ran in the background like Google Desktop, I wouldn't complain.

The only long-term issue I can think of is that broadband providers might have clogged upstream lines if this became widely deployed (example: new version of Windows enables it by default). Otherwise, no big deal.

on December 5, 2005 02:23 PM
# Damon said:

Iron Mountain/Connected also offers this service:
http://www.connected.com/

Like many others mentioned, there's the one-time full backup, and then it simply diffs after that. Not bad, and I know many corporations are implementing it for their users to prevent the executive who just dumped his laptop on the airport security check floor from losing everything.

on December 5, 2005 02:24 PM
# Venkatesh said:

I tend to agree with you regarding the lack of good online storage tools. But online storage with hundreds of gigs for a single user? I think not. The reason companies like this are lacking is that our average user community is not "Always On" right now, but that's definitely going to change with the better wifi access being promised by Google and others.

I would envision a good online storage service as one that puts together features like tagging, social networking/storage, and some wiki-style collaboration. Besides that, it should be able to associate file types (for example, DOC or Excel) with online applications like Writely or numsum. The user experience should mimic Windows Explorer, not prompt the user to download files each time they want to edit or view them.

We are not there yet, but with products like Office Live and others evolving, there's definitely an opportunity for the same.

on December 5, 2005 02:26 PM
# Joseph Scott said:

Part of what will make online storage work is that more and more of the content will be created online. It would make sense then that these online storage vendors provide an API so that online content creators can store or back up data. This could be done via REST. Something as simple as:

https://username:password@username.mystoragevendor.com/file-mode/path/to/file

So writing a new file might look like:

https://joseph:leetpasswd@joseph.mystoragevendor.com/w/backups/image/mypic.jpg

You could have a file mode called chmod to change permissions, and so on. The vendor should also provide for multiple user accounts for your one storage account. This would allow me to create a new username and password that I could use just for Flickr backups and another just for backing up my del.icio.us bookmarks. With that in place I wouldn't feel bad about providing this account info to Flickr or del.icio.us because those would be the only services using the info and even if the account info was compromised those accounts would only have access to specific directories, etc.

Of course, this would have to be exposed in other ways besides just REST, with some way to mount it as a drive letter in Windows. Perhaps WebDAV would be good enough for that part.

All of the components are there, but I think the key is an open API that other vendors can make use of.
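
Purely as illustration, a client for that scheme might look like this in Python. The host and the /w/ write mode come from Joseph's hypothetical examples, not a real API, and since most HTTP libraries no longer accept user:password@host URLs, this sends Basic auth as a header instead:

    import base64
    import urllib.request

    def put_file(local_path, remote_path, user, password, host):
        """PUT a local file to the hypothetical write ('w') file mode."""
        with open(local_path, "rb") as f:
            data = f.read()
        req = urllib.request.Request(
            "https://%s/w/%s" % (host, remote_path),
            data=data, method="PUT")
        token = base64.b64encode(
            ("%s:%s" % (user, password)).encode()).decode()
        req.add_header("Authorization", "Basic " + token)
        return urllib.request.urlopen(req)

    # put_file("mypic.jpg", "backups/image/mypic.jpg",
    #          "joseph", "leetpasswd", "joseph.mystoragevendor.com")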

on December 5, 2005 02:35 PM
# Wally Punsapy said:

The bottleneck is bandwidth. Building an infrastructure able to handle gobs and gobs of data transfer; that's the key. Then maybe we will see more video streaming, on-demand storage, Office Live, etc.

I believe this is what the big G is trying to do... it will open so many doors for companies (such as the big Y!) to deliver some heavy-duty advertising =)

on December 5, 2005 02:38 PM
# Justin Mason said:

Hi Jeremy --

Have you read Ben Laurie's take, at http://www.links.org/?p=27 ?

He points out the two major problems:

1. It'll take ages to upload the data (75 days for 100GB via a 128Kbps DSL uplink, for example).

Sure, your latest set of photos may be 500MB or less, which would transfer more quickly; but when someone *starts* using one of these services, they're not going to want to upload just a subset of their files; they'll want to back up everything they've saved so far, so that'll be the current 80GB of data on their hard disk or whatever.

2. the business model is unprofitable for the service provider: 'whoever produces this product should also give the use of 500 GB for $20 a year. I’m not sure where he buys disks, but where I buy them this means I might recover the original investment in, oh, 10 years or so. So long as the rest of the hardware is free and I buy in huge bulk. Great business model.'

Personally, I'd like to see enclosures that self-organise into networked RAID. That would be cool ;)

on December 5, 2005 02:43 PM
# greg said:

Instead of the business model being a warehouse full of disks, I'd rather see a P2P app developed that scatters my data across a RAID of P2P clients. All the clients involved would 'donate' a portion of their disk in exchange for having their own data backed up.

I want off-site storage - an external USB/FireWire disk doesn't help you in case of fire or theft.

I've started using FolderShare to distribute backup copies of our photos onto the computers of my parents and sister (one lives down south and the other in sunny California). It works for the time being, but I'd like something with some access controls. The Garden Hose / Swimming Pool analogy is right on - just set it and forget it. Who has 500GB of data and only a 128kbps connection?

PS - Jeremy - the option to confirm your name is missing in the preview form - meaning I can't preview my post and then post it.
(Your comment submission failed for the following reasons:

You forgot my name.)

on December 5, 2005 03:13 PM
# rick gregory said:

There's a simpler solution... something like the Mirra drive. I've not used it yet, but from comments I've seen you simply plug it into your home network, install a small software client and the client sits in the background and continually backs things up.

Now, the same approach could work for an online service, but with a disk attached to the local network the first full backup will be done in hours, not weeks. From then on the backing up simply takes care of itself (theoretically at least).

Oh, and usual disclaimers... no connection, blah, blah...

on December 5, 2005 03:17 PM
# prefetch said:


I think this is topical... you might want to check out mozy.com - it's basically a better version of connected.com / livevault, except it's free and easier to use.

2GB of space, which I guess is enough for docs, email, and a few pics. It's supported by a weekly newsletter email w/ ads in it.

It's in beta, but it works pretty well so far.

on December 5, 2005 03:48 PM
# Glen Campbell said:

As mentioned above, Connected has this service. A couple of years ago, I was working with a startup on online storage and backup. We cancelled after discovering that Connected held covering patents on most of the areas that can be used to optimize offline storage.

For example, one method was to take a sha1() of a disk block and transmit it instead of the entire block; if it already existed on the server, you didn't need to transfer the entire block, just make a record that the current user has a copy of it. This is great for files that are common (msword.exe, for example). But Connected holds the patent on that and other, similar technology. Few people will be able to go far with optimization techniques without licensing something from Connected.
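
The block-hashing trick Glen describes is easy to sketch. The server_has and server_store calls below are stand-ins for whatever protocol a real vendor would use:

    import hashlib

    BLOCK_SIZE = 4096

    def backup_blocks(path, server_has, server_store):
        """Ship only the blocks the server hasn't seen before."""
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha1(block).hexdigest()
                if not server_has(digest):   # send a 20-byte hash, not 4KB
                    server_store(digest, block)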

on December 5, 2005 04:01 PM
# Joe Hunkins said:

.... I think it's a good experimental idea, but does the value of backing up "compete" with storage stability and the value of data in general? Seems to me the former is increasing and the latter decreasing, so I wonder if storage as a mass market will ever exist.

Could you combine this with analysis/reprocessing of the data, without freaking people out, to provide an incredibly useful utility for disorganized peeps?

on December 5, 2005 04:10 PM
# Charles said:

No, the filling the pool analogy won't cut it, especially considering the issue of disk backups.

The problem is that water is water, and data is data. Every atom of water is like every other atom of water, but every bit of data is unique and must be stored in sequence. A backup is a snapshot of a unique state of your disk's dataset. If it takes days or WEEKS to do an initial backup, how many changes has your dataset been through between the time you started the backup and the time it finished?
There is an old story about "the Marching Chinese," excuse me if this is a little racist and stereotyped, but I was told this by my high school biology teacher who was a certifiable nutcase. But anyway, supposedly, if you tried to march EVERY Chinese person past a single point in Beijing in a column four-abreast, it would take 30 years to march them all past that point, by which time there would be a whole new generation of Chinese born who would have to take THEIR place at the end of the line. The Marching Chinese would never stop, they would continue marching infinitely.
But consider that maybe you march them past the point in a column one hundred wide, or a thousand wide. Maybe you could get them all past the point in a more reasonable time, before a new generation is born.
OK, back to data backups. There is a certain threshold of bandwidth that you need to pipe your data to the backup, or else you can't even get it backed up as fast as you're making changes or adding new data, sort of like the Marching Chinese. You can't even make incremental backups, because there is no point in time that represents a specific state of your filesystem; you backed up over days or weeks while you were making changes to your live filesystem.
But there is one strategy that would solve the problem: take a snapshot of your whole disk system to another hard drive, then stream THAT to an online data vault. But then you've already done a backup; you're just making an online backup of your backup.
Now there is one other problem you haven't considered: RESTORING the backup. I have long years of hard-won experience in backups, and let me tell you one thing: NEVER trust a backup system that cannot be tested to see if its backups can be restored. I know lots of people who used to blindly trust their backup systems, and then one day when they had a disaster and had to do a restore, they suddenly discovered that NONE of their backups were done right and they had nothing that could be used to restore successfully. I used to work at a place where we had a 2nd CPU identical to the one we backed up each weekend; we always restored the backup to the second CPU to test whether it would run and was complete before we considered the backup successful. So that basically doubled the time the backups took.
So excuse me if I am extremely skeptical of online backups. Sure, maybe they'll work for people's noncritical data stores like a few Flickr pics or some short videos. But until I can get Fibre Channel speeds between my CPU and the data vault, it's not going to be practical for backups. On the other hand, once I DO get Fibre Channel speeds, I won't even need a local disk; it can all be done directly on the remote data vault. THAT is the breakpoint.
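
Charles's snapshot-then-stream strategy is also easy to sketch. Here shutil.copytree stands in for a real snapshot facility (LVM, VSS, and the like), which would avoid the double copy he objects to:

    import shutil
    import time

    def snapshot_then_upload(src, staging_dir, upload):
        """Freeze a consistent point-in-time copy locally (fast,
        disk-to-disk), then trickle the frozen copy upstream."""
        stamp = time.strftime("%Y%m%d-%H%M%S")
        snap = "%s/snap-%s" % (staging_dir, stamp)
        shutil.copytree(src, snap)  # minutes, not days or weeks
        upload(snap)                # slow, but the data can't shift underneath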

on December 5, 2005 04:18 PM
# Greg said:

Woohoo, I've been quoted! Do I get a check now?

> What am I missing?

- I would never ever use any online backup unless my files were encrypted and I was sure that they could never be decrypted (and it's almost a certainty that they could eventually be decrypted). If not, there's not much at all I'd be backing up. I'm not doing anything illegal, but I don't want my taxes, passwords, personal emails, etc. somehow turning up online due to a hacker, disgruntled employee, etc.

- I would never solely rely on a company for my backups. I will always maintain my own set.

- Even with differential backups, people save lots of additional DVDs, software, mp3's, etc.

- Even if US broadband were symmetrical, it wouldn't make that much of a difference.

on December 5, 2005 04:28 PM
# dan isaacs said:

I don't really see much need, honestly. What matters to me? My pictures and my financial data. My pictures live on my HD, but they also live on Flickr's RAID-protected storage as a 1:1 copy. My financial data is all of 100MB. Now, I can encrypt it and put it in a number of different places. My FIL simply burns a CD and puts another copy on a ZIP disk. And of course there are hard copies of the source data from which everything could be recreated rather simply. Very few people have data sets that they cannot manage.

I live and breathe data protection. And I can tell you that most data does not need to be protected. Combine that with the security concerns, and I just don't see many people being willing to pay what this would cost. SMBs, yes, but SOHO, no.

on December 5, 2005 05:23 PM
# John Jonas said:

Echo on mozy.com

Good company with good management. It won't be long before they'll give more free storage than the 2GB they're currently offering.

My Windows computer is now automatically backed up about every 4 hours with incremental backups.

Now I just have to wait for them to release a Linux client.

on December 5, 2005 08:47 PM
# Hanan Cohen said:

Your data, soon backed up to a 40-foot shipping container near you.

http://www.pbs.org/cringely/pulpit/pulpit20051124.html

on December 5, 2005 11:49 PM
# Brandon said:

Charles is dead on here.

I see no privacy issues mentioned here either, and for me sending my data over the network is a showstopper anyway. If I specifically choose which data to back up, then either I have privacy issues or backup integrity issues if I forget to specify correctly.

I also think that for most hard-drive crash scenarios -- hard drive failure, NOT corruption -- RAID 1 would be sufficient. That's my home setup, supplemented with the occasional backup of critical data to DVD.

on December 6, 2005 05:19 AM
# Pete Cashmore said:

Jeremy,

There are lots of companies taking aim at the online backup space - I reviewed three of them recently:

http://mashable.com/2005/11/24/openomy-omnidrive-and-all-my-data-online-storage-just-got-interesting/

I think AllMyData is particularly interesting - they've created a P2P app that runs in the background and offers free backup, provided you supply some space on your disk.

on December 6, 2005 05:54 AM
# Mike said:

Backups should be simple and automated. At minimum, they should be to another disk. Considering the natural disasters that occur on a regular basis these days, they should be offsite.

I use Cobian backup ( http://www.educ.umu.se/~cobian/cobianbackup.htm ) and streamload. I have 50GB offsite. Cobian is free and I have the $50/yr plan at streamload.

Don’t forget to test your backups by doing a restore every so often.

on December 6, 2005 06:41 AM
# shawn said:

rsync+ssh+cron+remote linux box=secure automatic backups.

I do it all the time, and after the initial copy of all the data it works fine. And I don't have to do anything unless something crashes (which doesn't happen very often).
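
For anyone who wants to try shawn's recipe, here's a sketch of the nightly job, wrapped in Python so cron (or anything else) can run it. The host and paths are placeholders:

    import subprocess

    def nightly_backup():
        subprocess.run(
            ["rsync", "-az", "--delete",   # archive mode, compressed
             "-e", "ssh",                  # tunneled over ssh
             "/home/me/",                  # local data (placeholder)
             "backup@remote.example.com:backups/me/"],
            check=True)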

on December 6, 2005 06:51 AM
# Gudmundur Karlsson said:

I agree with Jeremy: it is becoming very practical to use the net as a backup. I don't agree with the detractors; security isn't an issue, and the Marching Chinese analogy doesn't apply, because for 99% of users, just leaving the backup running overnight once will do it.

But can't we take this further and imagine using PCs and Laptops as super smart frontends (thick clients) to the net?

Just download client software which provides you with a desktop complete with office applications, browser(s), email, etc. -- in fact, any software you want -- and all your files reside in a single virtual filespace. The software will take care of the caching; you don't really know where your data is, and in fact 99% of it is stored both locally and on the server side.

You'd be using exactly the same environment, complete with all your data, from any computer that you use. Start using a new laptop and it will be slow initially, but leave it on and connected to the net overnight and it will be fast because all your files are now local.
The user would never need to explicitly do any backups.

on December 6, 2005 08:19 AM
# chris said:

I think you're right on, Jeremy. This reminds me of my experience with my new video iPod and my first-generation 17" PowerBook. If you're up on Apple hardware, you know my pain: the PowerBook only has USB 1, and since Apple decided to drop support for FireWire in favor of USB 2, it took a painful 9 hours to update and load everything onto my shiny new iPod. But the pain is over, and since it was running in the background it really didn't bother me; by the time my work was done, so was my iPod. Now small little updates are no big deal. This model would work!

on December 6, 2005 08:58 AM
# anonymous said:

Hmm, I seriously doubt that Connected has a patent on sha-ing blocks to check for duplicates. Or if they do, that it would hold up -- rsync has been doing this for years.

on December 6, 2005 09:23 AM
# Derek Vadala said:

The upstream bandwidth issues are largely irrelevant. It's true, as many pointed out, that uploading hundreds of gigabytes to an Internet disk system would take weeks, but there are many workarounds, including eventual increases in bandwidth via initiatives like fiber to the home. A decent client fixes this by allowing you to create a synchronization backup plan in which you can prioritize important files by name, directory, file type, or other metadata, while excluding other files based on size, location, or media type. I might want to keep a backup of MP3s I rip from CD, but perhaps an online music service I use allows me to re-download files at no cost -- no need to back these up.
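
A sketch of the prioritization Derek describes. The scoring rules are invented, but the idea is to ship irreplaceable data first and re-downloadable media last:

    import os

    def priority(path):
        ext = os.path.splitext(path)[1].lower()
        if ext in {".doc", ".xls", ".jpg"}:    # user-created data: first
            return 0
        if ext in {".mp3", ".avi", ".iso"}:    # re-downloadable media: last
            return 2
        return 1

    def backup_order(paths):
        return sorted(paths, key=priority)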

Profitability is the real issue. You are talking about allocating multi-terabyte storage systems that can service only a handful of customers. Assume a 100GB account at $10/month with an average utilization of 50%. The annual revenue per 100TB of data is $120K/year, but I think the cost to maintain that storage is going to be a lot higher after you factor in:

1. Geographic acceleration.
2. 100% uptime
3. Redundancy
4. Technical support
5. Security/Legal issues

Now, there are other companies that make this work on smaller scales, so I think it's achievable, and I would certainly pay for the service. If you could implement decent caching on the server side, I think you could lower your costs significantly. For example, everyone may have a copy of the latest viral video at 20MB, but you don't want to keep a copy of that for each customer. Ditto for MP3s, videos, etc. This reduces the storage overhead enormously, but more problems arise:

1. Copyright - other companies have tried MP3 vaults and been sued out of existence.
2. Encryption could destroy your ability to cache
3. Security - presumably you use a hashing algorithm to handle the cached/shared content, but what happens when that algorithm is broken, as happened with MD5?


on December 6, 2005 09:24 AM
# Derek Vadala said:

There is an error in my arithmetic; it should be $240K/year per 100TB. Apologies.
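
For the record, the corrected arithmetic works out like this:

    stored_gb = 100 * 1000            # 100TB of actual customer data
    used_per_account = 100 * 0.5      # 100GB quota at 50% utilization
    accounts = stored_gb / used_per_account   # 2,000 accounts
    annual_revenue = accounts * 10 * 12       # $10/month each
    print(accounts, annual_revenue)           # 2000.0 240000.0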

on December 6, 2005 09:57 AM
# Greg Linden said:

I, Cringely had a column about a year ago describing a clever distributed online backup system:

http://www.pbs.org/cringely/pulpit/pulpit20040909.html

Well worth reading. But I do think that bandwidth is an issue. That and other thoughts are in my blog post here:

http://glinden.blogspot.com/2004/09/i-cringely-on-distributed-backups.html

on December 6, 2005 11:50 AM
# Jason Cartwright said:

Microsoft bought www.foldershare.com the other month. Part of its live.com goodies coming up, apparently. Its P2P method of doing things works a treat for backups.

on December 7, 2005 05:54 AM
# Gudmundur Karlsson said:

Security isn't an issue because it has easy solutions.
1. Your normal data such as emails and other documents are at least as safe in a gmail account as they are on the hard drive of any pc that is connected to the internet.
2. If you need extra security, there are lots of tools to password encrypt files etc.
3. You could designate certain documents to use secure protocols when transferred over the net to avoid net sniffing attacks, most data wouldn't need this though.
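
Point 2 is worth making concrete: if files are encrypted locally before upload, the vendor only ever holds ciphertext. A sketch using the third-party cryptography package (an assumption on my part; any symmetric cipher tool would do):

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # keep this local -- lose it, lose the backup
    cipher = Fernet(key)

    def encrypt_for_upload(path):
        with open(path, "rb") as f:
            return cipher.encrypt(f.read())   # ciphertext is all the vendor sees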

on December 7, 2005 07:17 AM
# John T. Pratt said:

I think a large percentage of people have too much data for this. I download ~20GB or more a month; some I keep, some I throw away. My 120GB drive is constantly turning over 10-20GB of content per month. For people who do multimedia, the amount of constant addition/deletion would be far too much to back up using a service like this - even daily.

on December 7, 2005 08:50 AM
# Derek Vadala said:

Transient data is also easily handled by a good client with flexible inclusion policies. For example, one of your rules may exclude files that are less than a day old, except for explicit file types, like those for financial programs or word processors.
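
A sketch of one such rule; the "always back up" extensions are made up for illustration:

    import os
    import time

    ALWAYS_BACKUP = {".doc", ".xls", ".qif"}   # hypothetical must-keep types
    ONE_DAY = 24 * 3600

    def should_backup(path):
        """Skip files under a day old unless their type marks them important."""
        age = time.time() - os.path.getmtime(path)
        ext = os.path.splitext(path)[1].lower()
        return age >= ONE_DAY or ext in ALWAYS_BACKUP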

Don't think of a service like this one as you would a traditional scheduled backup. Imagine a client setting where you control the exported data stream based on current system and bandwidth utilization, so your system is constantly synchronizing -- except when you are sitting in front of it and above a certain resource threshold.

As for those saying security isn't an issue: that's simply laughable.

"1. Your normal data such as emails and other documents are at least as safe in a gmail account as they are on the hard drive of any pc that is connected to the internet."

No, they are not. Have you missed the cross-site scripting attacks against nearly all major web-based email services? If you ignore local security, then perhaps your own PC is less secure, but your information is far more at risk when it's located someplace else -- you're now adding the risk that an employee of the provider can access your data.

"2. If you need extra security, there are lots of tools to password encrypt files etc."

Right, but as I said earlier, the more people use encryption, the harder it gets to leverage server side caching, and the higher your overhead.

"3. You could designate certain documents to use secure protocols when transferred over the net to avoid net sniffing attacks, most data wouldn't need this though."

All the connections should be secure.

on December 7, 2005 10:14 AM
# Gudmundur Karlsson said:

Security is very important. I said it's not an issue because I think there are already existing solutions for it in this case.

There is no absolute security, only relative security. I'm aware that web mail services have been attacked. But I'm much more aware of the total lack of security on almost every user PC that I look at; they almost always have data collection programs installed without the knowledge of their owners. I'm not talking about a user who knows how to secure their PC; I'm talking about the average PC user.
Compare the two - I'd go with Gmail, Ymail, whatever.

Also, you enter into an agreement with the provider, in which they promise to provide a given level of security, and you trust them to deliver it. So you have legal recourse if security is violated.

All connections should be secure, except that if secure transfer is less efficient (which it is), you should have the option to gain performance by using insecure connections. Let's face it: no one is interested in 99% of the user data in such a service.

on December 7, 2005 01:07 PM
# Thomas said:

A good compression algorithm and a decent uplink speed serve to minimize the time required to send data using remote backup software. You also are not required (these days) to contract with a commercial service provider; you can actually set yourself up with client/server software to do your own backups (see http://remote-backup.com), containing the cost and ensuring absolute security of the data.

on December 8, 2005 07:07 AM
# Bubba said:

Cringely's 40-foot shipping container is an unrealistic pipe dream. Please think about what he's saying for a second before assuming he's correct. Read blog.dkpdev.com for a quick write-up on how asinine his idea is.

on December 8, 2005 09:19 AM
# Joe said:

Just no. Not going to happen anytime soon. It's a good idea, but the tech is against it.

You can call me wrong if it happens within 3 years, but it won't. Just watch.

on December 8, 2005 10:12 AM
# JH said:

Hola.

This is off-topic, so feel free to delete it once it's been read:

Any chance that *someone* from Yahoo can explain why blo.gs has been offline for nine days with no explanation of when they're likely to unbreak it? Am a little surprised you've not posted on the subject.

on December 9, 2005 09:09 AM
# NL said:

We've done this for years for small businesses. We use a magic tool... it's called "rsync." Not only does it compress the data stream to save bandwidth, it's also able to transmit only the differences within files through the use of rolling checksums. Hence you can back up even multi-GB files. Usually we just prime the first backup with a USB drive.

After that, well, even a basic 128kbps-upstream DSL connection is enough to back up hundreds of gigabytes of data nightly, as long as only a few dozen megabytes of them change every day.
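
The arithmetic backs that up:

    uplink_kbps = 128
    hours_per_night = 8
    mb_per_night = uplink_kbps / 8 * 3600 * hours_per_night / 1024
    print(round(mb_per_night))   # ~450MB/night: plenty for a few dozen MB of changes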

on December 9, 2005 10:00 AM
# ade said:

Maybe I'm missing something but this seems like a trivially easy problem to solve.

Scenario: A professional photographer has 500GB of photos they want to back up, and they expect to keep adding several GB a month to that.

Step 1: Ship a set of DVDs/hard drives/tapes to the backup vendor. Yes, physically send a large amount of data to someone who will then copy that data onto the media they use for backups. This sidesteps the limited bandwidth of modern networks using the old "never underestimate the bandwidth of a station wagon full of tapes" trick.

Step 2: Use your backup vendor's differential backup tools (probably some variant of rsync) to send them the changes as they happen. Whenever the difference between what you have locally and what you have remotely gets too big, you take another snapshot and physically send it to them.

Step 3: When you need to verify your data, or you want to get all of it back for some reason, contact the vendor, who will then send you a set of DVDs/hard drives/tapes containing your data.

This is not cheap, but it avoids the 'Marching Chinese' problem referred to above, and the various snapshots physically sent back and forth between you and the backup vendor mean data integrity can always be verified.
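
A sketch of the bookkeeping behind Step 2: hash everything that shipped on the physical media, then send only files whose hashes have since changed. (A real tool would hash in chunks rather than reading whole files into memory.)

    import hashlib
    import os

    def manifest(root):
        """Map each relative path to a SHA-1 of its contents."""
        m = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    digest = hashlib.sha1(f.read()).hexdigest()
                m[os.path.relpath(path, root)] = digest
        return m

    def changed_files(root, shipped):
        """shipped = the manifest saved when the media went out the door."""
        current = manifest(root)
        return [p for p, h in current.items() if shipped.get(p) != h]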

on December 10, 2005 03:01 AM
# dan isaacs said:

As has been said already, it's an easy problem to solve, but it's just not a profitable one.

If we accept the above calculation that 100TB would provide $240K/yr, we need to have 110TB of disk to support it (filesystem overhead, and keeping a disk more than 90% full is a bad idea). Well, what are the capital expenditures? Even with a cheap solution, they outpace that figure. Add to that power and cooling, bandwidth, replacement (any idea how many drives fail out of 100TB worth in a given month, let alone a year?), the cost of admins, server hardware and maintenance, and the other administrative overhead of a corporation. It would take you three years to become profitable, at which time you need to replace your hardware. You may as well sell lemonade in front of your house.

on December 11, 2005 02:48 PM
# Salman said:

Interesting thoughts. I was personally thinking of using Gmail as my backup solution (for personal-type backups, obviously -- not my entire drive!).

What would prevent me from using any sort of complete backup solution is security. Just imagine the amount of data someone could get a hold of with a single hack, hehe.

on December 13, 2005 07:15 AM
# Konrad said:

I strongly believe that this would actually be quite a value-added service for the access providers -- get an application onto the client PC that uses network idle time to transfer out parts of the backup. Sure, there are power users out there who generate a lot of data, but most users don't run into the gigabytes per day. The only change for the customer would be to keep the machine on for a while. The storage would be close, network-wise, and would not even tax the upstream, whilst providing a really good additional service to the customer.

on December 15, 2005 03:15 PM
# Swami said:

Full disclosure: I work for Carbonite -- a new online backup service that was mentioned in Fred Wilson's blog.

Jeremy, you're right, as are many of those who've posted comments. For mass-market adoption, backup has to be ultra-simple, inexpensive, and "always on." Nobody wants to worry about selecting files and folders, scheduling backups, etc.

$35/month for one of the services mentioned above is not reasonable pricing for most folks. $5/month for 1GB is OK for some, but many don't really know how much 1GB is and those who do probably need more space.

At Carbonite, we think we've got what the mainstream user needs (we like to think of ourselves as "Backup for Everyone" (TM)). We currently have one product on the market (launched nationwide through Staples): PhotoBackup which backs up all your digital photos (whether you have 10 or 10,000) for $30/YEAR. This is just our "training wheels" and we plan to launch comprehensive PC and Mac backup services in 2006. Pricing will be similar -- a very affordable monthly rate for *unlimited* backup. Most importantly, our product requires little or no ongoing user involvement.

Free trials should be available this week at http://www.carbonite.com

I'm happy to give you and readers of your blog 6 months of free service if you email me at swamik@gmail.com.

on December 29, 2005 07:14 AM
# Dubai Web Design said:

My new MacBook Pro is acting more and more strangely. First it was random lockups, maybe once every few days, where the whole UI would freeze even though the mouse pointer would still move. Now it's frequent appearances of the spinning beach ball, locking up a single application and going away in a couple of seconds.

What's more worrisome, I started noticing these kinds of messages in the system.log:

May 3 15:56:19 MaBi kernel[0]: disk0s2: 0xe0030005 (UNDEFINED).
May 3 15:57:07 MaBi kernel[0]: disk0s2: 0xe0030005 (UNDEFINED).
May 3 15:57:54 MaBi kernel[0]: disk0s2: 0xe0030005 (UNDEFINED).
May 3 15:58:42 MaBi kernel[0]: disk0s2: 0xe0030005 (UNDEFINED).

And finally, if I try to verify the disk using Disk Utility, the system will lock up in the manner described above. S.M.A.R.T. status is "Verified", though.

I couldn’t find anything on Google regarding that error code, but I’m scared that my disk might be on the verge of abandoning me. I need to do a backup as soon as I get back home.

on October 29, 2008 07:07 AM