One of my friends over on my crappy little forum recently received the following support ticket (and, quite understandably, facepalmed):
Can you please download internet on my system?
Rather than partake in some sympathetic facepalming of my own, I thought I’d come up with a quite literal answer, in xkcd’s “What If?” style.*
The first question we have to answer is: what is the size of the Internet? Any answer will be an estimate at best (and wild speculation at worst), because the cold, hard truth is that no-one knows. That's down to the distributed nature of the Internet (and of the underlying TCP/IP protocol suite it's built on): with quite possibly millions of servers connected around the world, it's hard to measure anything for sure. The other problem: what would count towards the total? Content served over HTTP/HTTPS certainly would, but what about FTP? SMTP? NNTP? Peer-to-peer filesharing? And would content that's only accessible indirectly (such as data stored in a backend database) count?
The only thing we have to go on is an estimate that Eric Schmidt, Google's executive chairman, made back in 2005; at the time, he put the figure at around five million terabytes (which works out to 5 exabytes). Google then indexed only 200 terabytes of data, so Schmidt's estimate probably took e-mail, newsgroups, etc. into consideration. With our world becoming ever more connected in the intervening eight years, that figure has likely shot up, particularly with sites such as YouTube, Facebook, Netflix, The Pirate Bay et al coming into the equation. I'm going to throw a rough guesstimate together and put the figure at 15 EB today, based on my gut feeling alone. (Yes, I know it's not terribly scientific, and I've probably shot way too low here, but let's face it: what else gives?)
Currently, down here on the southern tip of Africa, our fastest broadband connection is 10 Mbps ADSL. In reality, our ISPs would throttle the connection into oblivion if one were to continually hammer their networks trying to download the Internet like that (contention ratios causing quality of service for everyone else to be affected and all of that), but let's assume that, for the purposes of this exercise, we can sweet-talk them into giving us guaranteed 10 Mbps throughput. 15 EB of data (using binary prefixes, i.e. 15 × 2^60 bytes) works out to a staggering 138,350,580,552,821,637,120 bits, and given that we can download 10,000,000 of those bits every second (in reality it would be lower due to network overhead, but let's leave that out of the equation), it would take almost 440,000 years to download the Internet over that connection.
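For the sceptics, here's a quick back-of-envelope sketch in Python. One assumption worth flagging: I'm treating exabytes with binary prefixes (1 EB = 2^60 bytes), which is how the 138-quintillion-bit figure above falls out, and I'm ignoring protocol overhead entirely.

```python
# Back-of-envelope check: downloading 15 EB over a 10 Mbps line.
# Assumes binary prefixes (1 EB = 2**60 bytes) and zero protocol overhead.
size_bits = 15 * 2**60 * 8           # 15 EB expressed in bits
adsl_rate = 10_000_000               # 10 Mbps, in bits per second

seconds = size_bits / adsl_rate
years = seconds / (365.25 * 24 * 3600)
print(f"{size_bits:,} bits")         # 138,350,580,552,821,637,120 bits
print(f"{years:,.0f} years")         # ~438,407 years, i.e. almost 440,000
```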
But with a timeframe like that, you'd never actually finish downloading the Internet. Considering that it only went into widespread public use in the early 1990s (never mind the decades before, when it was pretty much a research plaything), the Internet is growing at a faster rate than one can download it over a 10 Mbps connection. Plus, given the timeframe involved, the constant status update requests on the support ticket would drive all involved to suicide, even if (actually, particularly if) we discover a way of making human immortality a possibility in the interim. Clearly, we need something a lot faster.
Enter the WACS cable system. It’s a submarine cable that links us up to Europe via the west coast of Africa, cost US$650 million to construct, and has a design capacity of 5.12 Tbps. If we could secure the entire bandwidth of this cable to download the Internet, we could do it in a little over 10 months. While we may still have the aforementioned suicide problem, this is far more like it.
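Under the same assumptions as before, the WACS figure checks out too:

```python
# The same 15 EB payload over WACS's full 5.12 Tbps design capacity.
size_bits = 15 * 2**60 * 8
wacs_rate = 5.12e12                  # bits per second

days = size_bits / wacs_rate / 86400
print(f"{days:.0f} days")            # ~313 days, a little over 10 months
```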
But of course, what would be the point of downloading the Internet if we couldn't store the data we'd just downloaded?
Currently, the highest-capacity hard drives on the market hold 4 TB (here's an enterprise-level example from Western Digital). We'd need a minimum of 3,932,160 such drives to store the Internet (in the real world we'd need more for redundancy, but once again, let's not worry about that here). Our enterprise-level drives draw 11.5 watts each, so we'd need ~45 MW just to keep the hard drives spinning; we'd need plenty more (I'm thinking around 10 to 15 times more!) to power the hardware connecting all of this up, the building housing this giant supercomputer, and the cooling equipment keeping everything at an acceptable temperature. We'd need to build a small power plant to keep everything running.
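The arithmetic is easy enough to check (still with binary prefixes, so a "4 TB" drive is taken as 4 × 2^40 bytes, which is why the drive count divides out so cleanly; the 11.5 W figure is from the spec sheet):

```python
# Drive count and power draw for storing 15 EB on 4 TB drives.
size_bytes = 15 * 2**60
drive_bytes = 4 * 2**40              # one "4 TB" drive, binary prefixes
drive_watts = 11.5                   # per-drive power draw, per the spec sheet

drives = size_bytes // drive_bytes
print(f"{drives:,} drives")                    # 3,932,160
print(f"{drives * drive_watts / 1e6:.1f} MW")  # ~45.2 MW for the drives alone
```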
So yes, you can download the Internet. You just need a major submarine communications cable, tens of millions of hard drives, and a small power plant to provide enough electricity to run it all. If you get started now, you can give someone the present of One Internet** when next Christmas rolls around. The question of dealing with bandwidth and electricity bills is one that I will leave to the reader.
Now get going, dammit!
* Randall, if you’ve somehow stumbled upon this and you think you could do a better job than myself, go for it!
** Though, depending on who the recipient is, you may or may not want to include 4chan’s /b/ board.
UPDATE #1: I was asked to up it to 50 EB, which, in retrospect, may be a more realistic size for the Internet than the 15 EB I put forward earlier. That would take almost 3 years to download over WACS and would require 13,107,200 hard drives, with a significantly increased power requirement (around 150 MW for the drives alone). The Koeberg Nuclear Power Station (not too far from the WACS landing site at Yzerfontein) has two reactors, each capable of producing 900 MW, so if we take Koeberg off the national grid (which will cause the rest of the country to experience rolling blackouts, but hey, it's in the name of progress!) and use the entire nuke plant's capacity to power our supercomputer and related infrastructure, that should just about do it.
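And the updated numbers, run through the same sketch as before (same binary-prefix assumptions throughout):

```python
# The 50 EB version: same assumptions, bigger numbers.
size_bits = 50 * 2**60 * 8
years = size_bits / 5.12e12 / (365.25 * 86400)
drives = (50 * 2**60) // (4 * 2**40)
print(f"{years:.2f} years")             # ~2.85 years over WACS
print(f"{drives:,} drives")             # 13,107,200
print(f"{drives * 11.5 / 1e6:.0f} MW")  # ~151 MW for the drives alone
```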