Do you ever think about the enormous amounts of data we consume?
In a response to last week’s column, Charles Manning noted that RAM on his current board transfers data at 3 GB/sec, which is about the same as 25 tons of punched cards/second.
It got me thinking. A 50 Mbps cable modem link, not all that fast by today’s standards, is equivalent to about a ton of cards every six seconds or so (assuming all 80 columns are punched). That stream would fill a typical office in under a minute.
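For the curious, here’s a quick back-of-envelope script. The assumptions are mine, not gospel: a fully punched card holds 80 columns by 12 rows of bits, and a card weighs roughly 2.5 grams.

```python
# Back-of-envelope check of the punched-card arithmetic.
# Assumptions (mine): a fully punched card holds 80 columns x 12 rows
# = 960 bits, a card weighs about 2.5 g, and a ton is a US short ton.

LINK_BPS = 50e6          # 50 Mbps cable modem
BITS_PER_CARD = 80 * 12  # 960 bits, all columns punched
GRAMS_PER_CARD = 2.5     # typical card stock
GRAMS_PER_TON = 907_185  # one US short ton

cards_per_sec = LINK_BPS / BITS_PER_CARD
seconds_per_ton = (GRAMS_PER_TON / GRAMS_PER_CARD) / cards_per_sec

print(f"{cards_per_sec:,.0f} cards/s; a ton of cards every {seconds_per_ton:.1f} s")
```

That works out to about 52,000 cards and a ton of them every seven seconds at these assumptions; the exact figure swings with the card weight you pick.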
Of course, most of that is junk, like ads for ED, or clickbait (“See what NASA has been covering up for 60 years!”). Thankfully most of it goes to the bit bucket.
A selfie from my digital camera would weigh twice as much as I do if stored on cards.
In the old days, embedded developers used punched paper tape. If that cable modem were punching tape, it would generate about 10 miles of tape per second. That is faster than the Earth’s escape velocity.
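The tape figure checks out, assuming (my numbers) standard one-inch tape at 10 characters per inch, 8 bits per character:

```python
# Sanity check on the paper-tape speed. Assumptions (mine): standard
# 1-inch tape punched at 10 characters per inch, 8 bits per character.

LINK_BPS = 50e6
CHARS_PER_INCH = 10
INCHES_PER_MILE = 63_360
ESCAPE_VELOCITY_MPS = 11_186 / 1609.34  # ~6.95 miles/s

tape_miles_per_sec = LINK_BPS / 8 / CHARS_PER_INCH / INCHES_PER_MILE
print(f"tape speed: {tape_miles_per_sec:.1f} miles/s")
```

About 9.9 miles per second, comfortably above the roughly 7 miles per second needed to escape Earth.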
A gigabit Ethernet link would create enough punched cards to fill one of Facebook’s new data centers in about 2.5 hours. It would overload a mile-long train in a couple of hours. The cards to store one day’s worth of Tweets would fill three data centers.
Store gigabit data on the first disk drive, IBM’s 350, introduced in 1956, and you’d fill 20 drives per second. IBM rented them; in today’s dollars those 20 units would cost a half-million dollars a month, yet they’d hold only one second’s worth of data. And with a data transfer rate of 9 KB/sec they couldn’t begin to keep up with the stream. Ironically, development of that drive was suspended for a while because it was considered a threat to the company’s punched card business. Amazingly, the 350 occupied more volume than the punched cards needed to hold the same data.
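Here’s one way the 20-drives-per-second figure falls out. The assumptions are mine: treat the 350’s capacity as 5 million characters (call it 5 MB), and assume a gigabit link delivers roughly 100 MB/s of payload once protocol overhead is subtracted.

```python
# How "20 drives per second" can fall out. Assumptions (mine): the
# IBM 350 holds 5 million characters (~5 MB), and a 1 Gb/s link
# delivers roughly 100 MB/s of payload after overhead.

DRIVE_BYTES = 5e6     # IBM 350: 5 million characters
PAYLOAD_BPS = 100e6   # ~100 MB/s effective from a gigabit link

drives_per_sec = PAYLOAD_BPS / DRIVE_BYTES
print(f"{drives_per_sec:.0f} drives filled per second")
```

Use the raw 125 MB/s line rate instead and you’d fill 25 drives a second; either way the rental bill is staggering.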
Nine-track tape was once the cheapest way to store data. A 1600 BPI reel would fill in a half-second if gigabit data were streaming into it. A tape library would fill in a few minutes.
The tape would have to move at a mile/second.
To load a tape, one would send a message to the operator to find and mount a particular reel, which took anywhere from ten minutes to half an hour. So to keep up with the stream you’d have to hire at least 1200 operators.
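The reel arithmetic, with my assumptions spelled out: a 2,400-foot reel recorded at 1600 bytes per inch, a raw gigabit stream of 125 MB/s, and a ten-minute round trip per mount:

```python
# Nine-track reel arithmetic. Assumptions (mine): a 2,400-foot reel
# at 1600 bytes per inch, a raw gigabit stream (125 MB/s), and a
# ten-minute round trip for an operator to mount the next reel.

REEL_INCHES = 2400 * 12   # 2,400-foot reel
BYTES_PER_INCH = 1600     # 1600 BPI
STREAM_BPS = 125e6        # 1 Gb/s = 125 MB/s
MOUNT_SECONDS = 600       # ten minutes per mount

reel_bytes = REEL_INCHES * BYTES_PER_INCH           # ~46 MB per reel
fill_seconds = reel_bytes / STREAM_BPS              # ~0.37 s to fill
tape_miles_per_sec = (REEL_INCHES / fill_seconds) / 63_360
operators = MOUNT_SECONDS / fill_seconds            # mounts in flight

print(f"reel fills in {fill_seconds:.2f} s at {tape_miles_per_sec:.1f} mi/s; "
      f"~{operators:.0f} operators needed")
```

With a ten-minute mount you’d actually need over 1,600 operators running reels in parallel, so 1200 is a conservative floor.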
Feed that gigabit link into 1702 EPROMs (256 byte capacity) and you’d fill half a million a second. Print the stream and every 3 minutes a mile-high stack will pile up.
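The EPROM figure checks out directly (assuming, as I do here, the raw gigabit line rate of 125 MB/s):

```python
# The 1702 EPROM figure. Assumption (mine): the raw gigabit line
# rate of 125 MB/s, with no protocol overhead removed.

STREAM_BPS = 125e6   # 1 Gb/s in bytes per second
EPROM_BYTES = 256    # Intel 1702 capacity

eproms_per_sec = STREAM_BPS / EPROM_BYTES
print(f"{eproms_per_sec:,.0f} EPROMs per second")
```

About 488,000 a second, close enough to half a million.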
The Cray 2 was the fastest computer in the world in 1985. If one bought a fully outfitted four-CPU variant with the maximum number of hard drives attached, that gigabit link would fill all of its storage in four minutes. We’re talking about a $35 million machine (twice that in today’s dollars).
NSA’s Utah Data Center can, by some estimates, store on the order of 10 exabytes of data. That’s the equivalent of 10 million 1 TB drives, worth over half a billion dollars. Perhaps they get a quantity discount.
A gigabit feed would fill their servers (assuming they deleted all of our emails, web surfing habits, porn, and other nuisance data already vacuumed up) in roughly 2,500 years.
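This is one place where bits versus bytes matters enormously: read the feed as a gigaBYTE per second and the fill time shrinks by nearly an order of magnitude. A quick check, assuming decimal units throughout (my assumption):

```python
# Time to fill ~10 EB from a fast feed. Decimal units assumed (mine).
# The answer swings almost 8x depending on gigaBITS vs. gigaBYTES.

CENTER_BYTES = 10e18        # ~10 EB, by some estimates
SECONDS_PER_YEAR = 3.156e7

years_at_gigabit = CENTER_BYTES / (1e9 / 8) / SECONDS_PER_YEAR
years_at_gigabyte = CENTER_BYTES / 1e9 / SECONDS_PER_YEAR

print(f"1 Gb/s: {years_at_gigabit:,.0f} years; "
      f"1 GB/s: {years_at_gigabyte:,.0f} years")
```

A gigabit gets you there in about 2,500 years; a gigabyte per second, in a little over 300.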
But no doubt storage technology will improve over a span like that.
Or, if those servers were completely filled with our emails, web surfing habits, porn, and other nuisance data, and the contents were punched onto cards, the stack of standard card boxes would reach from here to Saturn. Laid out flat, one deep, the cards would carpet roughly the entire surface of planet Earth. An IBM 1402 card punch would need 400 million years to punch them all out.
The Large Hadron Collider produces around 27 TB of data per day. Stored on the 4 TB drives available at Best Buy, that’s a stack about a foot high, costing a bit over a thousand dollars. Punched cards holding the same data would weigh almost 200,000 tons. Or you could use 3.5” floppy disks: 20 million of them, in a stack 40 miles high. You can still buy floppies, but that heap will run 10 megabucks.
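The floppy figures hold up nicely. My assumptions: 1.44 MB per disk, 3.3 mm of stack height per disk, and 50 cents apiece:

```python
# A day of LHC data on 3.5" floppies. Assumptions (mine): 1.44 MB
# per disk, 3.3 mm of stack per disk, 50 cents each.

LHC_BYTES_PER_DAY = 27e12
DISK_BYTES = 1.44e6
DISK_MM = 3.3
PRICE_EACH = 0.50

disks = LHC_BYTES_PER_DAY / DISK_BYTES
stack_miles = disks * DISK_MM / 1000 / 1609.34
cost_musd = disks * PRICE_EACH / 1e6

print(f"{disks/1e6:.1f} million disks, {stack_miles:.0f} miles high, "
      f"${cost_musd:.0f}M")
```

Just under 20 million disks, a stack pushing 40 miles, and a bill of roughly ten million dollars.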
Some early computers used Williams tubes as storage. Bits painted on a CRT were read off electrostatically. The Williams tubes required to store one day’s worth of LHC data would cover an area the size of Washington, DC to a depth of 200 miles.
That’s so high it could be a hazard to the International Space Station.
Some might argue that would be a better use of the space.
Thanks to Scott Rosenthal and Charles Manning for thoughts for this article.