In the 1981 edition of his textbook Computer Networks, Andrew Tanenbaum asks the student to calculate the bandwidth of a St Bernard dog carrying floppy disks.
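For the curious, the arithmetic is easy to sketch. The figures below are my own illustrative assumptions (a well-laden dog with later-era disks), not Tanenbaum’s actual numbers:

```python
# Back-of-envelope bandwidth of a St Bernard carrying floppy disks.
# Every figure here is an assumption for illustration, not Tanenbaum's.
disks = 1000                      # floppies the dog can carry, say
disk_capacity_bits = 1.44e6 * 8   # a 1.44 MB floppy, in bits
distance_km = 10                  # length of the journey
speed_kmh = 15                    # trotting speed of a keen St Bernard

payload_bits = disks * disk_capacity_bits
transit_seconds = distance_km / speed_kmh * 3600
print(f"{payload_bits / transit_seconds / 1e6:.1f} Mbit/s")  # ≈ 4.8 Mbit/s
```

Respectable, though the latency (40 minutes each way on those assumptions) leaves something to be desired.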
Nearly two decades later, when I worked at the Olivetti Research Lab, we had extremely high-bandwidth connections (for the time) to the University Computer Lab about a quarter of a mile down the road. But we used to point out that we could achieve far higher bandwidth by giving some tapes to Prof. Sir Maurice Wilkes, by then in his eighties, and asking him to put them in his bicycle basket. The bandwidth was excellent, though the round-trip latency wasn’t quite so impressive, since he would usually have a cup of tea, and often a snooze, before coming back.
Well, plus ça change, plus c’est la même chose (the more things change, the more they stay the same), as they say. Recently I had to shift 4TB of data from the Computer Lab here in Cambridge to our corporate sponsors based at the University of Warwick. It’s too hard to connect directly through the various institutional firewalls, so I suggested buying and mailing a hard disk. But, since we already had an S3 account set up for the project, they preferred the idea of me uploading the data to a place where they could download it at their leisure.
Now, S3, for those not familiar with it, is Amazon’s ‘Simple Storage Service’: a splendid offering where you can upload and store as much data as you like without ever having to worry about running out of disk space. The prices are generally pretty reasonable: you pay nothing to put data in, and a low cost per gigabyte per month for the storage. Transferring out is free in modest amounts, but the price starts to ramp up when you download a lot, as we were to find out.
Uploading 4TB is also not something you should try at home unless you’re going away for a week or two, but in the University we have a pretty good connection, so the upload took me, I think, about 7 or 8 hours. Interestingly, this is comparable to the time needed to cycle from here to Warwick, though perhaps not for an octogenarian. The data then sat on S3 for about a month, because distractions and technical issues with permissions, proxies and DNS servers meant it took a while for my colleague at the other end to download everything. They don’t have nearly as good a connection as we do, so I imagine the download took rather longer than my upload. So even if we ignore the intervening month, this was a process of at least 20 hours, which, I suppose, is roughly how long an energetic St Bernard might take to do the journey.
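For those who like to check the arithmetic, here is the line rate implied by my (admittedly vague) figures, taking the longer end of the estimate:

```python
# Effective throughput of the 4TB upload, using the rough figures above.
terabytes = 4
hours = 8                     # "about 7 or 8 hours"; take the longer end

bits = terabytes * 1e12 * 8   # treating 1 TB as 10^12 bytes
seconds = hours * 3600
print(f"{bits / seconds / 1e9:.2f} Gbit/s")  # ≈ 1.11 Gbit/s
```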
And we got the bill, which came to nearly £400.
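Out of curiosity, you can get into the right ballpark with typical published S3 rates. The numbers below are my assumptions, not the actual line items on our bill:

```python
# Rough reconstruction of the bill; the rates and the exchange rate
# are assumptions based on typical published S3 pricing, not our invoice.
gb = 4000
egress_usd_per_gb = 0.09          # assumed data-transfer-out rate
storage_usd_per_gb_month = 0.023  # assumed standard storage rate
usd_to_gbp = 0.8                  # assumed exchange rate

egress = gb * egress_usd_per_gb            # $360
storage = gb * storage_usd_per_gb_month    # ~$92 for the month it sat there
print(f"£{(egress + storage) * usd_to_gbp:.0f}")  # ≈ £362
```

Close enough; tiered rates, request charges and VAT would account for the rest. The striking thing is that almost all of it is the cost of downloading.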
So, just by way of comparison: an Uber from Cambridge to Warwick would charge about £120, and a 4TB drive can be found on Amazon for about £80, giving a total of about half the price of our S3 transfer; and the car would take about 2 hours to get there. If I’d been willing for the transfer to take as long as ours actually did, I could have used a next-day courier and halved the price again.
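Putting the options side by side with those rough figures (the £20 courier fee is an assumption; I never actually priced one):

```python
# Cost and effective bandwidth of each option, from the figures above.
# The S3 time ignores the intervening month; the courier fee is a guess.
tb = 4
options = {                        # name: (price in £, hours end to end)
    "S3 (upload + download)": (400, 20),
    "Uber + 4TB drive":       (120 + 80, 2),
    "Next-day courier":       (80 + 20, 24),
}
for name, (pounds, hours) in options.items():
    gbit_per_s = tb * 1e12 * 8 / (hours * 3600) / 1e9
    print(f"{name:24s} £{pounds:>3}   {gbit_per_s:5.2f} Gbit/s")
```

The car wins on both cost and bandwidth; the courier is the cheapest but the slowest, which is often a perfectly acceptable trade-off.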
Now, I realise there are lots of simplifications here, and ways I could have transferred the data at lower cost. But it’s worth bearing in mind some of the challenges of network and cloud services, and I suspect it’ll be some time before the process of shifting physical media around is really behind us!