Saturday, September 8, 2012

A post full of follow-ups

You know how I dig follow-ups, so here's a grab bag of them...

Goetz Graefe follows up on one of the most famous system design principles, Gray and Putzolu's rule for trading off memory and I/O, in The five-minute rule twenty years later, and how flash memory changes the rules. The question Graefe considers: does flash memory change the equation?

Flash memory fills the gap between RAM and disks in terms of many metrics: acquisition cost, access latency, transfer bandwidth, spatial density, and power consumption. Thus, within a few years, flash memory will likely be used heavily in operating systems, file systems, and database systems. Research into appropriate system architectures is urgently needed.

Graefe's conclusion is that the equations need minor adjustment:

The 20-year-old five-minute rule for RAM and disk still holds, but for ever-larger disk pages. Moreover, it should be augmented by two new five-minute rules: one for small pages moving between RAM and flash memory and one for large pages moving between flash memory and traditional disks.
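The rule itself comes down to a simple break-even calculation: a page is worth keeping in RAM if it will be re-accessed within the break-even interval, which balances the cost of the RAM holding it against the cost of the disk accesses that re-fetch it. A rough sketch of that arithmetic, using approximate 1987-era figures for illustration (the exact numbers in the papers differ slightly):

```python
# Sketch of the Gray/Putzolu break-even calculation. A page that is
# re-accessed more often than this interval is cheaper to cache in RAM
# than to re-read from disk. All figures below are illustrative.

def break_even_seconds(pages_per_mb, price_per_drive,
                       accesses_per_sec, price_per_mb_ram):
    """Break-even interval in seconds:
    (pages per MB of RAM * price per drive) /
    (accesses per second per drive * price per MB of RAM)."""
    return (pages_per_mb * price_per_drive) / (accesses_per_sec * price_per_mb_ram)

# Roughly 1987-era figures: 1 KB pages (1024 per MB), a $15,000 drive
# sustaining 15 random accesses/sec, RAM at about $5,000 per MB.
interval = break_even_seconds(1024, 15000, 15, 5000)
print(interval)  # 204.8 seconds with these numbers -- minutes, not hours
```

Re-running the same arithmetic with flash memory's price and access rate in place of the disk's is exactly how Graefe derives his two new five-minute rules.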

Moving on, the MIT Internet Traffic Analysis Study (MITAS) has been studying how the Internet protocols continue to evolve. One of their papers, by Bauer, Clark, and Lehr, is a follow-up to Van Jacobson's famous work in the mid-1980s to combat Internet congestion: The Evolution of Internet Congestion. As regular readers of my blog know, I consider the TCP congestion control algorithm to be one of the most fascinating algorithms ever designed, and 25 years later it continues to reward careful study.

What is interesting is that over the history of the Internet the answers to how congestion should be managed have varied. The dominant answer in operation at any time has almost always had detractors and competitors – no universal consensus has ever existed. This is unsurprising given that (in some respects) this is a debate about what is "fair" and about what is economically efficient to deploy and operate.
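At the heart of Jacobson's design is the additive-increase/multiplicative-decrease (AIMD) rule: grow the congestion window linearly while things go well, halve it when loss signals congestion. A toy sketch of just that rule (real TCP adds slow start, timeouts, and fast retransmit on top):

```python
# Minimal sketch of the AIMD idea behind TCP congestion control.
# cwnd is the congestion window in segments; this toy ignores slow
# start and retransmission details and only shows the sawtooth.

def aimd_step(cwnd, loss_detected, increase=1.0, decrease_factor=0.5):
    """One round-trip of AIMD: grow linearly on success, halve on loss."""
    if loss_detected:
        return max(1.0, cwnd * decrease_factor)  # multiplicative decrease
    return cwnd + increase                       # additive increase

# Simulate a link that signals loss once the window exceeds 8 segments.
cwnd = 1.0
trace = []
for _ in range(20):
    cwnd = aimd_step(cwnd, loss_detected=cwnd > 8)
    trace.append(cwnd)
print(trace)  # the familiar sawtooth: climb to 9, halve to 4.5, climb again
```

The "fairness" debate the MITAS paper describes is largely about this rule: AIMD's halving-on-loss is what lets many competing flows converge to a fair share, and every proposed replacement has had to argue about what happens to that property.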

Although it's not really a follow-up, Ben Moseley's intriguing paper Functional Relational Programming: Out of the tar pit belongs in this company, as it is a deeply considered and fascinating response to one of the most famous computer science papers of all time, Fred Brooks's No Silver Bullet.

I enjoyed Moseley's paper, although I can see why his proposals are controversial. The core idea he proposes is to marry Functional Programming with Relational Database Theory, two great tastes that generally aren't seen in the same sentence. Moseley's paper is easy to read and certainly worth your time. One note, though: most of the Internet links to the paper are now dead, as Moseley's primary home page appears to have vanished. So either go to Archive.org, or do your own Google searching for "Out of the tar pit Ben Moseley" to find it.
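To give a flavor of the marriage: the paper's prescription is that essential state should live in relations, and everything else should be derived from them by pure functions. A loose sketch of that shape (the relation names and fields here are my own made-up illustration, not Moseley's notation):

```python
# Loose sketch of the Functional Relational Programming shape: essential
# state as relations (plain frozensets of tuples), derived data as pure
# functions over them. The example data is invented for illustration.

employees = frozenset({("alice", "eng"), ("bob", "sales"), ("carol", "eng")})
salaries  = frozenset({("alice", 100), ("bob", 90), ("carol", 110)})

def join(r1, r2):
    """Natural join on the first attribute -- a pure function of its inputs."""
    return frozenset((k, a, b) for (k, a) in r1 for (k2, b) in r2 if k == k2)

def dept_payroll(emps, sals, dept):
    """Derived 'relation': total salary for one department. No hidden state."""
    return sum(pay for (_, d, pay) in join(emps, sals) if d == dept)

print(dept_payroll(employees, salaries, "eng"))  # 210
```

The point of the exercise, in Moseley's terms, is that all the accidental complexity of mutable state and control flow is kept out of the derived computations entirely.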

Jumping to a completely different sort of follow-up, there was news this week about Vitaly Borker, one of the Internet's most extreme personalities: Web Businessman Sentenced for Threats

Mr. Borker was the subject of a November 2010 article in The New York Times in which he claimed that frightening consumers was a way to generate Internet publicity about his business, which purportedly elevated his profile in Google searches, generating more traffic and revenue. His theory was that any kind of online chatter lifted DecorMyEyes in Google’s rankings.

A few days after the article was published, Google announced on its blog that the company was “horrified” by Mr. Borker’s strategy and in response had already tinkered with its algorithm so that “being bad is, and hopefully will always be, bad for business in Google’s search results.”

But the best follow-up of all, in my grab-bag of follow-ups, has been Laurent Bossavit's quirky and yet compelling e-book: The Leprechauns of Software Engineering: How folklore turns into fact and what to do about it.

When we look closely at some of the “ground truths” of software engineering - the “software crisis”, the 10x variability in performance, the cone of uncertainty, even the famous “cost of change curve” - in many cases we find each of these issues pop up, often in combination (so that for instance newer opinion pieces citing very old papers are passed off as “recent research”).

Because the claims have some surface plausibility, and because many people use them to support something they sincerely believe in - for instance the Agile styles of planning or estimation - one often voices criticism of the claims at the risk of being unpopular. People like their leprechauns.

In fact, you’re likely to encounter complete blindness to your skepticism. “Come on,” people will say, “are you really trying to say that leprechauns live in, what, Africa? Antarctica?” The leprechaun-belief is so well entrenched that your opposition is taken as support for some other silly claim - your interlocutors aren’t even able to recognize that you question the very terms upon which the research is grounded.

The book is for sale at an unusual "pay what you want" price (I paid the suggested $10). I haven't read the entire book, but I'm quite enjoying the parts I've read so far.

Are you interested in software engineering, in how we educate our future engineers, and in how we can bring some real intellectual rigor to the software engineering profession? Give Bossavit's book a try; I don't think you'll regret it.
