Monday, February 4, 2013

Checkerboard Puzzle, Moore's Law, and Growth Prospects

My father the mathematician first posed the checkerboard puzzle to me back in grade-school, perhaps on some rainy Saturday. His version of the story went something like this:

The jester performs a great deed, and the king asks him how he would like to be rewarded. The jester is aware that the king is a highly volatile individual, and if the jester asks for too much, the king might just kill him then and there. The jester also knows that the king views his promise as sacred, so if the king says "yes" to the jester's proposal, then the king will honor that promise. So in a way, the jester's problem is how to ask for a lot, but have the king at least initially think it's not very much, so that the king will give his consent.

So the jester clowns around a bit and then says: "Here's all I want. Take this checkerboard. On the first square, put one piece of gold. On the second square, two pieces. On the third square, four pieces, and on the fourth square, eight pieces. Double the amount on each square until you reach the end of the checkerboard."

In the story, the king laughs at this comic proposal and says, "Your great deed was so wonderful, I would have happily done much more than this! I grant your request!"

But of course, when the king starts hauling up gold pieces from the treasury, he will discover that the final square of the checkerboard requires 2 raised to the 63rd power, or about 9 quintillion gold pieces (that is, 9 followed by 18 zeros).
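The jester's arithmetic is easy to check. Here's a minimal sketch in Python, whose integers have arbitrary precision, so the big numbers come out exactly:

```python
# Square n of the checkerboard holds 2**(n-1) gold pieces, so the
# 64th and final square holds 2**63, and the board as a whole holds
# 1 + 2 + 4 + ... + 2**63 = 2**64 - 1 pieces.

last_square = 2 ** 63
whole_board = 2 ** 64 - 1

print(f"{last_square:,}")  # 9,223,372,036,854,775,808 -- about 9.2 quintillion
print(f"{whole_board:,}")  # 18,446,744,073,709,551,615 -- nearly twice that
```

Note a handy property of doubling: each new square holds one piece more than all the previous squares combined, which is why the whole board comes to almost exactly twice the final square alone.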

I've had some sense of the power of exponential growth ever since. But what I hadn't thought about is the interaction of Moore's Law and economic growth. Moore's Law is of course named for Gordon Moore, one of the founders of Intel, who noticed the pattern back in 1965 in a paper containing the following graph, which shows how much it cost to produce a computer chip with a certain number of components. Notice that the number of components on the horizontal axis and the cost figures on the vertical axis are both plotted on logarithmic scales (specifically, each step up the axis is a change by a factor of 10). The key takeaway was that the number of transistors ("components") on an integrated circuit was doubling about every two years, making computing power much cheaper and faster.
This chart from Intel co-founder Gordon Moore's seminal 1965 paper showed that the cost per transistor decreased with new manufacturing processes even as the number of transistors on a chip increased.

Ever since I started reading up on Moore's law in the early 1980s, there have been predictions in the trade press that it will soon reach technological limits and come to an end. But Moore's law marches on: indeed, the research and innovation targets at Intel and other chip-makers are defined in terms of making sure that Moore's law continues to hold for at least a while longer. Stephen Shankland offers a nice accessible overview of the current situation in an October 15, 2012, essay on CNET: "Moore's Law: The rule that really matters in tech." (The Gordon Moore graph above is copied from Shankland's essay.)

As Shankland writes: "To keep up with Moore's Law, engineers must keep shrinking the size of transistors. Intel, the leader in the race, currently uses a manufacturing process with 22-nanometer features. That's 22 billionths of a meter, or roughly a 4,000th the width of a human hair." He cites a variety of industry and research experts to the effect that Moore's law has at least another decade to run--and remember, a decade of doubling every two years means five more doublings!
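The back-of-the-envelope claim is just repeated doubling, which a one-line sketch makes concrete:

```python
# A decade of doubling every two years is five doublings,
# for a 2**5 = 32-fold increase in capacity.
doublings = 10 // 2     # five doublings in ten years
print(2 ** doublings)   # 32
```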

It's hard to wrap one's mind around what it means to say that the power of microchip technology will increase by a factor of 32 (doubling five times) in the next 10 years. A characteristically intriguing survey essay from the January 10 issue of the Economist on the future of innovation uses the checkerboard analogy to think about the potential effects of Moore's law. Here's a comment from the Economist essay:

"Ray Kurzweil, a pioneer of computer science and a devotee of exponential technological extrapolation, likes to talk of “the second half of the chess board”. There is an old fable in which a gullible king is tricked into paying an obligation in grains of rice, one on the first square of a chessboard, two on the second, four on the third, the payment doubling with every square. Along the first row, the obligation is minuscule. With half the chessboard covered, the king is out only about 100 tonnes of rice. But a square before reaching the end of the seventh row he has laid out 500m tonnes in total—the whole world’s annual rice production. He will have to put more or less the same amount again on the next square. And there will still be a row to go.

Erik Brynjolfsson and Andrew McAfee of MIT make use of this image in their e-book “Race Against the Machine”. By the measure known as Moore’s law, the ability to get calculations out of a piece of silicon doubles every 18 months. That growth rate will not last for ever; but other aspects of computation, such as the capacity of algorithms to handle data, are also growing exponentially. When such a capacity is low, that doubling does not matter. As soon as it matters at all, though, it can quickly start to matter a lot. On the second half of the chessboard not only has the cumulative effect of innovations become large, but each new iteration of innovation delivers a technological jolt as powerful as all previous rounds combined."
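The rice figures in the fable can be roughly reproduced. In the sketch below I assume an average grain mass of 25 milligrams, a hypothetical round number, not a figure from the essay:

```python
GRAIN_KG = 25e-6  # assumed mass of one grain of rice: 25 mg

def cumulative_tonnes(squares):
    """Total rice on the first `squares` squares of the board, in metric tonnes."""
    grains = 2 ** squares - 1   # 1 + 2 + 4 + ... doubles to 2**n - 1 grains
    return grains * GRAIN_KG / 1000

# Half the board: on the order of a hundred tonnes.
print(round(cumulative_tonnes(32)))         # ~107 tonnes
# Deep into the seventh row: hundreds of millions of tonnes,
# roughly the scale of the world's annual rice production.
print(round(cumulative_tonnes(54) / 1e6))   # ~450 million tonnes
```

Under this assumed grain mass, the totals land in the same ballpark as the fable's: about a hundred tonnes at the halfway point, and hundreds of millions of tonnes before the seventh row is done.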

Now, it's of course true that doubling the capacity of computer chips doesn't translate in a direct way into a higher standard of living: there are many steps from one to the other. But my point here is to note that many of us (myself included) have been thinking about the changes in electronics technology a little too much like the king in the checkerboard story: that is, we think of something doubling a few times, even 10 or 20 times, and we know it's a big change, but it somehow seems within our range of comprehension.

But when something has already been doubling every 18 months or two years for a half-century--and it is continuing to double!--the absolute size of each additional doubling is starting to get very large. I lack the imagination to conceive of what will be done with all this cheap computing power in terms of health care, education, industrial processes, communication, transportation, entertainment, food, travel, design, and more. But I suspect that these enormous repeated doublings, as Moore's law marches forward in the next decade and drives computing speeds up and prices down, will transform lives and industries in ways that we are only just starting to imagine.