
Is Moore’s Law still relevant to desktop speeds?

Is Moore's Law still relevant? The term actually refers to a prediction by Intel co-founder Gordon Moore in 1965 that transistor density would double every year, a rate he revised in 1975 to every two years. That doubling of transistors roughly every two years is still on pace, even though the industry keeps requiring new technology to make it happen.
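To get a feel for how relentless that doubling is, here's a back-of-the-envelope sketch in Python. The 2,300-transistor count of Intel's 1971 4004 is a historical figure; the projection function is just the compounding math, not anything Intel publishes:

```python
# Back-of-the-envelope check of Moore's Law (doubling every two years).
# Starting point: Intel's 4004 (1971), with roughly 2,300 transistors.
def projected_transistors(start_count, start_year, target_year, years_per_doubling=2):
    """Project a transistor count forward assuming periodic doubling."""
    doublings = (target_year - start_year) / years_per_doubling
    return start_count * 2 ** doublings

# 1971 to 2011 is 40 years, i.e. 20 doublings.
print(round(projected_transistors(2300, 1971, 2011)))
```

Twenty doublings from 1971 lands at roughly 2.4 billion transistors by 2011, which is indeed the ballpark for the largest chips of the early 2010s.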

For most of us in the PC space, we've taken Moore's Law to mean that the speed of our desktops will double every two years. In the first couple of decades of personal computers, that mostly meant rising clock speeds, going from the 2MHz of the original Intel 8080, to the 4.77MHz of the 8088 in the original IBM PC, to chips that ran at over 3GHz starting about a decade ago. But heat and power-leakage issues have largely kept clock speeds flat in recent years, so chip makers have turned to other methods to improve their chips, including more cores, multithreading, and improved microarchitectures. Still, the question remains: is performance actually improving?

For the past five years or so, I've been running pretty much the same spreadsheet tests on multiple computers, trying to gauge what new generations of PCs have to offer business users. (Plenty of other folks test rigs aimed at enthusiasts and gamers.) What sometimes gets lost in the incremental year-on-year gains is just how far we've come over a longer span.

As I was testing Intel's new Ivy Bridge processors and AMD's new Trinity processors on laptops, I went back and looked at how the results have changed over time.

Just looking at high-end desktop processors, the differences really stand out. Here's how Intel processors have fared while running a complex Monte Carlo simulation. (Note that some of these processors were overclocked; the base speed is listed in parentheses.)
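The actual spreadsheet isn't reproduced here, but a Monte Carlo workload of the same general shape (repeated random sampling to estimate a quantity) can be sketched in Python. This pi estimator is purely a hypothetical stand-in for the benchmark, not the test I run:

```python
import random
import time

def monte_carlo_pi(samples, seed=42):
    """Estimate pi by sampling random points in the unit square
    and counting how many fall inside the quarter circle."""
    rng = random.Random(seed)  # fixed seed keeps runs repeatable
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

# Time a million-sample run, stopwatch-style; wall time varies by CPU.
start = time.perf_counter()
estimate = monte_carlo_pi(1_000_000)
elapsed = time.perf_counter() - start
print(f"pi ~= {estimate:.4f} in {elapsed:.2f}s")
```

The point of a workload like this for benchmarking is that it is compute-bound and embarrassingly repetitive, so the elapsed time tracks raw per-core arithmetic throughput rather than disk or memory speed.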

As you can see, the improvement was really noticeable for a while, and then the test got so fast that I'm questioning the accuracy of my finger reflex on the stopwatch. Here's what it looks like as a graph:

And here is the same data for AMD processors:

And here's that as a graph:

I'm keeping the two lists separate, so you can more easily see the changes over time, but any way you look at it, we're seeing huge improvements. (For this article, I left out AMD's A-series APUs, known as Llano and Trinity, as they are slower than the CPU-only parts.)

While the improvements on the Monte Carlo simulation are clear, what I've been seeing on even bigger spreadsheets may be more meaningful: these are calculations that took a couple of hours a few years ago, so any reduction really matters. This chart shows the results on a real-world spreadsheet containing a very large data table, run on the fastest Intel desktop processors of the past five years:

Unlike transistor counts under Moore's Law, performance hasn't been doubling every two years, but over the past five years this calculation has gone from taking 90 minutes to under half that, a pretty impressive improvement.
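For comparison, here's what a true Moore's-Law pace would imply for that 90-minute job, assuming a clean doubling of speed every two years (a sketch of the arithmetic, not measured data):

```python
def projected_runtime(start_minutes, years, years_per_doubling=2):
    """Runtime of a fixed job if speed doubles every `years_per_doubling` years."""
    return start_minutes / 2 ** (years / years_per_doubling)

# A 90-minute job, with speed doubling every two years for five years,
# would drop to about 16 minutes. The observed result, "under half" of
# 90 minutes, works out to roughly one doubling over the whole period.
print(round(projected_runtime(90, 5), 1))
```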

Looking at this as a chart shows the trend line very clearly:

I've also seen some improvement, though not nearly as much, with AMD processors, despite the addition of more nominal physical cores. AMD's "Bulldozer" architecture in its FX series, in which two integer cores share a single floating-point unit and other components, just doesn't seem to perform well for this kind of work. But of course, not everyone works with this kind of data, and there's often a big price difference as well.

Still, the performance differences over time have been significant, especially on the Intel processors. If it's not quite doubling every two years, it's still a big improvement.

Of course, from the perspective of chip and PC makers, the problem is that most people simply don't notice these improvements. Gamers do, but they mostly focus on graphics anyway. For those of us who use our computers for calculations, new processors can matter, and it's great to see continued improvement.

Michael J. Miller is Chief Information Officer at Ziff Brothers Investments, a private investment firm. Mr. Miller, who was editor-in-chief at PC Magazine from 1991-2005, authors this blog for PC Magazine to share his thoughts on PC-related products. No investment advice is offered in this blog. All duties are disclaimed. Mr. Miller works separately for a private investment firm which may at any time invest in companies whose products are discussed in this blog, and no disclosure of securities transactions will be made.