Conventional wisdom holds that when it comes to gaming, your CPU doesn’t matter all that much. Ever since 3dfx launched the original Voodoo 3D accelerator, we’ve strived to offload more and more work to a dedicated GPU, leaving the CPU free to handle other tasks. Nvidia and AMD have driven this trend for the past 12 years, first with dedicated hardware transform and lighting (T&L) and again when Nvidia launched the G80 back in 2006.
Conventional wisdom, it turns out, is moderately wrong. Scott Wasson at Tech Report has just published a major investigation into the topic. Instead of simply measuring average frames per second (fps) or providing a Min/Max/Average grouping, Tech Report also breaks down the data by frame time (graphed over the benchmark run). The term “frame time” refers to the time (in milliseconds) that it takes to render each frame. For example, a perfectly steady 60 fps rate has a frame time of 16.67 milliseconds (1/60 of a second).
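The relationship between the two metrics is just a reciprocal. A minimal sketch (the function names here are illustrative, not from Tech Report's tooling):

```python
def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds implied by a perfectly steady frame rate."""
    return 1000.0 / fps

def fps_from_frame_time(ms: float) -> float:
    """Steady frame rate implied by a per-frame render time in milliseconds."""
    return 1000.0 / ms

print(round(frame_time_ms(60), 2))   # 16.67 ms per frame at 60 fps
print(round(frame_time_ms(30), 2))   # 33.33 ms per frame at 30 fps
```

Any single frame that takes longer than 16.67ms to draw has, for that instant, dropped below the 60 fps mark, even if the one-second average still reads 60 fps.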
Frame times are a less familiar metric than frame rate, but they capture an issue that standard fps measurement doesn’t. Even a Min/Max/Average plot is a graph of static values, measured in one-second intervals. The human visual system, however, is capable of detecting variations in motion speed over a considerably shorter period of time.
Wide variations in frame time create split-second stutters that the eye sees and notices, even if the game runs at an average of 60 fps. This is sometimes referred to as microstuttering, though that term is generally reserved for describing a particular type of GPU latency when rendering in a multi-GPU configuration.
Examining CPU performance through frame times gives a very different perspective on how the chips stack up. Tech Report tested multiple games on a vast range of CPUs; I’ve borrowed a single comparison between two chips in The Elder Scrolls V: Skyrim to illustrate the difference.
Here you can see the average frame rate, the time each chip spent beyond 16.7 milliseconds (that is, the total time the CPU spent on frames that took longer than 16.7ms to draw), and the 99th percentile frame time (the draw time that 99 per cent of frames came in under). The fps difference between the Core i5 and AMD’s X4 980 is significant — the i5 is 30 per cent faster — but the Phenom II is still way above the 60 fps threshold. Compare the amount of time spent beyond the 16.7ms mark, however, and the gap becomes an ocean. The X4 spends more than five times as much time below the 60 fps mark. This translates into more stutters of the sort that the human eye picks up on.
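Both of these frame-time metrics are straightforward to compute from a list of per-frame render times. The sketch below assumes a simple nearest-rank percentile and a 16.7ms (60 fps) threshold; it illustrates the idea rather than reproducing Tech Report's exact methodology:

```python
import math

def time_beyond_threshold(frame_times_ms, threshold_ms=16.7):
    """Total milliseconds spent past the threshold, summed over slow frames.

    Each frame slower than the threshold contributes only its excess time,
    so one 40ms frame counts as 23.3ms 'beyond 16.7'.
    """
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

def percentile_99(frame_times_ms):
    """Frame time that 99 per cent of frames come in under (nearest-rank)."""
    ordered = sorted(frame_times_ms)
    rank = max(0, math.ceil(0.99 * len(ordered)) - 1)
    return ordered[rank]

# Hypothetical run: 98 smooth frames plus two stutters. Average fps looks
# healthy, but the time-beyond and percentile metrics expose the hitches.
frames = [15.0] * 98 + [40.0, 25.0]
print(round(time_beyond_threshold(frames), 1))  # 31.6 ms spent beyond 16.7ms
print(percentile_99(frames))                    # 25.0 ms 99th percentile
```

This is why two chips with similar averages can feel very different in play: the averages hide how the slow frames are distributed.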
Relative performance between Intel and AMD depends on the game, but there’s a significant gap between the two companies overall. The Phenom II X4 980 remains AMD’s best overall gaming CPU and many of the Intel chips are grouped in a relatively small space. Overall, performance is still mostly GPU dependent, but this work shines a light on areas where CPU choice can still make a modest difference.
You can read more at Tech Report.