Intel is risking the ire of hardcore gamers by suggesting that PC gaming can learn a trick or two from the world of consoles by altering a game's rendering resolution on the fly in order to boost performance.
Intel's latest chips, the Sandy Bridge series, pack some impressive graphical horsepower compared to the company's previous attempts, but still lag behind the performance of dedicated graphics from AMD or Nvidia. While that hasn't been a problem in the past - as there is a significant cost for an OEM to include discrete graphics with its product - AMD's move to the Fusion platform and the 'accelerated processing' concept threatens Intel's hold on the integrated graphics market.
While Intel is undoubtedly investigating the creation of high-performance graphics hardware - as John Hengeveld suggested to thinq_ during an interview at the International Supercomputing Conference earlier this year - it needs to develop a method for improving games performance on its existing hardware first. One step on that path is a technology called 'dynamic resolution rendering,' which promises to significantly improve a game's performance on slower hardware.
Developed by Doug Binks and first unveiled during a presentation at the Games Developers Conference 2011, the technique is a relatively simple one: by allowing a rendering engine to alter the resolution at which it renders in-game objects, performance and quality can be balanced on the fly to keep the framerate ticking over at an acceptable level on even the slowest hardware.
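The "on the fly" balancing described above amounts to a simple feedback loop: measure how long the last frame took, and nudge the rendering scale up or down accordingly. As a minimal sketch (not Binks's actual code - the function names, constants, and thresholds here are all illustrative assumptions):

```python
# Illustrative sketch of a dynamic-resolution feedback loop; all names
# and constants are assumptions, not taken from Binks's implementation.

TARGET_FRAME_MS = 16.7   # roughly 60 frames per second
SCALE_STEP = 0.05        # how aggressively to adjust per frame
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale, last_frame_ms):
    """Return a new resolution scale based on the last frame's cost."""
    if last_frame_ms > TARGET_FRAME_MS:
        scale -= SCALE_STEP          # too slow: render fewer pixels
    elif last_frame_ms < TARGET_FRAME_MS * 0.85:
        scale += SCALE_STEP          # comfortable headroom: sharpen up
    return max(MIN_SCALE, min(MAX_SCALE, scale))
```

A real engine would smooth the frame-time measurement and tune the step size, but the principle is the same: quality drops only when the hardware genuinely can't keep up.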
"Dynamic resolution rendering involves adjusting the resolution to which you render the 3D scene by constraining the rendering to a portion of a render target using a viewport, and then scaling this to the output back buffer," Binks explains in a white paper detailing his implementation of dynamic resolution rendering. "Graphical user interface components can then be rendered at the back buffer resolution, as these are typically less expensive elements to draw. The end result is that stable high frame rates can be achieved with high quality GUIs."
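The viewport arithmetic Binks describes can be sketched as follows. The render target stays at its full native size, so nothing needs reallocating when the scale changes - only the viewport rectangle moves - and the GUI is drawn after the upscale, at the back buffer's resolution. The `renderer` object and its methods here are hypothetical stand-ins for the underlying graphics API:

```python
# Minimal sketch of viewport-constrained rendering; the renderer
# interface is a hypothetical stand-in for a real graphics API.

def scene_viewport(native_w, native_h, scale):
    """Viewport (x, y, width, height) for the scaled 3D scene.

    The render target itself remains at native size, so changing the
    scale never forces a texture reallocation - only the viewport
    rectangle shrinks or grows.
    """
    return (0, 0, int(native_w * scale), int(native_h * scale))

def draw_frame(renderer, scale):
    vp = scene_viewport(renderer.width, renderer.height, scale)
    renderer.set_viewport(*vp)
    renderer.draw_scene()                   # 3D pass at reduced cost
    renderer.stretch_to_backbuffer(src=vp)  # upscale with a filter
    renderer.draw_gui()                     # GUI at full resolution
```

At a scale of 0.5, a 1920x1080 display would have its 3D scene rendered into a 960x540 corner of the render target - a quarter of the pixels - while chat windows, health bars and menus stay pin-sharp.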
It's a technique which is novel in the world of PC graphics, but relatively well known in the world of console gaming. While PC games typically ask the user to choose a resolution and then perform all rendering at that resolution, console games often compensate for relatively underpowered hardware by rendering internally at a lower resolution before upscaling the resulting image for output. While dynamic resolution rendering goes beyond such 'cheats,' it clearly has its origins in the world of consoles.
Since the white paper was written, several games companies have approached Binks to point out that the technique is already in common use in console titles, with LucasArts going public with details of its own take on the technology, which improves anti-aliasing without the usual performance penalty.
Binks' implementation is new for the PC world, however. As screen resolutions have increased, gamers have been forced to buy ever-faster hardware to meet the growing demands of rendering larger and larger scenes. Those on older hardware can drop the resolution to gain performance, but running a game at anything other than the native resolution of a digital display ruins the quality and leaves images looking muddy and blurred.
"Rendering the graphical user interface at the native screen resolution can be particularly important for role playing, real time strategy, and massively multiplayer games," Binks claims. "Suddenly, even on low-end systems, the player can indulge in complex chat messaging whilst keeping an eye on their teammates' stats."
Binks also believes that dynamic resolution rendering could have a serious impact on the capabilities of gaming laptops. Using sample code, Binks was able to drop the power draw of an Intel Core i7 chip to 0.7x its usual level by using a half-size viewport for the rendering while maintaining the graphical interface at the display's native resolution for improved clarity.
While the lowered rendering resolution results in a drop in image quality, Binks suggests a technique for improving the perceived quality with little to no cost in performance: temporal anti-aliasing. "Temporal anti-aliasing has been around for some time," Binks explains, "however, ghosting problems due to differences in the positions of objects in consecutive frames have limited its use. Modern rendering techniques are finally making it an attractive option due to its low performance overhead."
When used with dynamic resolution rendering, temporal anti-aliasing has a dual-purpose effect: if the dynamic resolution is lower than that of the back buffer - on a low-end system that needs help to maintain framerates, for example - it interpolates new pixels, creating the illusion of greater complexity than the original rendering resolution would suggest; when the dynamic resolution is higher or equal to that of the back buffer, it acts as a standard anti-aliasing system and reduces jagged edges at a very low performance cost.
Binks also suggests that dynamic resolution rendering can be used when effects such as motion blur are in place, increasing performance while maintaining a roughly equivalent image quality. In tests, Binks was able to reduce the frame time from 16.4ms to 11.3ms using his sample implementation without any appreciable cost to image quality.
While the implementation created by Binks is far from complete, it promises some impressive performance gains. He suggests a variety of improvements that could be made, including a similar technique applied to shadow maps, a separate control mechanism for particle systems, and the option to run higher-quality anti-aliasing on important portions of the display without affecting the rest of the image.
"Dynamic resolution rendering gives developers the tools needed to improve overall quality with minimal user intervention, especially when combined with temporal anti-aliasing," Binks concludes. "Given the large range of performance in the PC GPU market, we encourage developers to use this technique as one of their methods of achieving the desired frame rate for their game."
Intel isn't the only company that stands to gain should game developers adopt the technology: the technique is device agnostic, works across all graphics hardware, and could even make the leap to portable devices. But because dynamic resolution rendering promises to make games playable on lower-performance GPUs than ever before, it would be a particular boost for the company's integrated graphics processor range - something Intel will be keen to encourage in the face of AMD's APU line-up, which has the potential to eat away at Intel's bottom line.
The full white paper, complete with sample code, can be read over on the Intel Software Network.