ARM pushes Mali to drive UI evolution

As smartphones gain ever-faster processors, it's not just games that look to harness that power. We chat to ARM's Ian Smythe about the evolution of the user interface, and what that means for the humble smartphone.

Smythe is the director of marketing for ARM's Media Processing Division, formed in 2007 following the company's decision to develop its own graphics processing technology by acquiring GPU specialist Falanx back in 2006.

"Effectively, in four years we've come from 'okay, we want to do GPUs' to, 'we have world-class, proven, ecosystem-supported, delivered technology in a tier-one handset,'" Smythe explained during an interview with thinq_. "I think that's not bad going."

The company's flagship technology, the Mali range of graphics processors, is seeing rapid take-up in the market and was recently chosen by Samsung to form the heart of its Galaxy S II Android-based smartphone. The combination of ARM's Cortex-A9 CPU and Mali GPU technology - hidden behind a black-box solution that Samsung simply calls a 'dual-core application processor' - is smashing benchmarks and proving that the British chip designer has what it takes to compete with the likes of Qualcomm's Adreno and Nvidia's Tegra.

"When you think of graphics everyone thinks of mobile gaming, right? It's the first thing anyone turns to: 'Ooh, yeah, mobile gaming, that's what you need graphics for.'" Smythe joked. "Okay, yes, mobile gaming is important, but the other big change - and something which people don't necessarily associate with graphics - is UI. The user interface is pretty complex - take the Android homescreen, which might have an active wallpaper which is using OpenGL ES 2.0 rendering, and then it has some transparency...

"You'll find that there are up to six or seven layers of transparency, and even on a wide-VGA screen size you're potentially rendering things up to seven times because of all these levels of transparancy. All of a sudden you've got a great big graphics load, and the CPU's not necessarily able to cope. In terms of the evolution of Android, we're seeing that, increasingly, OpenGL ES is required with hardware acceleration, just to do the homescreen, just to do the user interface."

That evolution Smythe refers to is exemplified in Honeycomb, Google's tablet-centric release of the Android mobile platform. Unlike its smartphone-targeted variant, Honeycomb features a smart 3D 'holographic' interface and adds a language called RenderScript that lets third parties harness the same technology in their applications. It's pretty, but it's also pretty draining in terms of system resources - the real reason the smartphone version doesn't get the same user interface.
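RenderScript kernels are written in a C99-derived language and run across whatever compute resources the device offers. As a minimal, purely illustrative sketch - the package name and the dimming effect below are hypothetical, not drawn from Honeycomb's own UI code - a per-pixel kernel looks something like this:

    #pragma version(1)
    #pragma rs java_package_name(com.example.holo) /* hypothetical package */

    /* root() is invoked once per element of the input allocation;
       here it simply dims each pixel - the kind of per-pixel work
       Honeycomb can offload from the CPU. */
    void root(const uchar4 *v_in, uchar4 *v_out) {
        float4 pixel = rsUnpackColor8888(*v_in); /* bytes to floats  */
        pixel.rgb *= 0.5f;                       /* halve brightness */
        *v_out = rsPackColorTo8888(pixel);       /* floats to bytes  */
    }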

While it's possible to look back at earlier versions of Android and see how the technology has progressed to make ever-increasing demands on the system's compute resources, it's also possible to look into the future and see that demand continue to grow. Already, ARM licensees like Qualcomm and Nvidia are preparing quad-core processors, and Chinese gadget maker Meizu has confirmed plans to release a quad-core smartphone by September. Even if that power isn't harnessed immediately - as was the case when the first dual-core smartphones launched - it will be in the very near future.

"There's just this massive, massive complexity that you can push forward as people figure out how to make these very complicated compute machines easier to use. The UI experience - you know, we want to interface, we like touch, and in fact we like gesture - we'd prefer gesture to be better, and oh look! That needs more maths," Smythe explained. "We're looking at a gesture UI and a 3D UI."

That's the Media Processing Division's main aim, Smythe explained. "We're trying to deliver the ultimate user experience by combining the best of CPU and GPU to put massive amounts of processing power into the hands of developers. You know that, as ARM, we're not going to be the guys that are creating the applications, right? What we're going to be doing is working with that ecosystem to say 'guys, this is what we're giving you, what do you need, how do we go through and do this,' and going and talking to them, talking to the partners, and saying 'this is what's coming, how can we do it better? How can we help you achieve the next level?'"

While gesture-based 3D user interfaces might be a few generations into the future, that extra processing power is already heralding a sea change in the way people interact with their devices. "That content that we're developing, that user experience, has to be available across multiple screens," Smythe explained. "Increasingly, people want to move their content between screens. All the content will not be the same, but it's going to look the same.

"In mobile," Smythe explained, "a lot of that will require increasing innovation - we're trying to find ways to deliver increased performance in an embedded space without burning all those watts just by ramping up the clock speeds or by sticking more memory on the device. Those options aren't generally available to us in a mobile platform.

"Screens are getting bigger. I have a device in my hand at the moment that can output 1080p - it's a phone, but it can output 1080p, which I think is slightly mad, but apparently I'm one of the few that think like that," Smythe joked. "We're not accepting blocky images, we don't accept poor animation. What that means, in terms of quality: do you know what? That means more maths per pixel. Essentially, everything we do that changes what we put on the screen means we have to do more maths. At a very simple level, it's more maths."

With games publishers already looking to bridge the gap between mobile and console gaming with buy-once, play-everywhere titles, the smartphone is evolving into a more flexible device with greater power than ever before. IBM, whose Personal Computer concept reached the ripe old age of 30 today, claims we are entering the 'post-PC' era - and companies like ARM and its licensees appear to be at the forefront of that revolution in computing.