The launch of the iPhone 8 promises to bring with it augmented reality (AR) technology. When the new device launches, we could see the creation of apps that blur the boundaries between the digital world and reality in new and imaginative ways. So, will Apple’s new software win over AR fans, and what could this mean for the future of app development? Adam Fleming, chief technology officer of Apadmi, discusses.
Unveiled at this year’s WWDC event, and due to be released in September, iOS 11 will be accompanied by another eagerly anticipated launch: Apple’s ARKit. Intended to help businesses create unique, innovative AR apps via their own development teams, ARKit puts the power of AR development into the palms of companies across the world. It will make the lives of skilled developers much easier, allowing them to implement graphical, scene-based apps in AR environments far more quickly.
It’s already been labelled the most important platform that Apple has created since the App Store in 2008, and developers are gearing up for a new generation of handsets with more scope for creativity than ever before – but what will this mean for the development of AR?
The iPhone 8 and AR technology
A quick Google search should tell you all you need to know about the rumoured specs and features of the iPhone 8. However, the one that shows the greatest promise for AR is a new camera that can understand 3D space and depth perception.
Rumours around this technology were first sparked by Apple’s acquisition of PrimeSense in 2013. PrimeSense was one of the first companies to explore this technology in real depth, and was involved in the development of the Microsoft Kinect, which brought depth-aware motion capture to the home entertainment market.
There was also a degree of excitement around Google’s announcement of Project Tango, which brought the same technology to mobile. The need for specialist hardware limited the take-up and utility of Tango, but if Apple manages to bring this breakthrough to the mass-consumer market, the implications reach far beyond consumer apps; enterprise solutions in manufacturing, construction and utilities (to name a few) would most likely spring up overnight.
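The core idea behind a depth-aware camera can be illustrated with the standard pinhole camera model: given a per-pixel depth reading plus the camera’s focal length and principal point, each pixel can be un-projected into a 3D point in space. The sketch below is illustrative only – the `Point3D` type, the `unproject` function and the intrinsic values are assumptions for the sake of the example, not any Apple API:

```swift
import Foundation

// A 3D point in camera space (metres).
struct Point3D { let x, y, z: Double }

/// Un-project a pixel (u, v) with a measured depth into camera space
/// using a simple pinhole model. fx/fy are focal lengths in pixels and
/// (cx, cy) is the principal point -- all illustrative values here.
func unproject(u: Double, v: Double, depth: Double,
               fx: Double, fy: Double, cx: Double, cy: Double) -> Point3D {
    let x = (u - cx) * depth / fx
    let y = (v - cy) * depth / fy
    return Point3D(x: x, y: y, z: depth)
}

// A pixel at the principal point maps straight down the optical axis:
let p = unproject(u: 320, v: 240, depth: 2.0,
                  fx: 600, fy: 600, cx: 320, cy: 240)
print(p.x, p.y, p.z) // 0.0 0.0 2.0
```

Repeating this for every pixel turns a depth frame into a point cloud, which is what makes room-scale measurement and object placement possible.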
Could Apple change the course of AR?
AR has quickly become one of the most talked-about technologies - from Microsoft’s HoloLens glasses, which enable the wearer to digitally interact with their physical environment thanks to built-in holographic processors, to Google’s updated Glass Enterprise Edition, which hopes to revolutionise industries that need an element of AR in a hands-free environment.
However, with so much of this tech already explored on the market, why all the fuss? Is this just another fad? While examples such as those above prove that AR isn’t exactly something new, what Apple has done is combine a set of technologies to create a product that is available “out of the box” on one of the world’s biggest platforms.
Simply put, by building this into the OS and creating handsets (and tablets) with mass appeal that support these capabilities, Apple will have created a unified platform that can support features that are otherwise only available in a fragmented way. They’ve effectively created a platform for AR that will be in the pockets of millions from the first week of deployment – with a developer community that is equally enthusiastic, and ready to build solutions that make full use of its advantages.
Not only will this help businesses to improve their customer experience and enable them to interact with customers directly, it will also allow people to shop, travel and be entertained in entirely new ways.
AR and what’s available right now
Imagine navigation apps providing real-world directions overlaid on what you can see through the camera, fashion outlets where you can “try on” clothes without entering a changing room, and interior design that takes place in your living room, allowing you to see exactly how a new sofa will fit into your space, or how big that expensive painting really is.
There are already plenty of examples of AR being trialled – Swedish furniture retailer IKEA has already partnered with Apple to develop an AR app that will allow customers to overlay images of furniture on a real-world view of rooms in their homes. The 3D camera would then scan the room and measure whether the furniture will fit into the space, without homeowners having to dig out their tape measure.
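To give a flavour of the measuring step, here is a minimal sketch of the fit check such an app might run once the room has been scanned. Everything in it – the `Footprint` type, the `fits` function and the dimensions – is hypothetical; a real app would work against the full 3D scene rather than simple bounding boxes:

```swift
import Foundation

// Bounding boxes in metres: illustrative stand-ins for the measurements
// a depth-aware camera could produce when scanning a room.
struct Footprint { let width, depth, height: Double }

/// Returns true if the furniture fits the free floor space, allowing it
/// to be rotated 90 degrees. A deliberately simple check for the sake
/// of the example.
func fits(furniture: Footprint, space: Footprint) -> Bool {
    guard furniture.height <= space.height else { return false }
    let asIs = furniture.width <= space.width && furniture.depth <= space.depth
    let rotated = furniture.depth <= space.width && furniture.width <= space.depth
    return asIs || rotated
}

let sofa   = Footprint(width: 2.1, depth: 0.9, height: 0.8)
let alcove = Footprint(width: 1.0, depth: 2.3, height: 2.4)
print(fits(furniture: sofa, space: alcove)) // true -- it fits rotated
```

The interesting part for the user isn’t the arithmetic, of course – it’s that the measurements come straight from the camera rather than a tape measure.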
AR could also make boring consumer tasks more enjoyable – car-maker Genesis has already announced that it’s ditching its conventional vehicle manual in favour of an AR app for its latest models. Not only will it be home to dozens of ‘how-to’ videos and advice articles, but AR features will guide you around the car, helping you to identify parts through the device’s camera, with the option to tap for more information in certain areas.
These are just a handful of uses, but thanks to the release of Apple’s ARKit, we can expect more of this innovation in future.
Eye-tracking and acquisitions
One area of AR that’s received a lot of attention is eye-tracking – particularly since Apple recently acquired SensoMotoric Instruments. The German company specialises in developing the eye-tracking technology behind VR and AR headsets, which enables the software to react to the eye movements of the user with speed and precision.
Eye-tracking makes the use of AR in non-phone scenarios far more tenable. Being able to focus the attention of the augmentation technology on whatever the user is looking at allows for much cleaner, more efficient interfaces – not to mention it avoids the effort wasted on processing elements of the scene that the user has no interest in. Whilst Apple doesn’t yet have an augmented-vision product (like Google Glass), if it’s serious about AR – and all signs suggest it is – it’s too big an area for the company not to be working in.
As developers, we find it really exciting to see this kind of investment at a hardware level, and we can already foresee the simple but effective differences it can make. For example, tracking when a user has finished reading a page of text or looking at a diagram sounds trivial, but for those who need hands-free technology, such as surgeons or mechanics (or just those of us squashed onto the Tube at rush hour), automatically turning the page can make a huge difference.
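As a rough illustration, a page-turn heuristic could fire once the reader’s gaze has dwelt near the bottom of the screen for about a second. Apple has published no eye-tracking API, so the `GazeSample` format and `shouldTurnPage` function below are entirely hypothetical – a sketch of the logic, not an implementation:

```swift
import Foundation

// Gaze samples: where on screen the user is looking, and when.
// y runs from 0.0 (top of the page) to 1.0 (bottom); time is in seconds.
struct GazeSample { let y: Double; let time: Double }

/// Turn the page once the most recent samples show the gaze resting
/// near the bottom of the page (y > threshold) for at least `dwell`
/// seconds without looking away.
func shouldTurnPage(samples: [GazeSample],
                    threshold: Double = 0.9, dwell: Double = 1.0) -> Bool {
    // Walk back from the newest sample while the gaze stays at the bottom.
    var i = samples.count - 1
    while i >= 0 && samples[i].y > threshold { i -= 1 }
    guard i < samples.count - 1 else { return false } // not at the bottom now
    return samples.last!.time - samples[i + 1].time >= dwell
}

// 13 samples, 0.1 s apart, all at the bottom of the page:
let gaze = (0...12).map { GazeSample(y: 0.95, time: Double($0) * 0.1) }
print(shouldTurnPage(samples: gaze)) // true -- 1.2 s of dwell
```

The dwell requirement is the important design choice: it stops a single stray glance at the bottom of the page from flipping it.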
In practical terms too, gathering analytics to determine which parts of an image or video a user is most interested in will help companies create better, more tailored experiences for the user, as well as more interesting interaction methods.
With ARKit already available in beta form, developers have a clearer idea of where Apple is focusing its attention. They’re already exploring the exciting possibilities this has opened up, and it’ll be interesting to see just how sophisticated ARKit becomes over the next few years – the prospect of a mainstream AR platform has never looked more promising.
Adam Fleming, Chief Technology Officer, Apadmi
Image Credit: Ahmet Misirligul / Shutterstock