By Adam Mingay, Client Partnerships @ UNIT9
With the new iPhone 12 set to be unleashed on 13th October, you’d be forgiven for thinking we will face another deluge of misplaced hype. After all, with the iPhone now in its second decade, there are no giant development leaps left to make, right?
Wrong. The upcoming launch will be a gamechanger. And it’s all thanks to the phone’s new LIDAR (light detection and ranging) sensor, aka the mysterious fourth camera dot, which is set to scale up AR (augmented reality) like never before while placing a stake in the ground for Apple’s new, and potentially culture-defining, AR ecosystem.
LIDAR sensors fire pulses of laser light (typically in the near-infrared) and time their reflections to precisely map out environments. Unlike a normal camera sensor, which captures colour, LIDAR continuously measures depth, contextualising the world around it as a point cloud. This means that AR elements, such as digital characters or objects, can be embedded into a user’s real-world surroundings in a super-efficient, accurate and convincing way.
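For developers, this scene understanding surfaces through Apple’s ARKit framework. As a rough sketch only (assuming a RealityKit ARView is already on screen), an app would opt into the LIDAR-backed scene mesh and per-pixel depth along these lines:

```swift
import ARKit
import RealityKit

// Minimal sketch: ask ARKit for the LIDAR-backed scene mesh and per-pixel depth.
// Assumes an ARView already exists in the app's view hierarchy.
func startScanning(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction (a live triangle mesh of the room) is only offered
    // on LIDAR-equipped devices, so check support before opting in.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Per-frame depth maps derived from the LIDAR point cloud.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    arView.session.run(config)
}
```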
LIDAR has been around for 60 years, aiding everything from the Apollo missions to mining, but only now has it shrunk enough to become an everyday technology. Although similar depth-sensing hardware is already available on other smartphones, like the time-of-flight sensor on the Galaxy S20+, Apple’s foray into LIDAR with its flagship device signals what a gamechanger this could be for handheld AR experiences.
By enabling far more detailed tracking of objects, LIDAR paves the way for AR objects and characters to jump from one surface to another with far greater accuracy. This, naturally, increases a scene’s potential interactivity. And when content becomes more seamlessly interactive, it becomes more memorable and engaging – the Holy Grail for marketers.
One of the new iPhone’s most significant AR advancements is enhanced ‘occlusion’, where real-world objects that sit between the camera and an AR element correctly hide it. Whereas ‘traditional’ AR was limited because the AR element always rendered in front of everything else, occlusion means that an AR element, a superhero for example, can now disappear behind the sofa. This further blurs the lines between the augmented and real worlds and upholds the suspension of disbelief.
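In ARKit and RealityKit terms, this is largely a matter of opting in. The sketch below assumes the same ARView and world-tracking configuration as in the previous snippet; it turns on occlusion against the LIDAR scene mesh and, where supported, against people standing in front of the content:

```swift
import ARKit
import RealityKit

// Minimal sketch: let the real room (and any people in it) hide virtual
// content that sits behind them.
func enableOcclusion(on arView: ARView, using config: ARWorldTrackingConfiguration) {
    // Use the LIDAR scene mesh so walls, sofas and tables occlude virtual objects.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    // Let people walking in front of the content hide it too, where supported.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    arView.session.run(config)
}
```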
To get a bit techie, LIDAR operates at the photon level at nanosecond speeds, allowing a scene to be mapped almost instantly. Combined with the iPhone 12’s rumoured A14 Bionic chip, this delivers both a more detailed understanding of the scene and a far faster set-up time than AR applications normally require. The iPhone 12 thus becomes our AR oyster, with creative potential spanning full-environment video games, full-room shopping trials, plus full body and face tracking augmentation. The possibilities are near endless:
A 360-degree place to play:
As an early indicator of what’s achievable, the latest iPad Pro, released earlier in 2020, already includes a LIDAR sensor, which Apple showcased with the game ‘Hot Lava’ to demonstrate how the sensor will take AR gaming to the next level. No longer will AR games be manually placed just on the floor in front of a user. Now, the entire room can be used as a canvas for immersive gaming.
More sophisticated ‘try before you buy’:
Occlusion will be a key feature in interior design and retail apps, allowing homeowners to place furniture in their real-world spaces before buying. While many ‘try before you buy’ AR experiences are already available, LIDAR means we can now place a side table in a room and walk around it to see how it will really look. And LIDAR will be a boon to fashion too. The sensor’s enhanced accuracy enables razor-sharp people tracking, meaning clothing and accessories for multiple people in the same field of view can be changed with a simple swipe.
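To make the furniture example concrete, here is a minimal ARKit/RealityKit sketch of the placement step: the user taps the screen, a raycast finds the real surface at that point, and a model is anchored there. The ‘SideTable’ asset name is a placeholder for illustration, not a real file.

```swift
import ARKit
import RealityKit

// Minimal sketch: anchor a virtual side table on the surface the user tapped.
// "SideTable" is a hypothetical .usdz asset bundled with the app.
func placeTable(in arView: ARView, at screenPoint: CGPoint) {
    // Raycast from the tap into the LIDAR-understood scene, looking for a
    // horizontal surface such as the floor.
    guard let hit = arView.raycast(from: screenPoint,
                                   allowing: .estimatedPlane,
                                   alignment: .horizontal).first else { return }

    // Pin an anchor at the hit location and attach the model to it.
    let anchor = AnchorEntity(world: hit.worldTransform)
    if let table = try? Entity.load(named: "SideTable") {
        anchor.addChild(table)
        arView.scene.addAnchor(anchor)
    }
}
```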
Taking AR filters to a whole new level:
LIDAR does away with portrait mode’s lengthy focus time by locking onto a subject almost instantaneously. And when it comes to AR filters, tracking of a user’s body and head will be more accurate than ever. Just imagine how great those puppy ear filters are going to look now.
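Body tracking itself is exposed in ARKit as its own configuration. As a rough sketch (assuming a plain ARSession and a caller that keeps the delegate alive), reading the tracked joints to which a filter or virtual garment would be attached might look like this:

```swift
import ARKit

// Minimal sketch: track a person's body so a filter or garment can follow
// their joints. The delegate here simply logs when a body is detected.
final class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // body.skeleton exposes joint transforms (head, shoulders, hips, ...)
            // that augmented clothing or puppy ears would be attached to.
            print("Tracking body with \(body.skeleton.jointModelTransforms.count) joints")
        }
    }
}

func startBodyTracking(session: ARSession, delegate: BodyTrackingDelegate) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.delegate = delegate
    session.run(ARBodyTrackingConfiguration())
}
```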
New AR ecosystem:
Apple is rumoured to be releasing its much-anticipated Apple Glasses in the not-too-distant future, and combined with LIDAR, this points to Apple creating its very own AR ecosystem. Much like when Apple launched the iPod and changed the world of music forever, the powerful combo of Apple Glasses and LIDAR could have a tangible impact on people’s everyday lives.
LIDAR is a breakthrough scanner that works indoors and outdoors, day and night, for humans and objects alike. Although the technology isn’t new, its arrival in a mainstream mobile device is the first of many steps that will help shape the augmented world of tomorrow. There has never been a better time for any brand to get on board the AR mothership.