Simon Gosling, Chief Strategy Officer at Quiet Mark, former Bidstack CMO and New Digital Age's monthly columnist, has over 25 years' experience in the industry and was previously Futurist at Unruly.
Last weekend, Apple announced that its next 10 years will include self-driving cars, computer glasses, and a much faster iPhone. Termed 'an iPhone replacement', the first version of its computer glasses is expected as a headset-style release by 2022, followed by a second device in 2023, more akin to sunglasses, with a thick frame to house a battery and processors.
Will the promise of computer glasses made by Google Glass back in 2014 finally become a reality within the next couple of years?
Certainly one of the main hurdles that Google Glass stumbled over back then was having to perform over 3G. How fast is 5G? Let's use an example. How long would it take to download the two-hour-long "Guardians of the Galaxy" movie? On 3G it took 26 hours, on 4G only six minutes, and on 5G it will take 3.6 seconds!
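The arithmetic behind those headline numbers is simple: download time is file size divided by throughput. A minimal sketch, assuming a roughly 3.5 GB file for a two-hour HD film and illustrative effective speeds (0.3 Mbps for 3G, 80 Mbps for 4G, 8 Gbps for 5G — assumed figures chosen to reproduce the quoted times, not network-spec maximums):

```python
# Back-of-envelope download-time calculator for the movie example above.
# All speeds are assumed effective throughputs, not theoretical peaks.
SPEEDS_MBPS = {"3G": 0.3, "4G": 80, "5G": 8000}

MOVIE_GB = 3.5  # assumed size of a two-hour HD film


def download_seconds(size_gb: float, mbps: float) -> float:
    """Seconds to transfer size_gb gigabytes at mbps megabits per second."""
    return size_gb * 8 * 1000 / mbps  # GB -> megabits, then divide by rate


for gen, mbps in SPEEDS_MBPS.items():
    print(f"{gen}: {download_seconds(MOVIE_GB, mbps):,.1f} s")
```

With these assumptions, 3G comes out at roughly 26 hours, 4G at around six minutes, and 5G at a few seconds — in line with the figures quoted above.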
A third technology that is progressing in leaps and bounds is object recognition/computer vision. Most of us are already familiar with Google Lens and its incredible visual search capabilities. If not, try it! Simply go to the Google search bar on your mobile and tap on that square lens icon with the blue dot in the middle, next to the microphone icon.
It will switch your device to the camera, and suddenly everything it sees, from flowers to celebrities' faces, clothes and certain buildings, is recognised and searchable. Asos is a retailer using object recognition to best effect with its 'Style Match' visual search tool, enabling consumers to snap or upload styles and find the same or similar items available from the online retailer.
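Under the hood, tools like these typically turn each image into a numeric "embedding" vector and then find the catalogue items whose vectors are closest to the query. A minimal sketch of that idea — the catalogue names and 4-dimensional vectors here are made up for illustration (a production system would get its embeddings from a trained neural network, not hand-written numbers):

```python
import math

# Toy visual-search index: each entry maps a catalogue item to a
# made-up 4-D feature vector standing in for a learned image embedding.
CATALOGUE = {
    "floral summer dress": [0.9, 0.1, 0.3, 0.0],
    "striped cotton shirt": [0.1, 0.8, 0.2, 0.4],
    "denim jacket": [0.2, 0.3, 0.9, 0.1],
}


def cosine(a, b):
    """Cosine similarity between two vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def style_match(query, catalogue, k=1):
    """Return the k catalogue items whose embeddings best match the query."""
    ranked = sorted(catalogue, key=lambda name: cosine(query, catalogue[name]),
                    reverse=True)
    return ranked[:k]


# A snapped photo whose (assumed) embedding resembles the floral dress:
print(style_match([0.8, 0.2, 0.25, 0.05], CATALOGUE))
```

The same nearest-neighbour principle scales from this three-item toy to a catalogue of millions of products; only the embedding model and the index structure get more sophisticated.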
A 3D maps game changer
But it's the combination of computer vision and 3D maps which is going to be the massive game changer. We've already seen the beginnings of this with the April 2019 launch of Snapchat's Landmarker tech. This allowed creators to bring landmarks across the world to life using the augmented reality tools in Snapchat's Lens Studio, with many more landmarks added last August.
Because Snapchat has millions of user-generated photos of these landmarks in its cloud, it has been able to use machine learning to teach its AI system to recognise these landmarks the moment our camera sees them, and then feed us content precisely anchored to those objects. This is more precise than GPS, which drifts and bounces off the metal frameworks of buildings — which explains why that blue dot on Google Maps doesn't quite follow our exact movements.
The future of entertainment
So what kind of entertainment and messaging might we expect when we combine these new computer glasses, 5G and advancements in computer vision? Well, Nexus Studios and Scape recently gave us a real taste of that future when they created the AR-based Samsung – AT&T 5G Fan Experience. Partnering with Samsung and AT&T, Nexus transformed AT&T Stadium (the world's first 5G-enabled stadium) into the world's first ARCloud-ready venue.
During the Dallas Cowboys NFL season, they brought fans closer to the action and their heroes via hyper-accurate, persistent, never-seen-before AR experiences that could be viewed in real time from any seat in the stadium, including photo-realistic holograms, games, live stats and more — all powered by 5G on a phone. That is precisely the type of experience we can expect to see in those Apple glasses of 2023.
Finally, where might this all be heading? And what impact might this 'always on' messaging have on our well-being and mental health? Well, not wishing to get too Black Mirror dystopian, but hoping to provide food for thought, I recommend viewing Hyper-Reality, a six-minute sci-fi short film made in 2016 by Keiichi Matsuda.
It presents a provocative and kaleidoscopic new vision of the future, where physical and virtual realities have merged, and the city is saturated in media.
As much as it's exciting to embrace new technologies and trends, it's important to stay mindful of when to disengage from them. I really like how my Apple Watch prompts me to stop and take time to focus on my breathing, and measures my activity. Let's hope that our iGlasses (or whatever they'll be called) look after us in similar ways.