What Apple’s current product line-up can teach us about their forthcoming AR Glasses

Photo by David Švihovec

In the world of modern tech startups, a lean approach to product building is now almost universal. No longer do companies spend years and millions of dollars secretly building a brand new product, only to release it and watch it fail spectacularly because it solves a problem that doesn’t exist or targets a market that is too small.

The Lean Startup approach advocates customer development to validate the problem, and building a minimum viable product which can be released quickly to validate the solution.

Apple’s ‘building blocks’ approach

Photo by Iker Urteaga

There is one company, however (the largest in the world, in fact), that thumbs its nose at the Lean Startup approach: Apple. Year in, year out, they release the latest and greatest new features for iPhones, Watches, MacBooks and more, all designed and developed behind a shroud of secrecy.

This isn’t to say that Apple don’t take an iterative approach to product development. In fact, they take what Benedict Evans of a16z calls a ‘building block’ approach, where the foundations for new features are laid over a number of years.

Take Apple Pay for example. A key building block for ensuring Apple Pay is secure and trusted by customers is Touch ID, the fingerprint recognition system that is integrated directly into the iPhone’s home button.

Whilst Apple Pay wasn’t released until 2014, Touch ID was included with the iPhone 5s a full year earlier. This not only allowed Apple to tout a neat way of unlocking your phone, but also enabled them to test and finesse the feature before releasing Apple Pay successfully a year later.

Apple’s Augmented Reality future

Disclosure, before we go any further: I am an Apple follower and customer. Yes, I know other technology companies are working on AR solutions, but this is my article and I only want to focus on Apple right now. 😄👍

At DigitalBridge we are excited to see where Apple ultimately goes with Augmented Reality (AR). Tim Cook has spoken at length about how important he believes AR will be, but has said very little about what that vision will ultimately look like.

Maybe examining the building blocks already in place will help…

ARKit

The obvious component to mention here is ARKit, Apple’s SDK that enables developers to layer on-screen virtual elements over what the camera can see. By releasing ARKit with iOS 11 last year, Apple have been able to let developers loose on AR technology without the need for an AR Glasses product, which is probably still a few years away.
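To make that concrete, here’s a minimal sketch of what getting an ARKit session running looks like: a view controller hosting an ARSCNView and starting world tracking with horizontal plane detection. (The class name is my own; a real app would add its own scene content on top.)

```swift
import UIKit
import ARKit

// A minimal ARKit view controller (iOS 11+). ARSCNView renders
// SceneKit content on top of the live camera feed.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking follows the device's position and orientation,
        // and can detect horizontal surfaces such as floors and tables.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```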

TrueDepth Camera

Last year, one of the iPhone X’s headline features was the TrueDepth Camera - a camera that employs multiple sensors to perceive depth. The main use-case for this (beyond animating a poop emoji) was Face ID, which lets an iPhone identify its owner by their facial features alone. This in turn negated the need for a fingerprint sensor, which was handy since the iPhone X did away with the home button.

This depth sensing technology will also be key for any AR Glasses product, which will need to be able to create a 3D map of the user’s environment and track the position of their hands.
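ARKit already gives developers a taste of that TrueDepth data through face tracking. Here’s a rough sketch (the class is hypothetical; only TrueDepth-equipped devices - just the iPhone X at the time of writing - support this configuration):

```swift
import ARKit

// Sketch: driving the TrueDepth camera through ARKit face tracking.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires TrueDepth hardware.
        guard ARFaceTrackingConfiguration.isSupported else {
            print("No TrueDepth camera on this device")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Each update carries an ARFaceAnchor: a 3D mesh of the face plus
    // its position and orientation - the same kind of depth-derived
    // data an AR headset would need, just pointed the other way.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            print("Face transform: \(face.transform)")
        }
    }
}
```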

Apple Watch

Apple Watch is, of course, Apple’s wrist-worn device: primarily a health tracker (and a timepiece, I guess).

In a world where a user is wearing AR Glasses, however, her Apple Watch would also make for a handy mini-screen and input device. Not to mention the additional sensors packed into the Watch: the gyroscope will help with movement tracking, and the heart-rate monitor might help reduce the number of zombies if the user is getting a little too scared.
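For the curious, here’s a minimal sketch of reading that wrist motion with Core Motion on watchOS; how Apple might fuse it with a headset’s own tracking is, of course, pure speculation on my part:

```swift
import CoreMotion

// Sketch: sampling wrist motion on watchOS with Core Motion.
let motionManager = CMMotionManager()

func startWristTracking() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // 60 Hz
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // Attitude fuses the gyroscope and accelerometer into a
        // single orientation estimate for the wrist.
        print("Wrist attitude: \(motion.attitude)")
    }
}
```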

AirPods

AirPods are Apple’s wireless earphones. Apple have always sought to banish the bane of wires, and tangled EarPods became a thing of the past once AirPods were released (at least for those who could afford them).

To create a believable AR experience, the hardware doing all the heavy lifting needs to be essentially invisible. And whilst most people are quick to picture how AR augments what the user sees, the audio is arguably just as important in creating a compelling experience. AirPods are the perfect way to deliver that augmented sound. They have no wires to get tangled up in as you move around, and they come equipped with dual microphones to both listen to the real environment and take voice commands from the user.
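Apple’s existing audio APIs already hint at how that augmented soundtrack could work. Below is a minimal sketch of positional audio using AVAudioEngine and an AVAudioEnvironmentNode; the sound file name is just a placeholder:

```swift
import AVFoundation

// Sketch: placing a virtual sound in 3D space around the listener.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

func playSound(at position: AVAudio3DPoint) throws {
    engine.attach(environment)
    engine.attach(player)

    // "zombie.caf" is a placeholder asset; mono sources routed through
    // an environment node are rendered in 3D relative to the listener.
    let url = Bundle.main.url(forResource: "zombie", withExtension: "caf")!
    let file = try AVAudioFile(forReading: url)

    engine.connect(player, to: environment, format: file.processingFormat)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    player.position = position  // where the virtual sound lives
    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
}
```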

The vision

So whilst this is still speculation, one can imagine that Apple’s vision for AR would look something like this:

  • Lightweight (and most certainly beautifully designed) glasses.

  • The glasses would be fitted with a forward-facing version of the TrueDepth camera that would accurately map the environment around the user in real time.

  • ARKit would be able to determine the position of the floor, walls, objects and lighting sources, enabling virtual objects to be rendered realistically into the scene (see the sketch after this list).

  • Apple Watch would monitor the user’s movements and act as a physical controller and mini-display.

  • AirPods would use their microphones to let the user hear real sounds with a realistic virtual soundtrack layered on top. Users would also be able to use voice commands to control the experience.
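To ground the ARKit bullet above, here’s a rough sketch of that last step using today’s iPhone-based APIs (the glasses obviously don’t exist yet): placing a virtual object on a real surface that ARKit has detected.

```swift
import ARKit
import SceneKit

// Sketch: drop a virtual sphere onto a detected surface. A tap on the
// screen stands in for whatever gesture the glasses would eventually use.
func placeObject(in sceneView: ARSCNView, at screenPoint: CGPoint) {
    // Ask ARKit where the tap intersects a detected plane.
    guard let result = sceneView.hitTest(screenPoint,
                                         types: .existingPlaneUsingExtent).first
    else { return }

    let sphere = SCNNode(geometry: SCNSphere(radius: 0.05))  // 5 cm sphere
    let t = result.worldTransform
    sphere.position = SCNVector3(t.columns.3.x, t.columns.3.y, t.columns.3.z)

    // Default lighting keeps the virtual object looking plausible
    // against the camera image.
    sceneView.autoenablesDefaultLighting = true
    sceneView.scene.rootNode.addChildNode(sphere)
}
```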

So what do you think AR will look like in the near future? Are you excited or bored of it already? Let us know.


James Lewis