New iPad Pro holds the secret to Apple Glasses — and the future of computing

Apple Glasses concept
(Image credit: Martin Hajek/iDropnews)

Step by step, Tim Cook and his team are putting all the pieces in place to introduce The Next Big Thing in 2023: the Apple Glasses, the device that will move us from the smartphone era into one where wearables are the key to our information-age existence. The latest chess move arrived today in the form of the new iPad Pro 2020's secret weapon: a ToF sensor, which Apple calls the LiDAR Scanner.

But what the hell is LiDAR, why is Apple calling its ToF sensor LiDAR, and why is it so important for the future Apple AR Glasses, and for the future of computing itself? 

What is LiDAR?

iPad Pro 2020 LiDAR scanner

(Image credit: Apple)

How does LiDAR — Light Detection And Ranging — work? 

In a nutshell, a laser, typically infrared, is fired in a pattern across whatever is in front of it. This is what illuminates the scene. A sensor then measures the time each fired photon takes to bounce back. 

By measuring the time it takes to come back, hence the name, the time of flight of the fired photon, you can calculate the distance between the camera and whatever objects the photons hit. Since light travels at a known, constant speed, the distance is simply half the round-trip time multiplied by the speed of light. 
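
To make the math concrete, here is a minimal sketch in Swift of that round-trip calculation. It is purely illustrative, not Apple's implementation; the function name and the 33-nanosecond sample time are made up for the example.

import Foundation

// Time of flight in a nutshell: a photon travels out and back, so the
// distance to the object is half the round-trip time times the speed of light.
let speedOfLight = 299_792_458.0 // meters per second

func distanceMeters(roundTripSeconds: Double) -> Double {
    return speedOfLight * roundTripSeconds / 2.0
}

// A photon that returns after roughly 33 nanoseconds hit something about
// 5 meters away, the maximum range Apple quotes for the iPad Pro's scanner.
print(distanceMeters(roundTripSeconds: 33.3e-9)) // ≈ 4.99 meters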

Furthermore, algorithms can analyze how the projected pattern deforms, which gives you a better understanding of the objects the LiDAR is measuring and, in turn, feeds the AI data it can use to identify those objects. 

There are other considerations, such as nearby light sources and concave surfaces that could confuse the time-of-flight sensor, but those can be accounted for with other algorithms.

From the instant measurement of the distances between objects and the emitter obtained by the ToF sensor in the iPad, or in a Samsung Galaxy S20 Ultra, for that matter, the CPU can derive precise 3D geometric data about the world around it. 
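
As a rough illustration of how raw depth readings become 3D geometry, here is a Swift sketch of the standard pinhole-camera unprojection step. The struct and function names are hypothetical, and this is a textbook technique, not a description of Apple's actual pipeline.

import simd

// Each depth pixel records a distance along the camera's view direction.
// Unprojecting it through the pinhole camera model turns a depth map
// into a cloud of 3D points describing the scene.
struct CameraIntrinsics {
    let fx: Float, fy: Float // focal lengths, in pixels
    let cx: Float, cy: Float // principal point, in pixels
}

func unproject(x: Int, y: Int, depth: Float, k: CameraIntrinsics) -> SIMD3<Float> {
    // Pinhole model: X = (x - cx) * Z / fx, Y = (y - cy) * Z / fy, Z = depth
    let X = (Float(x) - k.cx) * depth / k.fx
    let Y = (Float(y) - k.cy) * depth / k.fy
    return SIMD3<Float>(X, Y, depth)
}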

Why is Apple calling this LiDAR instead of ToF sensor?

Apple decided to call the iPad’s 3D sensor the “LiDAR Scanner” for reasons I fail to comprehend. The fact is that LiDAR is a form of ToF, which is the term Samsung and other manufacturers are using for the sensor in their devices. Those sensors are mostly manufactured by Sony, which has 95% of the market share for this kind of technology. In fact, Apple is allegedly using the same Sony ToF sensors, too.

Perhaps Apple, always the master of marketing, did it to differentiate itself from everyone else, knowing that, one, everyone else is using the term ToF and, two, not many people have a clue what ToF is. It’s one of those obscure technical terms that get thrown around in spec sheets. 

At least people have been hearing about LiDAR for a while. Many know these are the sensors that help autonomous cars recognize the world around them and move safely. In fact, Apple has been working with LiDAR for a while on its Project Titan, the infamous Apple Car that never saw the light of day.

Apple also claims on its webpage that NASA is using the same technology in the Mars 2020 rover, and that’s true. Curiosity uses it, too. It’s definitely not the sensor Apple is using, though, but a much more sophisticated one. In fact, the rovers carry many of them, and for more purposes than terrain navigation.

Does it work well?

During Apple's virtual press preview for the new iPad Pro, Tom's Guide editor-in-chief Mark Spoonauer got a sneak peek at some of the AR apps that will be available and optimized for the iPad Pro's LiDAR scanner, including a game called Hot Lava that brings (you guessed it) hot lava right into your living room, along with a very realistic-looking main character.

The Complete Anatomy app was also impressive, as it can show you in real time which muscles the person in front of you is using as they move around. And IKEA Place looked a lot snappier, as Apple's A12Z Bionic chip makes setting up your virtual room instantaneous. Presumably, the horsepower for the Apple Glasses would reside in your iPhone or iPad, but the glasses themselves would have a LiDAR scanner on board.

Why LiDAR is crucial for the Apple Glasses

The Apple Glasses will need this 3D sensing technology to be working at 110% of its capacity when they appear sometime in 2023. As the iPad Pro will demonstrate, truly convincing augmented reality applications need instant and constant knowledge of their surroundings in order to successfully merge virtual imagery with the real world.

Knowing how the world around you looks in 3D at all times will allow you to precisely position 3D objects in the world, apply textures to real-world objects, and put virtual objects behind physical ones. If you want your fantastic new mixed world to feel real to your eyes, you need all of these abilities working constantly at a high refresh rate.
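
On the iPad Pro, developers get at these abilities through ARKit. Here is a minimal sketch, assuming ARKit 3.5 and RealityKit on a LiDAR-equipped device, of how an app might ask for a reconstructed mesh of the room and use it to hide virtual objects behind real ones; the function name is made up for the example.

import ARKit
import RealityKit

func startLiDARSession(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Ask ARKit to build a triangle mesh of the room from LiDAR depth data.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Let that reconstructed real-world geometry occlude virtual content,
    // so a virtual chair can disappear behind a real couch.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(configuration)
}

That same kind of scene understanding, running all day at a high refresh rate, is exactly what a pair of glasses would need.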

Reportedly, the iPad Pro's ToF sensor has a 120Hz refresh rate, which lines up with the rest of the industry.

The sensor also needs the longest range possible, so you can convincingly build your world in open spaces. Apple claims that the range of the ToF sensor in the iPad Pro is 5 meters. 

By deploying ToF on the iPad Pro, and on the iPhone 12 Pro line by the end of the year, Apple will be building on the foundation it has already laid with previous versions of its augmented reality architecture.

The addition of ToF in real consumer products will allow the company to keep refining its artificial intelligence algorithms in the real world, which is crucial for the Apple Glasses to deliver a flawless experience when they come out in a few years. 

This process of refinement will be further advanced by Apple's rumored AR/VR headset in 2022, probably geared toward early adopters and developers. That will be the last stage before Cupertino releases its allegedly nimble Apple Glasses, once the components are mature enough and, more importantly, its AR platform has a deep understanding of the world around it.

And that’s the thing that should get you excited about today’s announcement. What you are seeing deployed today will result in some cool games and apps for the iPad and iPhone, but what you are actually witnessing is the ramp-up to a new technology world in which our phones will eventually disappear, replaced by a constellation of wearable devices. 

And the Apple Glasses will be at the center of that new computing revolution.

Jesus Diaz

Jesus Diaz founded the new Sploid for Gawker Media after seven years working at Gizmodo, where he helmed the lost-in-a-bar iPhone 4 story and wrote old angry man rants, among other things. He's a creative director, screenwriter, and producer at The Magic Sauce, and currently writes for Fast Company and Tom's Guide.