The new iPad Pro is the first Apple device equipped with a time-of-flight LiDAR sensor that handles depth mapping similarly to the TrueDepth camera, only better. It’s a key stepping stone to Apple’s rumored augmented reality headset, and a new video shows it in action.

Depth points captured by the LiDAR scanner, along with data from the onboard cameras and motion sensors, can be passed to new depth frameworks in iPadOS for processing, resulting in a more detailed understanding of the user’s actual environment. And that, girls and boys, is going to power augmented reality experiences miles better than what we have now.

Watch the video below from YouTube channel iPhonedo to see the infrared dots of the new iPad Pro’s LiDAR scanner in action (the fun bit starts at the 3:59 mark).

The LiDAR scanner in the iPad Pro uses so-called time-of-flight calculations to determine distance by measuring how long it takes laser light to reach an object and reflect back.
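The time-of-flight principle boils down to simple arithmetic: light travels at a known, constant speed, so the round-trip time of a laser pulse directly gives the distance to whatever reflected it. A back-of-the-envelope sketch (illustrative only; the function name and example timing are ours, not Apple's implementation):

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second, in a vacuum

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given a laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path covered at the speed of light.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# An object at the scanner's quoted 5-meter range reflects a pulse
# back in roughly 33 nanoseconds -- which is why timing at this scale
# demands nanosecond-level electronics.
round_trip = 2 * 5 / SPEED_OF_LIGHT  # ~3.34e-8 seconds
print(round(distance_from_round_trip(round_trip), 6))  # → 5.0
```

Note how small the numbers get: resolving depth differences of a few centimeters means resolving timing differences well under a nanosecond, which is the engineering challenge a miniaturized scanner like this has to solve.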

Apple notes that LiDAR technology is so advanced that NASA is using it for the next Mars landing mission, and it serves a variety of other purposes, too. As an example, Apple Maps vehicles carry sophisticated LiDAR scanners that map the surrounding environment in 3D.

Apple explains how its Measure app takes advantage of the new sensor:

The LiDAR scanner improves the Measure app, making it faster and easier to automatically calculate someone’s height, while helpful vertical and edge guides automatically appear to let users more quickly and accurately measure objects.

The Measure app also now comes with Ruler View for more granular measurements and lets you save a list of all measurements, complete with screenshots for future use.

Existing AR apps get all of the benefits of the LiDAR scanner “for free”, including perks like instant placement, as well as improved motion capture and people occlusion. “Using the latest update to ARKit with a new Scene Geometry API, developers can harness the power of the new LiDAR Scanner to unleash scenarios never before possible,” the company notes.

For comparison’s sake, here are the infrared dots of Apple’s TrueDepth camera.

Of course, a version of the sensor that has been engineered to fit in the thin and light iPad Pro is less powerful than what NASA or Apple Maps vehicles are using, but it’s still much, much better and faster than the TrueDepth camera on the iPhone X and later.

Aside from being more accurate and reliable, the LiDAR scanner is capable of mapping surrounding objects up to 5 meters away, whereas the TrueDepth camera works reliably only up to about 20 inches (roughly half a meter) away.

Both systems work indoors and outdoors, as well as in complete darkness, but the LiDAR scanner “operates at the photon level at nanosecond speeds,” according to Apple.