If Apple gets its way, lidar is a term you’ll hear a lot now, so let’s take a look at what we know, what Apple is going to use it for, and where the technology could go. And if you’re curious about what it’s doing now, I’ve also spent some hands-on time with the tech.
What does lidar mean?
Lidar stands for light detection and ranging, and has been around for a while. It uses lasers to ping objects and return to the laser's source, measuring distance by timing the travel, or "flight," of the light pulse.
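The math behind that timing is simple: distance is half the round-trip travel time multiplied by the speed of light. Here's a minimal sketch of the idea in Python (the function name and timing value are illustrative, not anything from Apple's actual hardware or APIs):

```python
# Time-of-flight principle: a pulse travels out, bounces back, and the
# round trip takes twice the one-way distance. Halve it to get range.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """Distance to the object that reflected the pulse, in metres."""
    return SPEED_OF_LIGHT_M_PER_S * seconds / 2

# A pulse returning after ~33 nanoseconds came from an object roughly
# 5 metres away -- about the stated range of the iPhone's lidar.
print(round(distance_from_round_trip(33.4e-9), 2))  # → 5.01
```

The tiny timescales involved are why this needs dedicated sensor hardware: at 5 meters, the whole round trip takes about 33 billionths of a second.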
How does lidar work to feel depth?
Lidar is a kind of time-of-flight camera. Some other smartphones measure depth with a single pulse of light, while a smartphone with this type of lidar sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that map distances and can "mesh" the dimensions of a room and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera.
Isn’t this like Face ID on iPhone?
It is, but with a longer reach. The idea is the same: Apple's TrueDepth camera also shoots out an array of infrared lasers, but it only works up to a few feet away. The lidar sensors on the back of the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar is already in lots of other tech
Lidar is a technology that's popping up everywhere. It's used for self-driving cars, robotics and drone mapping. Augmented reality headsets such as the HoloLens have similar technology, mapping room spaces before placing virtual 3D objects into them. But it also has quite a long history.
Microsoft's old Xbox depth-sensing accessory, the Kinect, was a camera that also had infrared depth scanning. PrimeSense, the company that helped create the Kinect technology, was acquired by Apple in 2013. Now we have Apple's TrueDepth face-scanning and rear lidar camera sensors.
The iPhone 12 Pro camera works better with lidar
Time-of-flight cameras on smartphones are mostly used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light, up to six times faster than before. Lidar depth detection is also used to enhance Night mode portrait effects. So far, it makes an impact: read our full review for more.
Better focus is a plus, and there’s also a chance that the iPhone 12 Pro can add more 3D photo data to images too. While that element hasn’t been explained yet, Apple’s front-facing, depth-sensing TrueDepth camera has been used in a similar fashion with apps, and third-party developers could dive in and develop some wild ideas. It’s already happening.
It also vastly improves augmented reality
With lidar, the iPhone 12 Pro can launch AR apps much faster and build a quick map of a room to add more detail. A lot of AR apps use lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects in more complicated room layouts, such as on a table or chair.
I've tried it with one game, Hot Lava, which already uses lidar to scan a room and all its obstacles. I was able to place virtual objects on stairs and hide things behind real objects in the room. Expect many more AR apps to add lidar support like this for richer experiences.
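At its core, the occlusion trick is a depth comparison: a virtual object's pixel only gets drawn if no real surface sits closer to the camera. Here's a minimal sketch of that test (the function and values are illustrative, not from any real AR framework):

```python
# Depth-based occlusion: for each pixel, compare the real-world depth
# from the lidar depth map against the renderer's virtual-object depth.

def is_occluded(real_depth_m: float, virtual_depth_m: float) -> bool:
    """True when a real surface is closer to the camera than the virtual one."""
    return real_depth_m < virtual_depth_m

# A real chair 1 m away hides a virtual lamp placed 2.5 m away.
print(is_occluded(1.0, 2.5))   # → True
# A wall 4 m away sits behind the lamp, so the lamp stays visible.
print(is_occluded(4.0, 2.5))   # → False
```

Without a lidar depth map, apps have to estimate real-world depth from camera imagery alone, which is why occlusion on older phones is slower and less accurate.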
But there's even more potential beyond that, with a longer tail. Many companies dream of headsets that blend virtual objects and real ones: AR glasses and other devices will rely on advanced 3D maps of the world to layer virtual objects onto it.
Those 3D maps are now being built with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there's a possibility that people's own devices could eventually help crowdsource that information, or add extra data on the fly. Again, AR headsets like the Magic Leap and HoloLens scan your environment before placing things into it, and Apple's lidar-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part.
3D scanning could be the killer app
Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses, or even for social media and journalism. The ability to capture 3D data and share that info with others could open up these lidar-equipped phones and tablets as 3D-content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
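That last camera-free use case is straightforward once you have scanned 3D points: the size of an object is just the distance between two points on it. A hedged sketch, with invented coordinates standing in for real lidar output:

```python
import math

# Measurement from scan data alone: given two 3D points (in metres, in
# the sensor's coordinate frame), their separation gives an object's
# size with no colour camera involved.

def distance_between(p1, p2) -> float:
    """Straight-line distance between two (x, y, z) points, in metres."""
    return math.dist(p1, p2)

# Two made-up corner points along a table edge, both 1.5 m from the sensor.
print(round(distance_between((0.0, 0.0, 1.5), (1.2, 0.0, 1.5)), 2))  # → 1.2
```

Measurement apps already do something like this with AR plane detection; a real depth map just makes the underlying points far more accurate.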
I tried some early lidar-enabled apps on the iPhone 12 Pro with varying degrees of success (3D Scanner App, Lidar Scanner and Record3D), but they can scan objects or map rooms with surprising speed. The lidar's effective scanning range of 4.6 meters is enough to cover most rooms in my home, but bigger outdoor areas take more effort. Again, Apple's front-facing TrueDepth camera already does similar things at closer range.
Apple isn't the first to explore tech like this on a phone
Google had the same idea in mind with Tango, an early AR platform. Its advanced camera array also had infrared sensors and could map spaces, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that did estimated depth sensing on cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a significantly more advanced successor, with lidar ambitions that could extend to cars, AR headsets and a whole lot more.