
Lidar on the iPhone 12 Pro: It’s cool now, but it gets so much cooler




The iPhone 12 Pro’s lidar sensor – the black circle at the bottom right of the camera unit – opens up AR possibilities and more.

Patrick Holland / CNET

Apple is optimistic about lidar, a technology that is brand new in the iPhone 12 family, specifically for the iPhone 12 Pro and iPhone 12 Pro Max. Take a good look at one of the new iPhone 12 Pro models, or the most recent iPad Pro, and you’ll see a small black dot next to the camera lenses, about the same size as the flash. That’s the lidar sensor, and it’s a new type of depth sensing that can make a difference in a number of interesting ways.

Read more: The iPhone 12’s lidar technology does more than just enhance photos. Check out this cool party trick

If Apple gets its way, lidar is a term you’ll hear a lot now, so let’s take a look at what we know, what Apple is going to use it for, and where the technology could be headed. And if you’re curious about what it’s doing now, I’ve also spent some hands-on time with the tech.

What does lidar mean?

Lidar stands for light detection and ranging, and has been around for a while. It uses lasers to ping objects and return to the laser’s source, measuring distance by timing the travel, or flight, of the light pulse.
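To make that concrete, here’s a minimal sketch of the time-of-flight math (the numbers are illustrative, not Apple’s specs): distance is just the speed of light multiplied by the round-trip time, halved because the pulse travels out and back.

```swift
// Illustrative time-of-flight math; numbers are examples, not Apple's specs.
let speedOfLight = 299_792_458.0 // meters per second

/// Distance to a surface, given the round-trip time of a light pulse.
func distance(forRoundTripTime seconds: Double) -> Double {
    // The pulse travels out and back, so halve the total path length.
    speedOfLight * seconds / 2.0
}

// A pulse that returns after roughly 33 nanoseconds bounced off something
// about 5 meters away.
print(distance(forRoundTripTime: 33e-9)) // ≈ 4.95 meters
```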

How does lidar work to sense depth?

Lidar is a kind of time-of-flight camera. Some other smartphones measure depth with a single pulse of light, whereas a smartphone with this type of lidar sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that map out distances and can “mesh” the dimensions of a room and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
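For developers, that point field surfaces through ARKit’s scene reconstruction support on lidar-equipped devices. A minimal sketch, assuming a device that supports mesh reconstruction:

```swift
import ARKit

// Sketch: turn on lidar-driven scene reconstruction in ARKit.
let session = ARSession()
let configuration = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // ARKit builds a triangle mesh of the room from the lidar point field.
    configuration.sceneReconstruction = .mesh
}
session.run(configuration)

// The mesh then arrives as ARMeshAnchor objects through the session's
// delegate callbacks, such as session(_:didAdd:) and session(_:didUpdate:).
```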

Isn’t this like Face ID on iPhone?

It is, but with a longer reach. The idea is the same: the TrueDepth camera that enables Apple’s Face ID also fires an array of infrared lasers, but it only works up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar is already in plenty of other tech

Lidar is a technology that is popping up everywhere. It’s used for self-driving cars, or assisted driving. It’s used for robotics and drones. Augmented reality headsets such as the HoloLens 2 have similar technology, mapping room spaces before layering 3D virtual objects into them. But lidar also has quite a long history.

Microsoft’s old Xbox depth-sensing accessory, the Kinect, was a camera that also had infrared depth scanning. PrimeSense, the company that helped create the Kinect technology, was acquired by Apple in 2013. Now we have Apple’s TrueDepth face scanning and rear lidar camera sensors.

The iPhone 12 Pro camera works better with lidar

Time-of-flight cameras on smartphones are mostly used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light, up to six times faster in low-light conditions. Lidar depth detection is also used to enhance the effects of Night Portrait mode. So far it makes a difference: read our review of the iPhone 12 Pro Max for more.

Better focus is a plus, and there’s also a chance that the iPhone 12 Pro can add more 3D photo data to images too. While that element hasn’t been explained yet, Apple’s front-facing depth-sensing TrueDepth camera has been used in a similar fashion with apps, and third-party developers could dive in and develop some wild ideas. It’s already happening.

It also vastly improves augmented reality

Lidar allows the iPhone 12 Pro to launch AR apps a lot faster, and build a quick map of a room to add more detail. A lot of Apple’s AR updates in iOS 14 use lidar to hide virtual objects behind real objects (called occlusion), and to place virtual objects within more complicated room layouts, such as on a table or chair.
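As a rough idea of what occlusion looks like in code, RealityKit can use the lidar-reconstructed mesh to hide virtual content behind real objects. This is a sketch, assuming an existing ARView named arView:

```swift
import ARKit
import RealityKit

// Sketch: let lidar-scanned real-world geometry occlude virtual objects.
func enableLidarOcclusion(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    // Hide virtual content that sits behind real objects in the scanned mesh.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.session.run(configuration)
}
```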

I’ve tried it on one Apple Arcade game, Hot Lava, which already uses lidar to scan a room and all its obstacles. I was able to place virtual objects on stairs and hide things behind real objects in the room. Expect many more AR apps to add lidar support like this one for richer experiences.


Snapchat’s next wave of lenses will start applying depth sensing using the iPhone 12 Pro’s lidar.

Snapchat

But there is even more potential, with a longer tail. Many companies dream of headsets that blend virtual objects with real ones: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will rely on advanced 3D maps of the world to layer virtual objects onto them.

Those 3D maps are now being built with special scanners and equipment, almost like the world scan version of those Google Maps cars. But there’s a possibility that people’s own devices could eventually help crowdsource that information or add additional data on the fly. Again, AR headsets like Magic Leap and HoloLens scan your environment before they put things in, and Apple’s lidar-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and could pave the way for Apple to eventually make its own glasses.


A 3D room scan from Occipital’s Canvas app, enabled by depth-sensing lidar on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning could be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture technology for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that information with others could turn these lidar-equipped phones and tablets into 3D content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
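Here’s a hedged sketch of the measurement idea: raycast two screen taps against lidar-detected surfaces and take the straight-line distance between the hit points. The helper names are hypothetical, and it assumes an existing ARView named arView.

```swift
import ARKit
import RealityKit
import simd

// Sketch: estimate the real-world distance between two tapped screen points
// by raycasting against lidar-detected surfaces.
func worldPosition(at screenPoint: CGPoint, in arView: ARView) -> SIMD3<Float>? {
    guard let hit = arView.raycast(from: screenPoint,
                                   allowing: .estimatedPlane,
                                   alignment: .any).first else { return nil }
    let t = hit.worldTransform.columns.3
    return SIMD3<Float>(t.x, t.y, t.z)
}

func distanceInMeters(from a: SIMD3<Float>, to b: SIMD3<Float>) -> Float {
    simd_distance(a, b) // straight-line distance between the two hit points
}
```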

I tried some early 3D scanning apps that use lidar on the iPhone 12 Pro, with varying degrees of success (3D Scanner App, Lidar Scanner and Record3D), but they can be used to scan objects or map rooms with surprising speed. Lidar’s effective scanning range of 4 meters is enough to reach across most rooms in my house, but larger outdoor spaces take more effort. Again, Apple’s front-facing TrueDepth camera already does similar things at close range.

Video: Our in-depth review of the iPhone 12 and 12 Pro (13:48)

Apple isn’t the first to explore technology like this on a phone

Google had the same idea in mind when Project Tango, an early AR platform that appeared on only two phones, was created. The advanced camera array also had infrared sensors and could map spaces, creating 3D scans and depth maps for AR and for measuring interior spaces. Google’s Tango-equipped phones were short-lived, replaced by computer vision algorithms that did estimated depth sensing on cameras without needing the same hardware. But Apple’s iPhone 12 Pro looks like a significantly more advanced successor, with lidar capabilities that extend to cars, AR headsets and a whole lot more.



