
The iPhone 12 Pro camera uses lidar. What it is and why it matters




The lidar sensor of the iPhone 12 Pro – the black circle at the bottom right of the camera unit – opens up AR possibilities.

Apple

The iPhone 12 and 12 Pro are on sale now, and one of the main differences between the Pro and non-Pro models this year is depth-sensing technology. If you have one of the new iPhone 12 Pro models or the latest iPad Pro, look near the camera lenses and you'll see a small black dot, about the size of the flash. That's the lidar sensor. Apple is bullish on lidar as a way to add depth sensing and new augmented reality capabilities to its pro-end tablets and phones. It could also be a big help with camera focus.

But why is Apple making a big deal of lidar, and what will it be able to do for you if you buy the iPhone 12 Pro or iPhone 12 Pro Max? It's a term you'll start hearing a lot now, so let's break down what we know, what Apple is going to use it for, and where the technology could go next.

What does lidar mean?

Lidar stands for light detection and ranging, and it has been around for a while. It uses lasers to ping off objects and return to the source of the laser, measuring distance by timing the travel, or flight, of the light pulse.

How does lidar work to capture depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single pulse of light, whereas a smartphone with this type of lidar technology sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that map out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera.
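If you want a feel for the math behind each of those dots, here's a rough sketch in Swift, using a made-up round-trip time rather than a real sensor reading, of how one time-of-flight measurement turns into a distance:

import Foundation

// Illustrative numbers only: a hypothetical round-trip time for one infrared pulse.
let roundTripTime = 26.7e-9                // seconds (about 26.7 nanoseconds)
let speedOfLight = 299_792_458.0           // meters per second

// The pulse travels out to the object and back, so halve the round trip.
let distance = speedOfLight * roundTripTime / 2.0
print(String(format: "Estimated distance: %.2f m", distance))   // roughly 4.00 m

Do that for thousands of dots at once, many times per second, and you get the point field described above.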


The iPad Pro, released in spring, also has lidar.

Scott Stein / CNET

Isn’t that like Face ID on iPhone?

It is, but with longer range. The idea is the same: Apple's TrueDepth camera, which powers Face ID, also shoots out an array of infrared lasers, but it only works up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar is already in many other technologies

Lidar is a technology that's popping up everywhere. It's used for self-driving cars, or assisted driving. It's used for robotics and drones. Augmented reality headsets like the HoloLens 2 use similar technology to map out room spaces before layering 3D virtual objects into them. But lidar also has a pretty long history.

Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera that had infrared depth scanning, too. In fact, PrimeSense, the company that helped develop the Kinect technology, was acquired by Apple in 2013. Now, we have Apple's face-scanning TrueDepth and rear lidar camera sensors.


Do you remember the Kinect?

Sarah Tew / CNET

The iPhone 12 Pro camera could work better with lidar

Time-of-flight cameras on smartphones are typically used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better low-light focus, up to six times faster in low-light conditions. The lidar depth sensing is also used to improve night portrait mode effects.

Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. Although that element hasn't been shown off yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps.


Snapchat is already enabling AR lenses that use the iPhone 12 Pro's lidar.

Snapchat

It will also greatly improve augmented reality

Lidar allows the iPhone 12 Pro to start AR apps much more quickly and build a fast map of a room to add more detail. A lot of Apple's AR updates in iOS 14 use lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
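For developers, switching that on doesn't take much. As a minimal sketch (not Apple's own sample code), an ARKit/RealityKit app running on a lidar-equipped device and iOS 14 or later can request scene reconstruction, per-frame depth, and occlusion along these lines:

import ARKit
import RealityKit

// A minimal sketch: enable lidar-backed scene understanding in an ARView
// so virtual objects can be hidden behind real ones (occlusion).
// Assumes a lidar-equipped device such as the iPhone 12 Pro or iPad Pro.
func configureLidarAR(for arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Build a live 3D mesh of the room when the hardware supports it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Ask ARKit for the per-frame depth map produced by the lidar sensor.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    // Let RealityKit use that mesh to occlude and collide with virtual content.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)

    arView.session.run(configuration)
}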

But beyond that, there's extra potential with a longer tail. Many companies are dreaming of headsets that blend virtual and real objects: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will rely on having advanced 3D maps of the world to layer virtual objects onto.

Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there's a possibility that people's own devices could eventually help gather that information, or add extra on-the-fly data. Again, AR headsets like Magic Leap and HoloLens already pre-scan your environment before layering things into it, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part ... and could pave the way for Apple to make its own glasses eventually.


A 3D room scan from Occipital's Canvas app, enabled by depth-sensing lidar on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning could be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique known as photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could turn these lidar-equipped phones and tablets into 3D-content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
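Developers can already get at that raw depth data. Here's a minimal sketch, assuming the scene-depth option shown earlier is enabled on a lidar device, of reading the distance to whatever sits at the center of the camera frame, the basic ingredient for measuring or meshing a room:

import ARKit
import CoreVideo

// A rough sketch of reading the per-frame lidar depth map ARKit exposes.
// Each pixel in the depth map is a 32-bit float distance in meters.
func centerDepth(of frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    // Walk to the middle row, then read the middle pixel's distance.
    let rowPointer = base.advanced(by: (height / 2) * rowBytes)
    let row = rowPointer.assumingMemoryBound(to: Float32.self)
    return row[width / 2]
}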


Do you remember Google Tango? It also had depth sensing.

Josh Miller / CNET

Apple isn’t the first to research such technologies on a phone

Google had this same idea in mind when Project Tango, an early AR platform that was only ever on two phones, was created. Its advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that did estimated depth sensing on cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more advanced successor.


Now playing: iPhone 12, iPhone 12 Mini, Pro and Pro Max explained (9:16)

