But why is Apple making a big deal about lidar, and what can it do once you have it? It is a term you will hear a lot now, so let's break down what we know, what Apple will use it for, and where the technology could go next.
What does lidar mean?
Lidar stands for light detection and ranging, and it has been around for a while. It uses lasers that ping off objects and return to the source of the laser. Distance is measured by timing the travel, or flight, of the light pulse.
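As a rough illustration of that timing math (a toy sketch, not Apple's actual implementation), the measured round-trip time is halved, because the pulse travels out and back, and multiplied by the speed of light:

```python
# Toy sketch of time-of-flight math; not Apple's implementation.
C = 299_792_458  # speed of light, in meters per second

def distance_from_round_trip(seconds: float) -> float:
    """The pulse travels to the object and back, so halve the round trip."""
    return C * seconds / 2

# A round trip of roughly 33 nanoseconds works out to about 5 meters,
# the stated range of the iPhone 12 Pro's rear lidar.
print(distance_from_round_trip(33.356e-9))
```

The nanosecond-scale numbers hint at why this needs dedicated hardware: at these time scales, light covers a meter in a few billionths of a second.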
How does lidar work to capture depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single pulse of light, while a smartphone with this type of lidar sends out waves of light pulses in a spray of infrared dots and measures each one with its sensor, creating a field of points that maps out distances and can "mesh" the dimensions of a room and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
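To picture how a spray of dots turns into a map: each dot leaves the sensor at a known angle, so its measured distance converts to a 3D position. The sketch below is hypothetical, with the dot angles and grid invented for illustration:

```python
import math

def dot_to_point(azimuth_deg: float, elevation_deg: float, distance_m: float):
    """Convert one infrared dot (known emission angle plus measured
    distance) into an (x, y, z) position relative to the sensor."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.sin(az)
    y = distance_m * math.sin(el)
    z = distance_m * math.cos(el) * math.cos(az)
    return (x, y, z)

# A toy 3x3 "spray" of dots, all hitting a surface 2 meters away:
cloud = [dot_to_point(az, el, 2.0)
         for az in (-15, 0, 15)
         for el in (-10, 0, 10)]
```

Real sensors fire thousands of such dots; the resulting point cloud is what gets meshed into room and object geometry.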
Isn’t that like Face ID on iPhone?
It is, but with greater range. The idea is the same: Apple's TrueDepth camera also shoots out an array of infrared lasers, but it can only work up to a few meters away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar is already in many other technologies
Lidar is a technology that is popping up everywhere. It is used for self-driving cars and assisted driving. It is used for robotics and drones. Augmented reality headsets such as Microsoft's HoloLens use similar technology to map room spaces before layering 3D virtual objects into them. But lidar also has a pretty long history.
Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera with infrared depth scanning, too. In fact, PrimeSense, the company that helped develop the Kinect's technology, was acquired by Apple in 2013. Now we have Apple's face-scanning TrueDepth and rear lidar camera sensors.
The iPhone 12 Pro camera could work better with lidar
Time-of-flight cameras on smartphones are typically used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light, up to six times faster in low-light conditions. The lidar depth sensing is also used to enhance night portrait mode effects.
Better focus is a plus, and there is a chance the iPhone 12 Pro could also add more 3D photo data to images. Although that element has not yet been laid out, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way by apps.
It will also greatly improve augmented reality
Lidar allows the iPhone 12 Pro to launch AR apps much faster and build a quick map of a room to add more detail. AR apps can use lidar to hide virtual objects behind real ones (known as occlusion) and to place virtual objects within more complicated room mappings, such as on a table or chair.
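At its core, occlusion boils down to a per-pixel depth comparison. This hypothetical sketch (depth values invented) shows the idea: a virtual object is drawn only where no real surface sits closer to the camera:

```python
def virtual_pixel_visible(real_depth_m: float, virtual_depth_m: float) -> bool:
    """Draw the virtual pixel only if no real surface is in front of it."""
    return virtual_depth_m < real_depth_m

# Toy 2x2 lidar depth map (meters) and a virtual object placed 1.5 m away:
real_depth = [[1.2, 0.8],
              [3.0, 3.0]]
visibility = [[virtual_pixel_visible(d, 1.5) for d in row]
              for row in real_depth]
# Top row is occluded: real surfaces at 1.2 m and 0.8 m sit in front
# of the 1.5 m virtual object, so it should appear to slide behind them.
```

A lidar-built depth map makes this comparison far more reliable than guessing depth from a regular camera image alone.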
But beyond that, there is additional potential, with a longer tail. Many companies dream of headsets that blend virtual objects and real ones: AR glasses, which a number of companies are working on, will rely on having advanced 3D maps of the world onto which virtual objects can be layered.
These 3D maps are currently built with special scanners and equipment, much like the world-scanning versions of Google Maps vehicles. But there is a chance people's own devices could eventually help gather that information, or add extra on-the-fly data. Again, AR headsets like the Magic Leap and HoloLens scan your environment before layering virtual things into it, and Apple's lidar-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and could pave the way for Apple to make its own glasses at some point.
3D scanning could be the killer app
Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique known as photogrammetry. That could be the next wave of practical capture technology, or even find uses in social media and journalism. The ability to capture 3D data and share that information with others could open up these lidar-equipped phones and tablets as 3D-content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
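Camera-free measurement is straightforward once two points have 3D coordinates: the object's dimension is just the distance between them. A hedged sketch, with the coordinates invented for illustration:

```python
import math

def measure_between(p: tuple, q: tuple) -> float:
    """Distance between two lidar-sampled 3D points, in meters."""
    return math.dist(p, q)

# Hypothetical corners of a tabletop, both sampled 1 m in front
# of the sensor and 0.9 m apart:
width_m = measure_between((0.0, 0.0, 1.0), (0.9, 0.0, 1.0))
```

Measurement apps work this way in principle: tap two points on screen, look up their lidar-derived 3D positions, and report the distance.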
Apple isn’t the first to research such technologies on a phone
Google had this same idea in mind when Tango, an early AR platform, was created. Its advanced camera array also had infrared sensors and could map rooms, create 3D scans and depth maps for AR, and measure interior spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that did estimated depth sensing on regular cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more advanced successor.