Take a closer look at one of the newer iPhone 12 Pro models and you will see a small black dot near the camera lenses, about the same size as the flash. This is the lidar sensor, and it's a new kind of depth sensing that can make a difference in a number of interesting ways.
If Apple has its way, lidar is a term you'll start hearing a lot now, so let's break down what we know, what Apple will use it for, and where the technology could go next.
What does lidar mean?
Lidar stands for light detection and ranging, and it has been around for a while. It fires lasers at objects, which bounce back to the laser's source, and measures distance by timing the travel, or flight, of the light pulse.
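The time-of-flight arithmetic behind that is simple enough to sketch. This is an illustrative example (the function name and numbers are my own, not anything from Apple): distance is the speed of light multiplied by the round-trip time, divided by two, because the pulse travels out and back.

```python
# Illustrative sketch of the time-of-flight principle: distance is
# (speed of light x round-trip time) / 2, since the pulse goes out
# and comes back.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_flight_time(round_trip_seconds: float) -> float:
    """Distance to an object given the pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after roughly 33 nanoseconds hit something
# about 5 meters away -- the iPhone 12 Pro's quoted lidar range.
print(round(distance_from_flight_time(33.356e-9), 2))  # ~5.0
```

The tiny timescales involved are why this needs dedicated sensor hardware rather than a regular camera.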
How does lidar sense depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that map out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you can see them with a night-vision camera.
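To make that field-of-dots idea concrete, here is a minimal sketch, assuming each infrared dot reports its own round-trip time: converting a grid of flight times into a grid of depths gives a crude depth map, with closer surfaces returning their pulses sooner. Names and numbers are illustrative, not Apple's actual pipeline.

```python
# Illustrative sketch: treat the lidar's spray of infrared dots as a
# 2D grid of round-trip times and convert it into a per-dot depth map.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_map(flight_times):
    """Convert a 2D grid of round-trip times (seconds) into depths (meters)."""
    return [[SPEED_OF_LIGHT * t / 2 for t in row] for row in flight_times]

# A tiny 2x2 field of dots: the top row hit a nearby object (~1 m),
# the bottom row hit a far wall (~5 m).
times = [[6.7e-9, 6.7e-9],
         [33.4e-9, 33.4e-9]]
for row in depth_map(times):
    print([round(d, 1) for d in row])
```

A real sensor produces far denser grids, and software then stitches those points into the 3D "mesh" of the room described above.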
Isn’t this like the iPhone’s Face ID?
It is, but with greater range. The idea is the same: Apple's TrueDepth camera also fires out an array of infrared lasers, but it only works a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar is already in lots of other tech
Lidar is a technology that's springing up everywhere. It's used in self-driving cars and driver-assistance systems, and in robotics and drones. Augmented reality headsets like the HoloLens 2 have similar technology, mapping out room spaces before layering 3D virtual objects into them. But lidar also has a long history.
Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera with infrared depth scanning, too. In fact, PrimeSense, the company that helped create the Kinect's technology, was acquired by Apple in 2013. Now we have Apple's face-scanning TrueDepth and rear lidar camera sensors.
The iPhone 12 Pro’s camera can work better with lidar
Time-of-flight cameras on smartphones are typically used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light, up to 6 times faster in low-light conditions. Lidar depth sensing will also be used to improve night portrait mode effects.
Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. Although that element hasn't been demonstrated yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps.
It will also greatly enhance augmented reality
Lidar will allow the iPhone 12 Pro to launch AR apps much faster and build a quick map of a room to add more detail. A lot of AR apps will take advantage of lidar to hide virtual objects behind real ones (a trick called occlusion) and to place virtual objects within more complicated room maps, such as on a table or chair.
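The occlusion trick described above boils down to a depth comparison. Here is a minimal sketch of the idea, with made-up names and numbers (this is not ARKit's actual API): a virtual object is hidden at any point where the lidar depth map shows a real surface closer to the camera than the object.

```python
# Hedged sketch of occlusion: draw the virtual object only where the
# real-world depth (from lidar) is farther away than the object.
# All names here are illustrative, not Apple's actual API.

def visible_mask(real_depths, virtual_depth):
    """True where the virtual object should be drawn; False where a
    real surface in front of it hides (occludes) it."""
    return [[real > virtual_depth for real in row] for row in real_depths]

# Room depths in meters: a virtual lamp placed 2.0 m away is hidden
# wherever a chair 1.5 m away sits in front of it (the False cells).
room = [[3.0, 3.0, 1.5],
        [3.0, 1.5, 1.5]]
print(visible_mask(room, 2.0))
```

Doing this per pixel, in real time, is what lidar's fast, dense depth data makes practical.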
But there is additional potential beyond that, with a longer tail. Many companies dream of headsets that blend virtual objects and real ones: the AR glasses that a number of companies are working on will rely on having advanced 3D maps of the world on which to layer virtual objects.
Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of Google Maps' Street View cars. But there's a possibility that people's own devices could eventually help crowdsource that information, or add extra data on the go. Again, AR headsets like the Magic Leap and HoloLens already scan your environment before layering things into it, and Apple's lidar-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and could pave the way for Apple to eventually make its own glasses.
3D scanning could be the killer app
Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical applications, and even for social media and journalism. The ability to capture 3D data and share that information with others could open up these lidar-equipped phones and tablets as 3D-content-capture tools. Lidar could also be used without the camera element to measure objects and spaces.
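That last point, measuring without the camera, is straightforward once the depth field gives you points in 3D space. A minimal sketch, assuming two lidar points have already been converted to camera-space coordinates in meters (the points and function name are illustrative): an object's size is just the straight-line distance between them.

```python
# Illustrative sketch: measuring an object from two points in the
# lidar depth field, expressed as (x, y, z) camera coordinates in meters.
import math

def measure(p1, p2):
    """Euclidean distance between two 3D points, in meters."""
    return math.dist(p1, p2)

# Two corners of a tabletop captured by the depth sensor, 1.2 m from
# the camera and 0.9 m apart.
print(round(measure((0.0, 0.0, 1.2), (0.9, 0.0, 1.2)), 2))  # 0.9
```

This is the principle behind tape-measure-style AR apps, which lidar makes both faster and more accurate.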
Apple isn't the first to explore this kind of technology on a phone
Google had this same idea in mind when Project Tango, an early AR platform, was created. Tango phones had an advanced camera array with infrared sensors that could map out rooms, creating 3D scans and depth maps for AR and for measuring interiors. Google's Tango phones were short-lived, replaced by computer-vision algorithms that estimated depth from the cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more advanced successor.