
Apple wants to make lidar a big deal for the iPhone 12 Pro and beyond. What it is and why it matters




The lidar sensor of the iPhone 12 Pro – the black circle in the lower right corner of the camera – opens up AR possibilities.

Apple

Apple is betting big on lidar, a technology that's brand new to the iPhone 12 family, specifically the iPhone 12 Pro and iPhone 12 Pro Max. (The iPhone 12 Pro is on sale now; the iPhone 12 Pro Max follows in a few weeks.)

Take a close look at one of the new iPhone 12 Pro models, or the latest iPad Pro, and you'll see a small black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it's a new type of depth sensing that could make a difference in a number of interesting ways.

If Apple has its way, lidar is a term you'll start hearing a lot, so let's break down what we know, what Apple plans to use it for, and where the technology could go next.

What does lidar mean?

Lidar stands for light detection and ranging, and it has been around for a while. It uses lasers to ping objects and return to the laser source, measuring distance by timing the travel, or flight, of the light pulse.
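To make the timing concrete, here's a minimal sketch in Swift, purely illustrative and not Apple's implementation: because a pulse travels out and back, the distance is half the round-trip time multiplied by the speed of light.

```swift
// Simplified time-of-flight distance calculation (illustrative only).
// A lidar pulse travels to the object and back, so the one-way
// distance is half the round-trip time times the speed of light.

let speedOfLight = 299_792_458.0  // meters per second

/// Returns the one-way distance in meters for a measured
/// round-trip time in seconds.
func distance(roundTripTime: Double) -> Double {
    return speedOfLight * roundTripTime / 2.0
}

// A pulse that returns after roughly 33 nanoseconds hit an object
// about 5 meters away -- the iPhone's stated lidar range.
let meters = distance(roundTripTime: 33e-9)  // ≈ 4.95 m
print(meters)
```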

How does lidar work to sense depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar technology sends out waves of light pulses in a spray of infrared dots and measures each one with its sensor, creating a field of points that map out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera.
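As a rough sketch of what that field of dots amounts to (the function and variable names here are hypothetical, made up for illustration), each dot's round-trip time becomes one distance in a grid:

```swift
// Hypothetical sketch: converting a grid of per-dot round-trip times
// (in seconds) into a grid of distances (in meters). Real lidar
// hardware does this on the sensor; this only illustrates the idea.

let c = 299_792_458.0  // speed of light in meters per second

func depthField(from roundTripTimes: [[Double]]) -> [[Double]] {
    roundTripTimes.map { row in row.map { t in c * t / 2.0 } }
}

// A tiny 2x3 "spray" of dots: nearer surfaces return sooner.
let times: [[Double]] = [
    [10e-9, 12e-9, 15e-9],
    [11e-9, 13e-9, 33e-9],
]
let depths = depthField(from: times)
// depths[0][0] ≈ 1.5 m, depths[1][2] ≈ 4.95 m
print(depths)
```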


The iPad Pro, released in the spring, also has lidar.

Scott Stein / CNET

Isn’t this like the iPhone’s Face ID?

It is, but with longer range. The idea is the same: Apple's TrueDepth camera, which enables Face ID, also fires out an array of infrared lasers, but it can only work up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar is already in lots of other tech

Lidar is a technology that's springing up everywhere. It's used for self-driving cars, or assisted driving. It's used for robotics and drones. Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering 3D virtual objects into them. But it also has a pretty long history.

Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera that had infrared depth scanning, too. In fact, PrimeSense, the company that helped develop the Kinect's technology, was acquired by Apple in 2013. Now we have Apple's TrueDepth face scanner and rear lidar camera sensors.


Remember the Kinect?

Sarah Tew / CNET

The iPhone 12 Pro’s camera can work better with lidar

Time-of-flight cameras on smartphones are typically used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light, up to six times faster in low-light conditions. Lidar depth sensing will also be used to improve night portrait mode effects.

Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. Although that element hasn't been laid out yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way by apps.


Snapchat is already enabling AR lenses using the iPhone 12 Pro's lidar.

Snapchat

It will also significantly improve augmented reality

Lidar will allow the iPhone 12 Pro to launch AR apps much more quickly, and build a fast map of a room to add more detail. Many of Apple's AR updates in iOS 14 take advantage of lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
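For developers, ARKit exposes this directly. Here's a minimal sketch (assuming a RealityKit ARView) of turning on lidar-driven scene meshing and occlusion on a supported device:

```swift
import ARKit
import RealityKit

// Sketch of enabling lidar-driven scene meshing and occlusion in
// ARKit / RealityKit. Scene reconstruction requires a lidar-equipped
// device such as the iPhone 12 Pro or 2020 iPad Pro.
func startLidarAR(in arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Build a live 3D mesh of the room from the lidar point field.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Let real-world geometry hide virtual objects placed behind it.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(config)
}
```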

But there's extra potential beyond that, with a longer tail. Many companies are dreaming of headsets that will blend virtual objects and real ones: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will rely on having advanced 3D maps of the world to layer virtual objects onto.

Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there's also a possibility that people's own devices could eventually help crowdsource that info, or add extra on-the-fly data. Again, AR headsets like the Magic Leap and HoloLens already prescan your environment before layering things into it, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and could pave the way for Apple to make its own glasses eventually.


A 3D room scan from Occipital's Canvas app, enabled by the iPad Pro's depth-sensing lidar. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning could be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could open up these lidar-equipped phones and tablets as 3D-content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
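For a sense of how apps get at that measurement data, here's a minimal sketch using ARKit's sceneDepth frame semantic (iOS 14 and later, lidar devices only), which exposes a per-pixel depth map in meters:

```swift
import ARKit

// Sketch: reading the per-pixel depth map that lidar exposes through
// ARKit's sceneDepth frame semantic. Each value in the buffer is a
// distance in meters -- the raw material for room scans and measuring.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        // depthMap is a CVPixelBuffer of 32-bit floats, one distance
        // in meters per pixel; read the center pixel as an example.
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        if let base = CVPixelBufferGetBaseAddress(depthMap) {
            let rowPtr = base.advanced(by: (height / 2) * rowBytes)
            let row = rowPtr.assumingMemoryBound(to: Float32.self)
            print("Center of frame is \(row[width / 2]) meters away")
        }
    }
}
```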


Remember Google Tango? It had depth sensing, too.

Josh Miller / CNET

Apple isn't the first to explore this kind of technology on a phone

Google had this same idea in mind when it created Project Tango, an early AR platform that was only on two phones. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that did estimated depth sensing on cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more advanced successor.

