What is the LiDAR technology that NASA uses and is present in the iPhone 12 Pro?



Apple yesterday revealed its new iPhone 12 series, which includes four models of different sizes and capabilities, and the iPhone 12 Pro and iPhone 12 Pro Max feature a new technology called LiDAR.

Apple says LiDAR is a sensor called the LiDAR Scanner, which enables advanced capabilities in the Pro camera system, including Night mode images and autofocus that is up to six times faster in low light. Thanks to the detail of LiDAR's 3D map, Night mode portraits can be captured with the Wide camera, and the Neural Engine can keep the focus on your subject while creating an attractive background. The scanner also makes augmented reality experiences more engaging.

Apple says NASA will use LiDAR technology for its next Mars landing mission, while the iPhone 12 Pro uses a scanner based on the same LiDAR technology to measure the time it takes light to reach an object and be reflected back, allowing it to build a three-dimensional map of whatever area it is in.
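The time-of-flight principle described here can be sketched in a few lines. This is an illustrative calculation only, not Apple's implementation; the function name and the timing value are assumptions for the example:

```python
# Illustrative sketch of time-of-flight ranging (not Apple's code):
# a light pulse travels to an object and back, so the one-way
# distance is (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a one-way distance in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after about 13.34 nanoseconds bounced off an
# object roughly 2 metres away.
print(distance_from_round_trip(13.34e-9))  # roughly 2.0 metres
```

Because light covers a metre in just over three nanoseconds, the sensor has to resolve round-trip times on a nanosecond scale, which is why the article stresses the scanner's speed.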

Since it is ultra-fast and accurate, augmented reality apps can now turn any room into a near-realistic jungle or help you find the right size for a new pair of shoes you want to buy.

The LiDAR Scanner measures absolute depth by calculating how long it takes invisible light rays to travel from the transmitter to objects and back to the receiver. LiDAR technology works with the depth frameworks in iOS 14 to produce a massive amount of high-resolution data covering the camera's entire field of view.
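Applying that same round-trip calculation to every pixel in the field of view yields a depth map. The sketch below is a simplified illustration of the idea, not the iOS depth API; the grid shape and timing values are invented for the example:

```python
# Simplified illustration of turning per-pixel round-trip times into
# a depth map (metres). Real iOS depth data is exposed through
# Apple's depth frameworks; this is just the underlying arithmetic.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_map(round_trip_times):
    """Convert a 2D grid of per-pixel round-trip times (seconds)
    into a 2D grid of depths in metres."""
    return [[SPEED_OF_LIGHT * t / 2 for t in row] for row in round_trip_times]

# A tiny 2x2 "field of view": the top row hits a near object,
# the bottom row hits a far wall.
times = [[6.67e-9, 6.67e-9],
         [20.0e-9, 20.0e-9]]
depths = depth_map(times)  # top row ~1 m, bottom row ~3 m
```

In practice the scanner produces this kind of data at high resolution across the whole frame, which is what AR apps consume to understand the scene.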

The scanner continually measures scene details and refines the 3D map by emitting light pulses at nanosecond intervals, setting new standards for the world of augmented reality.

LiDAR technology can instantly recognize surfaces around it, so augmented reality applications can start directly by analyzing the scene and creating custom experiences.

LiDAR technology can place everything in its proper place so that it appears real, whether on the back of a chair or on a table, thanks to its ability to identify the details of every surface in the room. This is what makes augmented reality apps smarter. For example, if you use augmented reality to display a landscape inside your living room, you will find that the plants appear to grow on the floor of the room, their edges extending around pieces of furniture, in perfect proportion to the space in which you are presenting your experience.

This more accurate 3D map enables iPhone 12 Pro to better identify objects and determine which ones appear in front of others. Thus, if augmented reality characters run through the entrance to your home, they will pass precisely behind the car and in front of the tree, so you always feel at the heart of the experience in all its details.
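The occlusion effect described here boils down to a per-pixel depth comparison: a virtual object is hidden wherever a real surface in the depth map sits closer to the camera. The following is a bare-bones sketch of that test; the function name and distances are illustrative, not Apple's API:

```python
# Simplified per-pixel occlusion test (illustrative, not ARKit's API):
# if the real-world surface measured by LiDAR is nearer to the camera
# than the virtual object, the real surface hides the virtual one.
def is_occluded(virtual_depth_m: float, scene_depth_m: float) -> bool:
    """Return True when a real surface occludes the virtual object."""
    return scene_depth_m < virtual_depth_m

# A character 4 m away passes behind a car 3 m away...
print(is_occluded(4.0, 3.0))   # True: the car hides the character
# ...and in front of a tree 6 m away.
print(is_occluded(4.0, 6.0))   # False: the character stays visible
```

Running this comparison for every pixel of every frame is what lets AR characters weave convincingly behind and in front of real objects.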


