iPhone 12 Pro: What's behind the black dot below the camera
Apple unveiled the iPhone 12 Pro on Tuesday. A black dot is now hidden among the three cameras on the back. Here is the technology behind it.
Apple announced the iPhone 12 on Tuesday evening, in no fewer than four variants. In addition to the standard model, there is a smaller Mini variant and two Pro models with better equipment. You can read here what distinguishes the models from each other.
One of the main differences is found in the camera. While Apple once again uses the combination of wide-angle and ultra-wide-angle lenses for the base models, the two Pro models additionally get an improved telephoto lens.
The LiDAR sensor in the iPhone 12 Pro
A closer look at the camera of the two Pro models reveals a small black dot between the ultra-wide-angle lens and the flash. This is the so-called LiDAR sensor. LiDAR is an acronym for "Light Detection and Ranging", a method related to radar for optical distance and speed measurement. Unlike radar, however, it uses laser beams that cannot be perceived by the human eye. The sensor already made its premiere in the spring in the iPad Pro (2020).
Since the speed of light is constant, the sensor can determine exactly how long the light takes to hit an object and return. From this data, combined with input from the cameras and the motion sensors, the distance to the object is calculated. This process happens thousands of times per second, so the receiver, in this case the iPhone, can produce an image with different depths. This requires a lot of computing power, which the A14 chip delivers. The technology works both indoors and outdoors, at distances of several meters.
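The underlying time-of-flight principle can be sketched in a few lines: since light travels to the object and back, the distance is half the round-trip path. This is an illustrative calculation only, not Apple's implementation; the function name and example timing are made up for demonstration.

```python
# Time-of-flight principle used by LiDAR sensors:
# distance = (speed of light * round-trip time) / 2
# Hypothetical sketch for illustration, not Apple's actual implementation.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the distance in meters to an object, given the round-trip
    time of a laser pulse. The pulse travels out and back, so the
    one-way distance is half the total path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after about 20 nanoseconds corresponds to roughly 3 m:
print(round(distance_from_round_trip(20e-9), 2))  # 3.0
```

Because light covers about 30 centimeters per nanosecond, the sensor has to resolve round-trip times on the order of nanoseconds, which is why the measurement and the fusion with camera data demand so much computing power.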
Better night photos and AR
For most users, the biggest benefit will be in photography. Apple says the iPhone 12 Pro delivers better images in low-light conditions, and LiDAR plays a crucial role in this. Autofocus in low-light situations, such as dim indoor lighting, is said to be up to six times faster than before.
The new technology enables not only low-light effects, however: virtual objects can also be positioned precisely in space. Apple has focused for years on so-called augmented reality, and more and more apps implement such functions. Snapchat, for instance, now offers new effects with which virtual objects can be placed in real space. Since the technology is now arriving in the iPhone, it is likely only a matter of time before other app providers add AR applications as well.
Read also:
Interview with App developer: Why Augmented Reality is the next big thing for Apple