When the iPhone 12 launched, one of its features was a LiDAR sensor. Apple touted the technology as an aid to the camera's autofocus, even in low light. However, a recent video uploaded by the BBC shows that it can do more than just that.
The video focuses on how tech can improve the lives of people living with certain disabilities, such as visual impairments. It shows just how accurate the iPhone's LiDAR sensor is when paired with AI software, detecting and identifying objects and people around the user.
In the video, blind reporter Lucy Edwards walks around her town with the iPhone 12's camera pointed in front of her. The LiDAR sensor picks up on her surroundings, telling her how many people are ahead of her, describing objects in store windows, and even counting down as people walk towards her.
We're not sure if this was an intended use for the LiDAR sensor, although given how proud Apple is of its accessibility features, perhaps it was. Either way, it is an impressive demonstration that could change the way people with disabilities lead their lives and use technology.