Apple’s new feature uses the LiDAR technology and its wide-angle camera to help people with visual impairments / Digital Information World

LiDAR stands for Light Detection and Ranging. It is a system in which a laser emits pulsed light and a receiver measures the time it takes for that light to bounce off an object and return, which reveals the object's distance. Apple's iPhone 12 Pro and 12 Pro Max have a LiDAR scanner built in, and Apple has recently introduced a new feature in its latest iOS beta. The feature uses the iPhone's LiDAR scanner together with its wide-angle camera to help visually impaired people: it can detect people nearby and tell users how far away those people are.
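The time-of-flight principle behind LiDAR can be sketched in a few lines: distance is the speed of light multiplied by the round-trip time, divided by two (since the pulse travels to the object and back). The numbers below are illustrative, not taken from Apple's hardware.

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Estimate the distance to a reflecting surface from the
    round-trip time of a light pulse (time-of-flight)."""
    # Divide by two because the pulse travels out and back.
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse returning after ~13.3 nanoseconds corresponds to
# roughly 2 metres (about 6.5 feet).
print(round(distance_from_round_trip(13.3e-9), 2))
```

This is why LiDAR can measure distance far more precisely than a camera alone: the timing measurement gives depth directly rather than inferring it from the image.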

Judging the distance to other people is a basic visual task that people with visual impairments struggle with, especially now, during a pandemic in which people are advised to stay at least 6 feet apart. For someone who is blind or has another form of visual impairment, maintaining that distance is difficult. Apple's new feature, expected in an upcoming stable iOS release, will help these users judge the distance between themselves and the people around them, and will alert them immediately when someone approaches.

This feature is currently in beta testing, and it is designed to help visually impaired people in several other ways too.

This new feature will be part of the iPhone's Magnifier app, and it alerts users to other people in their surroundings in several different ways.

First, it uses stereo sound cues: when someone in the user's surroundings moves, whether closer or farther away, the camera detects the movement, and the person's direction relative to the camera's view is mapped to the corresponding position in the stereo field.
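The idea of mapping a detected person's direction to stereo audio can be sketched with a constant-power pan law. The angle convention and the panning law here are assumptions for illustration; Apple has not published how the feature pans its audio cues.

```python
import math

def stereo_gains(angle_degrees: float) -> tuple[float, float]:
    """Map a horizontal angle (-90 = far left, +90 = far right,
    0 = straight ahead) to (left, right) channel gains using a
    constant-power pan law. Illustrative only."""
    # Clamp the angle and normalise it to a pan position in [0, 1].
    pan = (max(-90.0, min(90.0, angle_degrees)) + 90.0) / 180.0
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right

# A person straight ahead is heard equally in both ears;
# a person to the right is heard mostly in the right ear.
centre = stereo_gains(0)
right_side = stereo_gains(90)
```

Constant-power panning keeps perceived loudness steady as the source moves across the stereo field, which is why it is the usual choice for positional audio cues.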

Secondly, users can set a tone for a specific distance. For instance, if they set the threshold at 6 feet, they will hear one tone while the detected person is more than 6 feet away, and a different tone once that person comes within 6 feet. This makes it easier to maintain social distancing.
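The two-tone threshold behaviour described above amounts to a simple comparison against the user's chosen distance. The tone names and the feet-based threshold below are illustrative assumptions, not Apple's implementation.

```python
def tone_for_distance(distance_feet: float,
                      threshold_feet: float = 6.0) -> str:
    """Return which alert tone to play for a detected person,
    based on whether they are beyond or within the threshold."""
    if distance_feet > threshold_feet:
        return "far-tone"   # person is outside the set distance
    return "near-tone"      # person is within the set distance

print(tone_for_distance(8.0))  # far-tone
print(tone_for_distance(4.5))  # near-tone
```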

For someone who has a hearing impairment in addition to a visual one, there is a haptic pulse that beats faster as another person gets closer. The frequency of this pulse lets a user with both visual and hearing impairments know that someone is nearby.
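A haptic pulse that speeds up as someone approaches can be sketched as an interval that shrinks with distance. The linear mapping and the specific bounds below are assumptions for illustration; Apple has not documented the actual curve.

```python
def pulse_interval_seconds(distance_m: float,
                           min_interval: float = 0.1,
                           max_interval: float = 1.0,
                           max_distance_m: float = 5.0) -> float:
    """Return the time between haptic pulses: shorter intervals
    (faster pulsing) for closer people. Illustrative mapping."""
    # Clamp the distance to [0, max_distance_m] and scale linearly.
    fraction = max(0.0, min(1.0, distance_m / max_distance_m))
    return min_interval + fraction * (max_interval - min_interval)

# Someone 4 m away produces slower pulses than someone 1 m away.
far = pulse_interval_seconds(4.0)   # 0.82 s between pulses
near = pulse_interval_seconds(1.0)  # 0.28 s between pulses
```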

For people who have some remaining vision, an on-screen arrow shows the presence and direction of other people, along with their distance, on the iPhone's screen.

One important thing to remember is that the wide-angle camera does not work in darkness, so ambient light is required for the feature to gauge the presence and distance of people in the surroundings.


