Understanding sign language requires taking a course and learning how to read it. Since that isn't something everyone will want to learn, it can be difficult for people who are hard of hearing or who have a speech impairment to communicate with others.
However, maybe in the future we won't have to learn how to read sign language. Researchers at the University at Buffalo have modified a pair of noise-cancelling headphones so that, when paired with a smartphone, they can "see" and translate American Sign Language.
The setup, dubbed SonicASL, uses Doppler technology to detect tiny fluctuations in acoustic soundwaves caused by someone signing with their hands. So far, the system has proven highly effective, achieving 93.8% accuracy in tests performed both indoors and outdoors. However, it should be noted that the test vocabulary was limited to 42 words, and the researchers found that accuracy dropped to 90.6% when the system was asked to recognize simple sentences.
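The article doesn't detail SonicASL's signal processing, but the Doppler principle behind it is simple to illustrate. The sketch below is a back-of-the-envelope calculation, not the researchers' actual method: the emitter frequency and hand speed are hypothetical values chosen to show just how small the frequency shifts from a moving hand are.

```python
# Illustrative physics only: shows the Doppler effect SonicASL relies on,
# not the system's actual signal-processing pipeline.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C


def doppler_shift(emit_freq_hz: float, hand_velocity_ms: float) -> float:
    """Approximate frequency shift of a tone reflected off a moving hand.

    For a reflector moving at velocity v (positive = toward the microphone),
    the round-trip Doppler shift is about 2 * v * f0 / c.
    """
    return 2.0 * hand_velocity_ms * emit_freq_hz / SPEED_OF_SOUND


# Assumed values: a 20 kHz tone and a hand moving at 0.5 m/s.
shift = doppler_shift(20_000, 0.5)
print(f"Doppler shift: {shift:.1f} Hz")
```

With these assumed numbers, the shift is only about 58 Hz on a 20,000 Hz tone, which makes clear why the system has to detect such tiny acoustic fluctuations.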
There is still a lot of work to be done, but the researchers are optimistic that the proof-of-concept has the potential to improve communication between the deaf and hearing populations. They also hope that in the future, SonicASL could be used to read facial expressions.