Artificial intelligence makes headlines almost daily. While its uses are many, and sometimes of questionable value, here is an advance that could genuinely improve the lives of a great number of people. AI is booming in medicine: a team of researchers at Florida Atlantic University has just developed a model capable of decoding American Sign Language (ASL) in real time. This breakthrough could profoundly change how deaf and hard-of-hearing people live and interact with technology.
Breaking communication barriers with AI
Using computer vision, the researchers developed an AI model that transcribes ASL alphabet gestures into text with 98% accuracy. To get there, they trained the model on 29,820 static images of hand gestures, combined with motion-tracking technology based on 21 landmarks placed at strategic points on the hand for sign language.
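The team's code is not published in the article, but 21 landmarks per hand is the output of a MediaPipe-style hand tracker, so the sketch below assumes such a pipeline: it extracts the 21 (x, y, z) landmarks from a static image and flattens them into a 63-value feature vector that a letter classifier could consume. The file path and usage are placeholders, not the researchers' actual setup.

```python
# Minimal sketch, assuming a MediaPipe-style hand-landmark pipeline
# (not the researchers' published code).
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands

def extract_landmarks(image_path):
    """Return a flat vector of 21 (x, y, z) hand landmarks, or None if no hand is found."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    # MediaPipe expects RGB input; OpenCV loads images as BGR.
    rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        results = hands.process(rgb)
    if not results.multi_hand_landmarks:
        return None
    hand = results.multi_hand_landmarks[0]
    # 21 landmarks x 3 coordinates = a 63-value feature vector for a classifier.
    return np.array([[lm.x, lm.y, lm.z] for lm in hand.landmark]).flatten()

# Hypothetical usage: build a feature vector for one training image.
features = extract_landmarks("asl_letter_A.jpg")  # placeholder path
if features is not None:
    print(features.shape)  # (63,)
```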
At this stage, the model already achieves remarkable accuracy. The next step for the Florida Atlantic University researchers is to expand the training dataset to make it even more reliable. The team also says it is working on optimizing the system so that it can run on less powerful hardware such as smartphones. The ultimate goal of this research is to deploy a real-time sign language translation tool that breaks down communication barriers between deaf and hard-of-hearing people and the rest of the population. The possible uses for such a system are many, and it is easy to imagine how valuable it could be in education or healthcare.
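The article does not say which framework the team uses or how it plans to shrink the model. Purely as an illustration of the kind of optimization involved, the sketch below shows one common approach to running a trained model on a phone: post-training quantization with TensorFlow Lite. The model file name is hypothetical.

```python
# Illustrative only: post-training quantization with TensorFlow Lite,
# one common way to make a trained model small enough for smartphones.
# The researchers' actual framework and model are not public here.
import tensorflow as tf

# Placeholder: a trained Keras classifier over 63-value landmark vectors.
model = tf.keras.models.load_model("asl_landmark_classifier.keras")  # hypothetical file

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enables weight quantization
tflite_model = converter.convert()

# The resulting .tflite file can be bundled into a mobile app for on-device inference.
with open("asl_landmark_classifier.tflite", "wb") as f:
    f.write(tflite_model)
```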