Visual system interprets sign languages

Spanish sign language is used by over 100,000 people with hearing impairments and is made up of hundreds of signs. CVC-UAB researchers Sergio Escalera, Petia Radeva and Jordi Vitria selected over twenty of these signs to develop a new visual interpretation system which allows deaf people to carry out consultations in the language they commonly use.

Signs can vary slightly depending on the user. Project researchers took this into account during trials carried out with different people to help the system "become familiarised" with this variability. The signs recognised by the system were chosen to allow deaf people to maintain a basic conversation, including asking for help or directions. "For them it is a non-artificial way of communicating, and at the same time they can engage with people who do not speak sign language, since the system translates the symbols into words in real time," Sergio Escalera said.
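The article does not explain how this adaptation to different signers is implemented. A minimal sketch of one common approach, not necessarily the one used by the CVC-UAB team, is to pool example recordings from several signers for each sign and match a new sequence against all of them with an elastic distance such as dynamic time warping, which also absorbs differences in signing speed. All feature values and sign names below are made up for illustration.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping between two feature sequences of shape (time, features).
    Tolerates sequences of different lengths, i.e. different signing speeds."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def classify(sequence, templates):
    """templates: {word: [sequences recorded from different signers]}.
    Pooling several signers per word lets the matcher absorb personal variation."""
    best_word, best_dist = None, np.inf
    for word, examples in templates.items():
        for example in examples:
            dist = dtw_distance(sequence, example)
            if dist < best_dist:
                best_word, best_dist = word, dist
    return best_word

# Hypothetical example: two signers' versions of two signs, as 2-D hand positions.
templates = {
    "help":  [np.random.rand(30, 2), np.random.rand(35, 2)],
    "where": [np.random.rand(28, 2), np.random.rand(40, 2)],
}
print(classify(np.random.rand(32, 2), templates))
```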

The hardware includes a camera which records image sequences when it detects the presence of a user wanting to make a consultation. A computer vision and automatic learning system detects face, hand and arm movements, as well as their displacement across the screen, and feeds these into a classifier which matches each movement to the word associated with the sign.
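The article only outlines this architecture, so the following is a minimal sketch of how such a pipeline could be wired together with OpenCV: a face detector stands in for user-presence detection, a crude skin-colour centroid stands in for hand and arm tracking, and classify() is a placeholder for the trained model. None of this is the project's actual code.

```python
import cv2
import numpy as np

# Face detector, used only to sense that a user is standing in front of the camera.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def hand_feature(frame):
    """Crude stand-in for hand/arm tracking: centroid of skin-coloured pixels."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))  # rough skin-tone range
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return np.zeros(2)
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def classify(sequence):
    """Placeholder for the trained model that maps a movement sequence to a word."""
    return "unknown"

cap = cv2.VideoCapture(0)
sequence = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    user_present = len(face_detector.detectMultiScale(gray, 1.1, 5)) > 0
    if user_present:
        sequence.append(hand_feature(frame))      # record while the user signs
    elif sequence:
        print("recognised word:", classify(np.array(sequence)))
        sequence = []                             # user gone: translate and reset
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```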

One aspect worth highlighting is that the system can be adapted to any other sign language, since the methodology used is general: it would only need to be reprogrammed with the signs used in that specific language. The number of signs the system can recognise is also scalable, although researchers admit that adding new signs makes them harder to differentiate.
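Since the article says only the sign set needs reprogramming, the language independence can be thought of as a data swap: the capture and recognition code stays fixed and only the recorded sign templates change. The directory layout and file format below are a hypothetical convention, not the project's actual format.

```python
from pathlib import Path
import numpy as np

def load_templates(directory):
    """Load one recorded feature sequence per .npy file; the file stem is the word.
    The directory layout is a hypothetical convention, not the project's format."""
    templates = {}
    for path in Path(directory).glob("*.npy"):
        templates.setdefault(path.stem, []).append(np.load(path))
    return templates

# The capture and recognition code never changes; only the template set does,
# so porting the system to another sign language means re-recording its signs.
templates = load_templates("templates/lse")    # Spanish Sign Language signs
# templates = load_templates("templates/bsl")  # another sign language instead
```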

Applications such as the one developed by the CVC-UAB researchers require extreme precision in the identification phase and are very difficult to configure, given that the surroundings in which they are used involve changes in light and shadow, different physiognomies, and varying speeds at which the signs are formed.

Other similar projects have been developed in the past, but most of them failed or were not reliable enough because of the high variability of uncontrolled surroundings. For this project to succeed, it was necessary to establish a fixed point at which individuals formed the signs and to avoid recording from different viewpoints.
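One simple way to impose that constraint in software is to analyse only a fixed region of frames taken from a fixed camera, so every sign is formed at the same point and seen from a single viewpoint. The coordinates below are illustrative, not the project's actual values.

```python
import cv2

# Fixed signing region (x, y, width, height) in frames from a fixed camera,
# so every sign is formed at the same point and seen from a single viewpoint.
ROI = (160, 80, 320, 360)  # illustrative coordinates, not the project's values

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    x, y, w, h = ROI
    signing_area = frame[y:y + h, x:x + w]  # only this crop is ever analysed
    cv2.imwrite("signing_area.png", signing_area)
cap.release()
```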

The system was recently presented as a prototype in the final phase of a European project, and researchers are already working on new phases of the project, such as using two cameras with the aim of recognising even more complex signs and complementing the information with facial characteristics. To carry this out, researchers worked in close collaboration with several members of the Catalan Federation of Deaf People, FECOSA.

Provided by Universitat Autonoma de Barcelona

