Communications of the Association for Information Systems

Abstract

Although our digitalized society can foster social inclusion and integration, numerous communities still face unequal opportunities. This is also the case for deaf people. About 750,000 deaf people in the European Union alone and over 4 million in the United States face daily challenges in communication and participation, in leisure activities and, more importantly, in emergencies. To provide equal environments and allow people with hearing impairments to communicate in their native language, this paper presents an AI-based sign language translator. We adopted a transformer neural network capable of analyzing over 500 data points from a person's gestures and face to translate sign language into text. We designed a machine learning pipeline that enables the translator to evolve, build new datasets, and train sign language recognition models. As a proof of concept, we instantiated a sign language interpreter for emergency calls covering over 200 phrases. The overall goal is to support people with hearing impairments by enabling them to participate in economic, social, political, and cultural life.
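To make the described architecture concrete, the following is a minimal sketch of one plausible realization, assuming MediaPipe Holistic for landmark extraction (468 face, 33 pose, and 2 x 21 hand landmarks, i.e., over 500 data points per frame) and a PyTorch transformer encoder that maps a landmark sequence to one of the roughly 200 emergency-call phrases. All names, dimensions, and the classification head are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: landmark extraction + transformer phrase classifier.
# Assumptions: MediaPipe Holistic for keypoints, PyTorch for the model.
import mediapipe as mp
import numpy as np
import torch
import torch.nn as nn

N_LANDMARKS = 468 + 33 + 21 + 21     # face + pose + both hands (MediaPipe Holistic)
FEAT_DIM = N_LANDMARKS * 3           # x, y, z per landmark
N_PHRASES = 200                      # assumed size of the emergency-call phrase set


def extract_keypoints(results) -> np.ndarray:
    """Flatten one video frame's holistic landmarks into a feature vector."""
    def flat(lms, count):
        if lms is None:                       # body part not detected in this frame
            return np.zeros(count * 3, dtype=np.float32)
        return np.array([[p.x, p.y, p.z] for p in lms.landmark],
                        dtype=np.float32).flatten()
    return np.concatenate([
        flat(results.face_landmarks, 468),
        flat(results.pose_landmarks, 33),
        flat(results.left_hand_landmarks, 21),
        flat(results.right_hand_landmarks, 21),
    ])


class SignTransformer(nn.Module):
    """Transformer encoder over a sequence of per-frame landmark vectors."""
    def __init__(self, d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.proj = nn.Linear(FEAT_DIM, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, N_PHRASES)

    def forward(self, frames):               # frames: (batch, time, FEAT_DIM)
        x = self.encoder(self.proj(frames))
        return self.head(x.mean(dim=1))      # logits over the phrase vocabulary


# Usage example: classify a 30-frame clip of stand-in features.
model = SignTransformer()
clip = torch.randn(1, 30, FEAT_DIM)
print(model(clip).shape)                     # torch.Size([1, 200])
```

In a full pipeline, extract_keypoints would be applied to each frame produced by mp.solutions.holistic.Holistic, and the resulting sequences would feed the training and dataset-building stages the paper describes.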

