Development of a Mobile Application for Interactive Sign Language Learning Using AI-Powered Gesture Recognition and Gamification

Authors

  • Samom Pensri Lawana, Bachelor of Engineering Program in Computer Engineering, Panyapiwat Institute of Management, Thailand

Keywords

Sign Language Learning, Mobile Application, Artificial Intelligence, Gesture Recognition, Gamification

Abstract

Communication barriers remain a significant challenge for the deaf and hard-of-hearing (DHH) community, primarily due to the limited knowledge of sign language among the general population. Existing sign language learning methods often lack interactivity, accessibility, and real-time feedback, making the learning process less effective and less engaging. This research aims to develop an innovative mobile application for sign language learning that integrates artificial intelligence (AI)-powered gesture recognition, gamification, and multilingual support to enhance user experience and learning outcomes. The study employs a user-centered design approach to ensure accessibility for both DHH individuals and hearing users. AI-driven gesture recognition provides real-time feedback, allowing users to practice sign language with automated corrections. Gamification techniques, including quizzes, challenges, and progress tracking, are used to boost engagement and motivation. Additionally, the application supports multiple sign languages, addressing the limitation of existing apps that focus on only one variant. The effectiveness of the application is evaluated through usability testing, user feedback, and pre- and post-learning assessments. Results are expected to show improvements in sign language proficiency, higher engagement levels, and increased accessibility compared to traditional learning methods. The discussion will highlight the advantages of AI-based feedback, the impact of gamification on motivation, and potential areas for future improvement. This research contributes to the advancement of sign language education by leveraging modern technology to create a more interactive, inclusive, and efficient learning platform. By addressing current limitations, the proposed application has the potential to bridge communication gaps and promote greater social inclusion for the DHH community.
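The feedback-and-gamification loop described in the abstract could be sketched roughly as follows: detected hand landmarks (such as the 21 normalized points a tracker like MediaPipe Hands emits) are compared against a stored reference template for the target sign, and the resulting error drives both the corrective message and the gamified score. This is a minimal illustrative sketch, not the paper's actual implementation; the landmark format, distance threshold, and scoring rules are all assumptions.

```python
# Minimal sketch of the real-time feedback loop: compare a user's hand
# landmarks against a reference template, then update gamified progress.
# Landmark format, threshold, and scoring rules are illustrative assumptions.
import math


def landmark_distance(user, template):
    """Mean Euclidean distance between two equal-length landmark lists."""
    assert len(user) == len(template)
    return sum(math.dist(u, t) for u, t in zip(user, template)) / len(user)


def feedback(user, template, threshold=0.05):
    """Return (is_correct, message) for one practice attempt."""
    d = landmark_distance(user, template)
    if d <= threshold:
        return True, "Correct sign! Well done."
    return False, f"Not quite (error {d:.3f}). Adjust your hand shape and retry."


class ProgressTracker:
    """Gamified progress: base points per correct sign plus a streak bonus."""

    def __init__(self):
        self.points = 0
        self.streak = 0

    def record(self, correct):
        if correct:
            self.streak += 1
            self.points += 10 + 2 * (self.streak - 1)  # streak bonus
        else:
            self.streak = 0  # a miss resets the streak


# Tiny example with a two-landmark "sign" for brevity.
template = [(0.5, 0.5), (0.6, 0.4)]
good_attempt = [(0.51, 0.5), (0.6, 0.41)]
bad_attempt = [(0.2, 0.9), (0.8, 0.1)]

tracker = ProgressTracker()
for attempt in (good_attempt, good_attempt, bad_attempt):
    ok, _msg = feedback(attempt, template)
    tracker.record(ok)
```

In a real pipeline, `template` would be replaced by per-sign data captured from fluent signers, and the distance check by a trained classifier; the structure of the loop, however, stays the same: recognize, compare, correct, and reward.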




Published

2024-07-30

How to Cite

Lawana, S. P. (2024). Development of a Mobile Application for Interactive Sign Language Learning Using AI-Powered Gesture Recognition and Gamification. Idea: Future Research, 2(2), 72–80. Retrieved from https://idea.ristek.or.id/index.php/idea/article/view/25