UX Designer


An Augmented Reality application integrated with hand-tracking technology to help learn Sign Language

Focus: AR Application Development | Accessibility | Mixed Reality Design
Tools: Unity Engine, Blender, and Magic Leap Device
Duration: Oct to Dec 2020

hero image

Problem Statement

Communication gap between Deaf or Hard-of-Hearing and Hearing individuals

Deaf or Hard of Hearing (DHH) individuals use sign language as their mode of communication, while hearing individuals use spoken language, creating a communication gap between the two communities. As these communities share the same spaces, such as being co-enrolled in academic settings or working as colleagues in the same company, it becomes all the more important to reduce this communication gap.

Project Overview

How can we improve the effectiveness of learning sign language using advanced technology?

Learning sign language is quite different from learning a spoken language: spoken language relies on speech-auditory senses, while sign language relies on visual-manual senses. Sign language, along with facial expressions and lip movements, depends heavily on hand gestures in 3-dimensional space. Currently available resources like images and videos are restricted to the 2-dimensional domain, making it difficult to master sign language using them. Utilizing Mixed Reality can help us break this 2D barrier.


Mixed Reality can serve as a potential solution to learn sign language

We can look to Mixed Reality as a potential solution. The goal of this project is to develop an Augmented Reality (AR) application prototype integrated with Mixed Reality (MR) technologies like hand-tracking. The application helps the user visualize the hand gestures and movement of a particular sign in AR and performs hand-tracking to provide real-time feedback on whether the user is reproducing the sign correctly.

To limit the scope of the prototype, this project focuses on self-learning British Sign Language (BSL) numbers 0 to 10.


The application helps users self-learn British Sign Language (BSL) numbers from 0 to 10 by visualizing the signs in AR, while hand-tracking technology provides real-time feedback.

Market Scope

According to Global Market Insights, AR could surge up to

Target Domain

Sign language is the primary mode of communication for individuals from the Deaf and Hard-of-Hearing community

Key Points

A viable solution should provide:

An interactive learning method

Remote and constant availability

No need for expert supervision

Real-time feedback

User Flow


Home Screen

All the signs are displayed on the home screen. The user selects the sign they want to learn, and that sign opens on the next screen.

View Sign Screen

The selected sign is displayed, and the user can move to the previous or next sign in the list. From here, the user can either view the hand movement of the sign by clicking the “View Animation” button or practice reproducing it.

Practice Sign Screen

If the Practice Sign button is selected, the system moves to the next screen, where the user is asked to reproduce that particular sign. While the user is producing the sign, the system performs gesture recognition to determine whether the user has reproduced the sign correctly.


Created Hand Models

To create the desired application, I first needed to make the instructional hands and give them animations. The hand models, their rigging, and animations were all done using Blender 2.9.

Engine and device setup

To display these hand models on the Magic Leap device, the Unity engine was used. Before that, the two technologies had to be integrated: Magic Leap runs Lumin OS, so it was important to include the Magic Leap Lumin SDK v0.24.1 and the Magic Leap Unity Package v0.24.2.

Uploading models on Unity engine

Once the hand models, rigging, and animations were finished in Blender, they were exported and imported into the Unity project as assets.

Scenes setup and logic scripts

A plane needed to be defined in order to display the signs from the user’s perspective. All the signs were placed on the plane, each with a text box describing its details. A script, AnimScript, was created to display the animation frame in which each hand model forms its sign.
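As a rough illustration of what AnimScript does, the logic can be sketched as a lookup from each sign to the animation frame where the hand model forms it. The actual project used a Unity C# script; the sign names and frame numbers below are made-up placeholders for illustration only.

```python
# Illustrative sketch of the AnimScript idea: each BSL number sign maps to
# the animation frame where the instructional hand model forms that sign.
# Sign names and frame numbers are hypothetical, not project values.
SIGN_FRAMES = {
    "BSL_0": 0,
    "BSL_1": 24,
    "BSL_2": 48,
}

def frame_for_sign(sign):
    """Return the animation frame to jump to when displaying a given sign."""
    if sign not in SIGN_FRAMES:
        raise KeyError(f"No animation frame registered for {sign!r}")
    return SIGN_FRAMES[sign]
```

In Unity terms, the equivalent would be jumping the Animator to the keyframe (or playing the clip) associated with the selected sign.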

Magic Leap hand-tracking integration

Magic Leap’s built-in hand-tracking technology detects the user’s hand pose and shows whether the user is performing the sign correctly. If, within 3 seconds, the user produces the correct sign with over 90% KeyPoseConfidence, a success message is shown; otherwise, a wrong-sign message appears on the screen.
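The check described above can be sketched as follows, under one reading of the rule: the user has a 3-second window to produce the target key pose with confidence above 90%. The class, pose names, and per-frame update pattern are illustrative assumptions, not the actual Magic Leap C# API (which exposes key poses and KeyPoseConfidence through its hand-tracking SDK).

```python
# Hypothetical sketch of the practice-mode check. Each tracking frame feeds
# the detected key pose, its confidence, and the frame's elapsed time.
TIME_LIMIT = 3.0       # seconds the user has to produce the sign
MIN_CONFIDENCE = 0.90  # required KeyPoseConfidence threshold

class SignChecker:
    """Decides whether the user reproduced the target sign in time."""

    def __init__(self, target_pose):
        self.target_pose = target_pose
        self.elapsed = 0.0

    def update(self, detected_pose, confidence, delta_time):
        """Process one tracking frame; returns "correct", "wrong", or "waiting"."""
        self.elapsed += delta_time
        if detected_pose == self.target_pose and confidence > MIN_CONFIDENCE:
            return "correct"            # success message is shown
        if self.elapsed >= TIME_LIMIT:
            return "wrong"              # wrong-sign message is shown
        return "waiting"                # keep tracking
```

In the Unity implementation, `update` would be driven from a MonoBehaviour's per-frame callback with the device's reported key pose and confidence.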

Final Result

Future Scope

Currently, Magic Leap only detects 8 key poses, which limits the scope of the project; I was only able to detect a few of the BSL signs. BSL numbers 0, 1, 5, 8, and 10 can be detected using the Magic Leap key poses, while BSL numbers 2, 3, 4, 6, 7, and 9 cannot. I tried using key points instead; however, Magic Leap does not provide a Boolean variable indicating whether the desired key points are available.

This project focused on developing a functional model. In the future, we can work on the form of the project and improve the user experience.

The prototype can be extended into a complete application that helps users learn a variety of signs from different sign languages. It can be further extended to include two-handed signs. Moreover, each sign’s position can be programmed to be displayed relative to the user’s body, helping users better understand the sign’s orientation.