Abstract

AR to learn sign language

AR Design & Development

Designed and developed an Augmented Reality prototype to help people self-learn sign language.

Duration: Feb 2022 – May 2022
Team: Solo
AR tutor to teach sign language

Problem

Learning sign language, a 3D spatial language, using 2D resources.

The COVID-19 lockdown emphasized the challenges of learning sign language virtually due to its intricate three-dimensional nature, including hand and finger movements, body positioning, and specific motions. Existing digital resources, limited to 2D screens, struggle to convey these nuances effectively.

Illustration of a person learning sign language on a 2D screen

Project Goal

Extended Reality, crafted in 3D, can potentially overcome these barriers.

A 3D augmented human, extending into the learner's space, that the learner can zoom in/out, rotate, adjust sign speed on, and switch between front view and top view can enhance comprehension of hand orientation. A hands-free solution also makes it possible to practice sign language while moving freely within the space.

Illustration of a man using AR headset to learn sign language.

Project Demo

In this project, I created an AR prototype on the Magic Leap headset, utilizing Mixed Reality (MR) technology. The app features a 3D augmented human model that teaches sign language through hand gestures and movements.

Video demo of the Abstract application on Magic Leap headset

Design Process

The Abstract project involved in-depth research into users and technologies, then stitched the feasible technical and creative aspects together.

Empathize & Define
User research
Pain points
Personas
User flow
Research & Ideate
Market analysis
Prioritization
Technology feasibility
Design
3D modeling
Animations
User Interface
Development
Unity Engine compatibility
C# functionality scripting
SDKs + libraries
Test
Usability testing
Feedback analysis
Data analysis

Project Breakdown

Creating 3D augmented human

The first step in creating the app was designing the 3D augmented human character. Blender 2.9 and the MB-LAB 1.7.8 plugin were used for meticulous crafting, including hair and clothing design and modifications within Blender.

3D human created and rigged in Blender
Sign animations

To animate American Sign Language signs, the 3D character is rigged and animated in Blender; the motion animations are then saved in the FBX file format.
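The export step can be scripted with Blender's Python API. The sketch below shows the kind of call involved; it runs inside Blender (not standalone), and the output path is hypothetical:

```python
# Blender Python sketch: export the rigged character and its baked
# sign animations to FBX for import into Unity.
import bpy

bpy.ops.export_scene.fbx(
    filepath="signs/asl_signs.fbx",  # hypothetical output path
    use_selection=False,             # export the whole scene
    bake_anim=True,                  # bake armature animation into the FBX
    add_leaf_bones=False,            # Unity does not need leaf bones
)
```

Disabling leaf bones keeps the armature hierarchy clean when the FBX is read back by Unity's importer.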

'Text911' feature enabled
Engine + Device setup

To display the 3D character on the Magic Leap, the Unity engine is set up with the essential SDKs, including Magic Leap Lumin SDK v0.24.1 and Magic Leap Unity Package v.24.2, ensuring compatibility with Magic Leap's Lumin OS.

Importing the 3D model and sign animations in Unity
Functionalities

Character functionality, such as zoom, sign speed, rotation, and view adjustments, is implemented via a C# script in Unity. An animator controller connects to the sign FBX file, seamlessly assigning signs to the 3D character.
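The controls described above can be sketched as a small Unity MonoBehaviour. This is a simplified illustration, not the project's actual script; the class and member names are hypothetical, and the top-view switch is reduced to a simple rotation:

```csharp
using UnityEngine;

// Hypothetical sketch of the character controls: zoom, sign speed,
// rotation, and a front/top view toggle.
public class SignTutorController : MonoBehaviour
{
    public Animator signAnimator;   // Animator wired to the sign FBX clips
    public float zoomStep = 0.1f;   // scale change per zoom action
    public float rotationStep = 15f; // degrees per rotate action

    // Zoom: uniformly scale the augmented human up (+1) or down (-1).
    public void Zoom(float direction)
    {
        transform.localScale += Vector3.one * zoomStep * direction;
    }

    // Sign speed: Animator.speed scales animation playback.
    public void SetSignSpeed(float speed)
    {
        signAnimator.speed = speed;
    }

    // Rotate the character around its vertical axis.
    public void Rotate(float direction)
    {
        transform.Rotate(Vector3.up, rotationStep * direction);
    }

    // Front/top view toggle, simplified here to tilting the character;
    // a fuller version would reposition it relative to the headset camera.
    public void SetTopView(bool topView)
    {
        transform.rotation = topView
            ? Quaternion.Euler(90f, 0f, 0f)
            : Quaternion.identity;
    }
}
```

In this arrangement, the control panel's UI buttons would call these public methods, while the animator controller handles which sign clip is playing.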

User control functionalities
User Interface

The user interface, including a control panel and onboarding screens, was designed in Figma to facilitate interaction and guide users through the self-learning process independently, eliminating the need for external assistance.

Onboarding and control panel UI

Reflection

My journey with the Mixed Reality project has been both challenging and exciting, revealing immense potential to address longstanding issues. However, technical constraints, like the bulky form factor, hinder its portability and reliability. Despite limitations, the project ignites enthusiasm for future advancements that could overcome these challenges and unleash even greater potential.