Mask Group (5).png

Break down the communication barrier

Overview

In the summer of 2019, I attended a wedding with my best friend. He had grown up in a hard-of-hearing household, so he learned ASL out of necessity at a young age. At the wedding, the vast majority of the guests were hard of hearing or Deaf and spoke with each other effortlessly throughout the night. I was now the outsider: the one unable to communicate with the people around me or to decipher how they were communicating. It was an incredibly eye-opening experience, and I decided I wanted to learn American Sign Language.

I understood, even for just an evening, how it feels to lose the ability to communicate without thinking about it. It became my goal and passion to break down that communication barrier (and help others do the same) with the estimated 3.57 million Canadians who are deaf or hard of hearing.

qs1.png

The Problem

Learning a new language always seems like a daunting, overwhelming task. ASL is unique in that it requires visual guides and examples, and it is rarely offered on major language-learning apps. In an ASL conversation, if someone forgets the sign for a word, the conversation has to stop completely while they communicate, "I forgot the sign for ____." The conversation gets derailed.

Group 1.png

Goals

  • Be quick to use mid-conversation so the conversational flow isn't broken

  • Provide an example of the sign quickly and efficiently

Group 4.png
Group 5 (1).png

Research

To fully understand how language comprehension works in relation to ASL, I read widely on how the language is typically learned. The key insights: visual guides, real-life human examples, putting new vocabulary into context, and frequent use of the language. I kept all of these in mind throughout the design process.

To better understand the flow and accessibility of the application, I gathered three people I know who are currently learning ASL and two people who are Deaf and fluent in ASL. While the main target users are hearing people learning ASL, I expected that insight from Deaf users would surface ideas and solutions I had not considered.

All three hearing participants found the flow easy to understand and use; each was able to navigate from the main screen to the dictation screen and see the example for their word. One of the Deaf users gave me the idea to include a type option for the word, since voice dictation is not a reliable input method for many Deaf users. I also found that, given the Apple Watch's small screen real estate, some of my CTAs were too small and needed refinement. Finally, I learned that the original 'Active Listening' state was not clear enough.

Group 6.png

How It Works

REPLACE FINGERSPELLING

When you're mid-conversation as someone learning American Sign Language and you forget the sign for what you want to say, the current workaround is to sign, "I forgot the sign for ______" and fingerspell the word. This is cumbersome, and it abruptly halts the conversation. With QuickSign, if a sign is forgotten, you simply choose between 'Spell' and 'Dictate'.

step1.png
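
For a sense of how this input step might be wired up, here is a minimal SwiftUI sketch for watchOS. The view name and the `lookUpSign` handler are hypothetical; on watchOS, tapping a `TextField` brings up the system input sheet, which covers both the 'Spell' (scribble) and 'Dictate' paths.

```swift
import SwiftUI

// Minimal sketch of the QuickSign input screen. On watchOS, tapping a
// TextField opens the system input sheet, which covers both the 'Spell'
// (scribble) and 'Dictate' paths described above.
struct QuickSignInputView: View {
    @State private var word = ""

    var body: some View {
        VStack(spacing: 8) {
            Text("Forgot a sign?")
                .font(.headline)

            TextField("Spell or dictate…", text: $word)

            Button("Show me the sign") {
                lookUpSign(for: word)          // hypothetical handler
            }
            .disabled(word.isEmpty)
        }
    }

    // Placeholder: navigate to the example screen for the entered word.
    private func lookUpSign(for word: String) {
        // e.g. push QuickSignExampleView(word: word, frameCount: 10)
    }
}
```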

HUMAN EXAMPLE

Once the word is entered, the user is shown a real-life GIF of how to perform the sign. Providing a human example reduces the chance of misunderstanding and keeps the in-person conversation moving.

step2.png
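
One way to show that human example on the watch is to loop through pre-rendered frames of the recording. This is only a sketch: the frame asset names (e.g. "hello_0", "hello_1") are hypothetical, and a short looping video clip would serve just as well.

```swift
import SwiftUI

// Sketch of the example screen: loops through pre-rendered frames of a
// real person performing the sign. Asset names like "hello_0" are
// hypothetical; a short looping video clip would work just as well.
struct QuickSignExampleView: View {
    let word: String
    let frameCount: Int

    @State private var frameIndex = 0
    private let timer = Timer.publish(every: 0.1, on: .main, in: .common)
        .autoconnect()

    var body: some View {
        VStack {
            Image("\(word)_\(frameIndex)")     // e.g. "hello_3"
                .resizable()
                .scaledToFit()
            Text(word)
                .font(.caption)
        }
        .onReceive(timer) { _ in
            frameIndex = (frameIndex + 1) % frameCount   // loop forever
        }
    }
}
```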

CREATE YOUR OWN DICTIONARY

Users can save signs, building a personal dictionary over time that they can reference later.

step3.1.png
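
A rough sketch of how that personal dictionary could persist between launches, assuming a simple `@AppStorage`-backed list (a real build might use SwiftData or Core Data instead); `QuickSignExampleView` refers to the example-screen sketch above, and the 'Save' button it would be called from is hypothetical.

```swift
import SwiftUI

// Sketch of the personal dictionary. Saved words persist between launches
// via @AppStorage, stored as a comma-separated string to keep the example
// self-contained; a real build might use SwiftData or Core Data instead.
struct SavedSignsView: View {
    @AppStorage("savedSigns") private var savedSignsRaw = ""

    private var savedSigns: [String] {
        savedSignsRaw.split(separator: ",").map { String($0) }
    }

    var body: some View {
        List(savedSigns, id: \.self) { word in
            NavigationLink(word) {
                // Reuses the example screen from the sketch above.
                QuickSignExampleView(word: word, frameCount: 10)
            }
        }
        .navigationTitle("My Signs")
    }

    // Called from the example screen's hypothetical 'Save' button.
    func save(_ word: String) {
        guard !savedSigns.contains(word) else { return }
        savedSignsRaw = savedSigns.isEmpty ? word : savedSignsRaw + "," + word
    }
}
```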