AT FIRST SIGHT

CLIENT
General Assembly
PROJECT TYPE
Case Study + Passion Project
PROJECT YEAR
2020 - 2021
At First Sight is a mobile app prototype designed to assist individuals with visual impairments by integrating voice commands and hands-free functionality. Using a device’s camera, it provides detailed audio descriptions of the environment, turning the camera into an extension of the user’s senses. Inspired by Microsoft’s Seeing AI, this project enhances accessibility with a fully voice-driven experience. Extensive research, including user interviews and competitive analysis, revealed key challenges in existing tools, such as accuracy issues and privacy concerns. By addressing these pain points, At First Sight aims to provide a seamless, intuitive, and empowering solution for visually impaired users. (Passion Project, Research-Based)

A week ago, my instructor introduced the class to our first project. Two weeks in, with the opportunity to choose our own topics, I felt compelled to explore the road of “Enhancing Voice Data.” The only problem was that I had no idea what “voice data” actually was. After doing user interviews and in-depth research into this new world, I created a rapid lo-fi prototype for an app I called At First Sight. The app lets your camera act as your eyes, describing your environment to you in full detail, while also letting you command the app with your voice for a more hands-free experience, which in turn completes the mission of enhancing voice data. This is At First Sight.

THE CHALLENGE

How can I use voice data to help those with visual impairments or blindness?

If you have ever used Siri, Google Voice, or Twitter’s new voice tweets, then your voice has probably been collected as data before. These tools not only help users feel more connected through communication, but also let them go hands-free.

Roughly 285 million people of all ages are visually impaired, an estimated 39 million of whom are blind. People aged 50 and older make up 82% of all blind people. That leads me to ask what tools are readily available to those in need.

THE GOAL

Create an app that enhances voice data by putting the user in command with their voice. The app is designed to help those with visual impairments or blindness make their daily lives easier.

THE TEAM

I led the design for this project and collaborated with a variety of users who had previous experience with voice data.

the process

Reviewing current voice data trends raised more questions than answers. During my initial search I wanted to understand what voice data was, but when you look it up you get a list of companies and jargon that doesn’t clearly state what it is or how it affects us. That led to the first question in my user interviews: “What can you tell me about voice data?”

As we researched voice data, I had an epiphany. If you look back at my brainstorming map above, you’ll see that visual impairments crossed our minds. Thinking about how those in need could benefit from a tool that was only scratching the surface enticed us. Thus At First Sight was born, and we were on the path to discovery.

The first step in our research was competitive analysis: we wanted to know exactly what tools were currently available and what strategies they used for our friends in need. One iOS application stood out to us: Seeing AI. This artificial-intelligence-based Microsoft platform gave us the perfect insight and inspiration. Being able to use your device’s camera to identify things like people and objects was golden. The only question was how to enhance voice data from there.

Talking to the people.

Starting with limited information, we took to the streets and began our in-depth research with interviews. We asked participants what role voice data played in their lives and how much of an impact it had on things like communication and completing daily tasks.

Out of the eight interviews, two were with people who either had visual impairments or blindness themselves or were close to someone who did, and through them we discovered ways voice data can influence their lives in a positive way.

Discovery

This is what led us to our idea: an app that helps users through a hands-free system. By implementing the Seeing AI strategy from Microsoft and letting users control the app directly with their voice or touch, we can help those with visual impairments or blindness. This is how we enhance voice data.
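Purely as an illustration of that hands-free loop, and not part of the original class project, a minimal Swift sketch might route a recognized voice command to an action and read the result back aloud with Apple’s AVSpeechSynthesizer. In practice the transcript would come from a live speech recognizer such as SFSpeechRecognizer, and describeCurrentFrame() is a hypothetical stand-in for a Seeing-AI-style scene-description model fed by the camera.

```swift
import AVFoundation

// Speech output for reading descriptions and prompts aloud.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// Hypothetical stand-in: a real build would run the current camera frame
// through an image-captioning / object-detection model (as Seeing AI does).
func describeCurrentFrame() -> String {
    return "A person is standing about two meters ahead, near an open doorway."
}

// Route a recognized voice command to an action. In the actual app the
// transcript would arrive continuously from a speech recognizer.
func handleVoiceCommand(_ transcript: String) {
    let command = transcript.lowercased()
    if command.contains("describe") || command.contains("what do you see") {
        speak(describeCurrentFrame())
    } else {
        speak("Sorry, I didn't catch that. Try saying: describe my surroundings.")
    }
}

// Example: this string stands in for live speech-recognition output.
handleVoiceCommand("Hey, describe my surroundings")
```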

Some important things we discovered.

The tools users relied on most, like voice-to-text or translation apps, had accuracy problems. Users found themselves having to go back and correct errors in their texts, which takes away from being hands-free.

The current tools and resources built for those in need were few and far between. There wasn’t a one-stop shop for all of a user’s needs.

Large amounts of voice data are being stored, which raises privacy concerns. How does this affect users’ willingness to communicate, and how could we ensure users feel safe?

prototyping +++

I started off with early sketches of what I thought would be the app’s main feature, and from there I collaborated with users to find out what they preferred and what would actually be useful.

From there I created a simple user flow detailing the first screen and the main function of the app. I discovered that my first flow was pretty spot on with what users expected the experience to be.

WITH THIS NEW INSIGHT I DRAFTED MY FIRST PAPER PROTOTYPE, ANNOTATED WITH NOTES ON WHAT WE WANTED.

The goodies: The Outcome

Those with visual impairments or blindness have trouble seeing certain content on their devices; letting them use their voice gives them the power they need to complete daily tasks with more ease.

From all the useful insights we gathered during our research, we created a rapid, lo-fi, clickable prototype.

There are only a few ways our clickable prototype differs from the paper prototype, and most have to do with the layout of buttons and pages.

I really enjoyed working on this important project, and I’m glad I had a chance to present the work at our class presentations. Hopefully apps like this will lead us to a better experience with hands-free tools for everyone, especially those who are visually impaired or blind.