A week ago, my instructor introduced the class to our first project. Two weeks in, given the opportunity to choose our own topics, I felt compelled to explore the road of “Enhancing Voice Data.” The only problem was that I had no idea what “voice data” actually was. After conducting user interviews and in-depth research into the new world of voice data, I created a rapid lo-fi prototype for an app I called At First Sight. The app is designed to let your camera act as your eyes, seeing things for you. It not only utilizes voice data by describing your environment to you in full detail, but also lets you command the app with your voice, giving you a more hands-free experience and, in turn, completing the mission of enhancing voice data. This is At First Sight.
If you have ever used Siri, Google Voice, or Twitter’s new voice tweets, then your voice has probably been collected as data before. These tools not only help users feel more connected through communication but can also serve as a way to stay hands-free.
Roughly 285 million people of all ages are visually impaired, an estimated 39 million of whom are blind. People aged 50 and older make up 82% of those who are blind. That led me to ask: what tools are readily available to those in need?
Reviewing current voice data trends posed more questions than answers. During my initial search I wanted to understand what voice data was, but looking it up returns a list of companies and jargon that doesn’t clearly state what it is or how it affects us. That led to the first question in my user interviews: “What can you tell me about voice data?”
While researching voice data, I had an epiphany. If you look back at my brainstorming map above, you’ll see that visual impairments crossed our minds. Thinking about how those in need could benefit from a tool that was only just scratching the surface enticed us. Thus At First Sight was born, and we were on the path to discovery.
The first step in our research was competitive analysis: we wanted to know exactly what tools were already out there and what strategies they implemented for our friends in need. One iOS application stood out to us: Seeing AI. This artificial-intelligence-based Microsoft platform gave us the perfect insight and inspiration. Being able to use your device’s camera to identify things like people and objects was golden. The only question was: how do we enhance voice data from there?
Starting with limited information, we took to the streets and began our in-depth research with interviews. We asked participants questions like what role voice data played in their lives and how much of an impact it made on things like communication and completing daily tasks.
Of the 8 interviews, 2 were with people who either had visual impairments or blindness themselves or were close to someone who did, and through them we discovered ways voice data can influence their lives in a positive way.
This is what led us to the idea of an app that helps users by creating a hands-free system. By implementing Microsoft’s Seeing AI strategy and allowing users to control the app directly with their voice or touch, we can help those with visual impairments or blindness. This is how we enhance voice data.
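To make the hands-free idea concrete, here is a minimal sketch of how spoken commands might be routed to app actions once a speech-to-text layer has turned audio into a string. Everything here is hypothetical and invented for illustration: the command phrases, `describe_scene`, `read_text`, and `handle_command` are not part of any real app or API, just a toy dispatcher showing the shape of the idea.

```python
# Toy voice-command dispatcher (all names hypothetical).
# Assumes a separate speech-to-text service has already
# transcribed the user's utterance into plain text.

def describe_scene() -> str:
    # Placeholder: the real app would run the camera feed through
    # an image-description model and speak the result aloud.
    return "describing scene"

def read_text() -> str:
    # Placeholder for OCR-and-read-aloud functionality.
    return "reading text"

COMMANDS = {
    "what's in front of me": describe_scene,
    "read this": read_text,
}

def handle_command(transcript: str) -> str:
    """Match a transcribed utterance to an action, ignoring case."""
    action = COMMANDS.get(transcript.strip().lower())
    if action is None:
        return "sorry, I didn't catch that"
    return action()
```

The dictionary lookup keeps the mapping between phrases and actions explicit, so adding a new voice command is just one more entry, which is one plausible way a hands-free interface like this could stay simple to extend.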