Car Voice Assistant
Seeing the potential of virtual voice assistants within vehicles to create a safer driving experience.
Voice assistants have proven to be usable and valuable enough to support a large initial adoption, and there is growing potential for virtual voice assistants within vehicles to create a safer driving experience. In this project, my team and I explored how we might design a voice assistant for a vehicle while putting drivers' safety first.
Project Details
Team » Jessica Yang, Nathalia Kasman
Duration » 8 weeks, 2019
Software » Sketch, Illustrator, Keynote, Premiere Pro
Key Features
Smart Updates
When the system starts up, your voice assistant informs you about any changes or updates. With Smart Updates, drivers will have a better understanding of weather, maintenance, and traffic before their ride begins.
Safety Mode
When drivers switch on Safety Mode, they are reminded of traffic rules and regulations. A safety report is sent to the driver when their route ends. The report can include information about their average speed, brake usage, and differences in their route.
Accident Care
When situations get dangerous, Accident Care helps guide drivers through them. The voice assistant walks them through accident protocol out loud and on-screen, reminding drivers of the small steps that are often forgotten during accidents.
Interviews
Before we decided on the 3 key features of Ford’s voice assistant, we interviewed 8 participants — 4 experienced drivers and 4 new drivers — to get a better sense of what drivers enjoy and dislike about the driving experience. We tried to understand where the frustrations are and where a voice assistant may be beneficial.
After gathering all the interviews, we synthesized the data into categories based on different aspects of driving but also the car itself. We were primarily looking at why individuals resorted to using their phones/external stimuli, what they want their car to do, and their emotional state during drives.
Key Insights
We gathered 5 main insights covering 3 broad topics: control, entertainment, and phone usage. Overall, people enjoy having control, resort to external stimuli, and use their phone when it is convenient inside a car.
Control
Having control over your transportation and its environment makes it more enjoyable. This includes having control over who rides in the car.
Drivers would like their cars to understand the surrounding environment and adjust or alert them accordingly.
Entertainment
Driving can become mundane due to necessary but repetitive actions, so drivers turn to external stimuli for entertainment.
Phone Usage
Cars do not come with places to store/hold phones, so drivers place phones where it is convenient for them.
Phones have become part of the driving experience due to people’s growing reliance on their functions.
Problem Statement
After synthesizing our research and organizing key insights, we landed on this problem statement: How might we improve driving safety by integrating voice-controlled interactions into the driving experience?
Journey Map
New Driver
This is the journey of a new driver’s experience. During their drive, they are able to ask their car about driving rules and route updates. The voice assistant is there to alleviate pain points where a new driver may typically get confused or anxious.
Commuter
This is the journey of a commuter’s experience. Rather than hearing about driving rules, they receive updates about road conditions or traffic on their usual commute.
We looked at both new drivers and commuters because they come from very different levels of experience. A new driver’s pain points vastly differ from those of an experienced driver who takes the same route every day.
Iterations
Usability Testing
Round 1 - low fidelity
For our first round of usability testing, we walked drivers through situations using low-fidelity objects to mimic the driving environment. We emulated a car environment with a TV screen while participants held a mock steering wheel, and used a paper prototype to emulate the different screens. Our team split the roles of moderator, voice assistant, and screens.
Round 2 - medium fidelity
During our second round, we asked users to drive a short (and safe) route while interacting with a prototype of the voice assistant. We observed how and when drivers interacted with the voice assistant and how distracting the screens were. Additionally, we asked what they thought about the content.
Usability Insights
1️⃣Users wanted different routes to be defined more clearly.
We initially labeled routes “recommended”, “fastest”, and “safest”, which turned out to be confusing — drivers didn’t know what those labels entailed. For our next iteration we added subtitles to explain them: the recommended route follows the driver’s usual commute or the most popular route, the fastest route covers the fewest miles, and the safest route avoids busy, confusing roads.
2️⃣Users wanted the voice assistant to solve problems for them, instead of just telling them about the situation.
During Accident Care, drivers felt confused by the information given. The voice assistant was simply stating the situation, but it would be more beneficial to extend that into a solution. We had to think about how the car and the voice assistant could ease the driver through new, confusing situations.
3️⃣Some users found the Safety Report to be condescending. They wanted facts instead of suggestions, since what counts as proper driving is subjective.
Our initial safety report gave suggestions on how to improve the next drive. Drivers found the suggestions condescending, as if the voice assistant were judging their driving skills. We revised the report to be informative instead, giving drivers data about their drive and including comments on its positive aspects.
4️⃣Users wanted minimal/no touch interaction with the interface if possible, but still wanted the ability to turn manually on/off the voice assistant.
The first prototype we created focused too much on the screens and turned out to be more distracting. We solved this by hiding the menu bar while the car is in motion; it slides back up automatically when the car stops. Since this also keeps passengers from accessing the GPS, music, or settings, we implemented a pop-up warning that passengers can bypass even while the car is in motion.
Next Steps
In future iterations of this project, we want to explore topics and insights that we weren’t able to ideate upon due to time constraints. The HMW questions below are ones we would have liked to explore with more time.
How might we make driving a less monotonous activity?
How might we account for drivers’ emotional states?
How might we integrate the phone into the assistant?
Additionally, we would like the opportunity to explore conversational design further and fully develop a language for this voice assistant.