Designing the Voice Experience
Natural language interfaces just might drive the next major paradigm shift in technology, but how will they affect UX design?
Perhaps you’ve watched a similar scene unfold. A toddler grabs an iPhone off a table and begins frantically flipping through it on a quest to find her favorite app. Maybe she can’t yet speak in full sentences, yet she has an innate understanding of how to navigate touchscreen technology. If you handed her an old laptop, she might immediately move her hands to the screen.
“You see children now that are raised on tablets and touchscreens, they expect to also be able to touch the TV and interact with it,” says Christina Apatow, VP of Client Solutions at Speaktoit, Inc., the company behind the talking personal assistant app Assistant and API.AI.
Apatow’s team is working on UX technologies that make voice communications between humans and computers more organic. Just as we’re perfecting tactile technologies, she says voice is already becoming a larger part of experience design (XD). It won’t be long, she says, before kids instinctively pick up that iPhone and ask it to perform a certain task.
“I think in the future, children are going to be trying to talk to all their devices and tell them what to do even if they’re not voice enabled. It’s going to be similar where people are raised on this expectation that they can just interact with [a device] in the way that’s easiest for them,” Apatow said.
The ABCs of Natural Language Technologies
More companies are beginning to incorporate natural language technologies into their products. Since Speaktoit, Inc. launched its artificial intelligence API in 2014, more than 25,000 developers have signed up for the platform. Voice experiences are being built into smart home technologies, warehouse management systems, automobile systems, wearables, consumer electronics, assistive technologies and more.
“We started off using a mouse and keyboard, then the next major paradigm shift was to touchscreens,” said Apatow. “I think [natural language technology] is going to be the next paradigm shift that really disrupts all technology and how users expect to interact with things.”
Look Who’s Talking: Conversational UX Design
Ensuring a smooth conversational UX happens largely outside of a user’s tactile engagement. For a user to have a successful experience, the technology needs to understand user behavior and respond in authentic ways. This differs from conventional UX design in that it relies on function rather than visual cues.
“When you’re doing something the traditional way, say you have an app that has different navigation and you need to click a couple buttons to get somewhere, there’s inherently this hierarchy of what you can access through a certain number of clicks,” Apatow said. “With a voice experience, that’s all completely mitigated. You’re just activating it and then asking for whatever you want, and so everything should be accessible to you.”
To be effective, experience designers need to anticipate the types of things a user is going to want to do or ask, and then the system needs to be programmed and “trained” to respond accordingly.
“These experiences are predictive as well as extremely natural, and you’ll be able to interact with them as you would any other human,” Apatow said.
How Conversational UX Learns to Understand Users
Most smartphone users have communicated with artificial intelligence already through Siri and similar applications, but what’s different about natural language technologies is their ability to understand context through training.
This technology works by tying user requests to specific actions. For example, if you were building a music player, you could incorporate queries like “play Madonna” or “start playing the Beatles” and the application would understand that these linguistic variations refer to the same request.
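To make this idea concrete, here is a minimal sketch of how several phrasings can be tied to one action. The intent name, sample patterns, and matching logic are illustrative assumptions for this article, not Speaktoit’s actual API, which trains on example utterances rather than hand-written patterns.

```python
import re

# Each intent maps a set of sample utterance patterns to a single action.
# (Hypothetical patterns; a real system learns these from training examples.)
INTENT_PATTERNS = {
    "play_music": [
        r"play (?P<artist>.+)",
        r"start playing (?P<artist>.+)",
        r"put on (?P<artist>.+)",
    ],
}

def match_intent(utterance):
    """Return (intent, parameters) for the first pattern the utterance matches."""
    text = utterance.lower().strip()
    for intent, patterns in INTENT_PATTERNS.items():
        for pattern in patterns:
            m = re.fullmatch(pattern, text)
            if m:
                return intent, m.groupdict()
    return None, {}

print(match_intent("Play Madonna"))              # ('play_music', {'artist': 'madonna'})
print(match_intent("start playing the Beatles")) # ('play_music', {'artist': 'the beatles'})
```

Both phrasings resolve to the same `play_music` action with the artist extracted as a parameter, which is the variance-tolerant behavior described above.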
“We try to mimic exactly how humans interact so that it remembers for a certain period of time what you were talking about, and then it has a natural decay over a certain number of queries or a certain period of time,” Apatow said. “It will eventually forget so that only what you’re talking about most recently will have active context. That’s sort of how the human memory also works.”
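The decay behavior Apatow describes can be sketched as a small context store that forgets entries after a fixed number of queries or a time window. The class name, limits, and expiry rules here are illustrative assumptions, not Speaktoit’s implementation.

```python
import time
from collections import deque

MAX_TURNS = 5          # assumed: forget after this many queries
MAX_AGE_SECONDS = 120  # assumed: ...or after this much elapsed time

class ConversationContext:
    """Remembers what the user was talking about, with natural decay."""

    def __init__(self):
        self._entries = deque()

    def remember(self, key, value):
        # Store a piece of context along with its birth time and turn budget.
        self._entries.append(
            {"time": time.time(), "turns": MAX_TURNS, "key": key, "value": value}
        )

    def on_query(self):
        """Age every stored entry by one turn and drop anything expired."""
        now = time.time()
        for entry in self._entries:
            entry["turns"] -= 1
        self._entries = deque(
            e for e in self._entries
            if e["turns"] > 0 and now - e["time"] < MAX_AGE_SECONDS
        )

    def active(self):
        """Return only the context that is still 'fresh' in memory."""
        return {e["key"]: e["value"] for e in self._entries}

ctx = ConversationContext()
ctx.remember("artist", "Madonna")
for _ in range(5):
    ctx.on_query()
print(ctx.active())  # {} -- the context has decayed after five queries
```

After a few unrelated queries the stored topic simply disappears, mirroring the way recent topics stay in active context while older ones fade, much like human short-term memory.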
Voice Experiences Are Growing Up Fast
Apatow says that while this may sound futuristic, people in charge of designing experiences need to understand that this is part of the immediate future. “Even since last year, I’ve seen companies’ priorities shift from not having voice experience in the roadmap to now it’s their top priority,” she said.
“Interacting with voice and gesture, that should be something that can be ready within the next year or so. I think we are seeing that technology has come to a point where all this stuff is feasible and now it’s just a matter of integrating everything, incorporating it and inspiring companies to have it in their products.”
She recognizes that not every company will be quick to embrace natural language voice experiences, but in her opinion it’s the way of the future.
Falling In Love With the Idea of Artificially Intelligent User Experiences
In Spike Jonze’s 2013 film Her, a writer develops a loving relationship with his artificially intelligent operating system. While Apatow doesn’t claim this exact thing is happening with her company’s assistant app, she did mention that many users see these conversational experiences as more than just a utility.
“A lot of people interact with our assistant like it’s their friend,” she said. “They ask it questions. They’ll ask it to marry them. They really try to get to know it, and it seems like they care about it, almost as if it is their friend or another human. They’ll name it, they can customize it, and they can tell it jokes. They can interact with it largely like it’s another human.”
As technological experiences become more aligned with human experiences, one can’t help but wonder how this may affect the human condition as well. Perhaps some day soon children will hold their imaginary friends in the palm of their hands.