We’re all familiar with the ways in which voice assistants like Siri and Alexa have become a staple of household life: harried cooks everywhere can relate to the question, “Alexa, how many tablespoons are in a cup?”

But recent developments point to voice assistants doing more than answering straightforward questions. Technologies like Google Assistant can actually offer life advice. And according to researchers from the Universitat Pompeu Fabra in Barcelona, it doesn’t hurt to listen: they’ve done a study that indicates embracing life coaching from a voice assistant can improve well-being. What does that mean for AI? For human reliance on AI?

The Study

Human life coaches help people pinpoint their priorities and goals in life, from professional to personal, then work to identify what steps individuals can take to achieve those goals. In the study out of Universitat Pompeu Fabra, researchers wanted to determine whether voice assistants could perform a similar service. Thirty people participated in the study, and each participant spent three sessions with NORKIA, a virtual life coach the research team had created. The voice assistant, which offered both male and female voices for English speakers and a female voice for Spanish speakers, could be controlled through a simple interface on an iPhone app.

The sessions with NORKIA were strikingly similar to working with a human coach. Participants in the study first worked with the virtual life coach to determine what area of their lives they wanted to enhance, then identified “core values” related to that area and a plan to achieve those goals. A goal-forming concept called SMART (specific, measurable, achievable, relevant, and timely) drove the plan. At one point, NORKIA even asked participants to visualize a world in which they achieved their goals—a common technique employed by human coaches.

Results were striking: study leader Dr. Laura Aymerich-Franch cited a “significant increase in PGI (personal growth) and SLS (life satisfaction) values,” as well as a notable drop in negative emotions (PANAS) compared to participants’ pre-session state of mind. She added, “This suggests that the coaching program contributed positively to psychological well-being, life satisfaction and personal growth.”

Why This Matters

Just under half of American homes contain at least one smart assistant, making tools like Alexa a realistic resource for many people. The study results indicate that these assistants may take on an increasing number of roles in our lives. But it’s important to note that this doesn’t mean technology will simply replace human input. Far from it: Dr. Aymerich-Franch is careful to state that seeking advice from a voice assistant is not the same as working with a human coach or therapist; rather, she is excited about the tool serving as a useful complement. As she notes, “[Conversational agents] can serve to help eliminate attitudinal barriers that still exist when seeking therapeutic support to improve psychological well-being.” In this case, then, Alexa might make us more open to the services human beings can offer.

The Issue of Tone

At Moonshot, we believe the study underscores the importance of putting people at the center of AI applications such as voice products. One area in which humans can provide invaluable input as AI services are developed? That of tone. We have written about how product designers must think beyond functional trust and design with emotional trust in mind. Tone plays an important part in emotional trust.

It’s not enough for a voice assistant to perform flawlessly; people also have to feel at ease when working with AI. As noted, voice assistants are becoming more prevalent. But for people to actually use them (as opposed to buying them and ignoring them), the smart assistant has to smooth the way by striking a tone appropriate for the audience at hand. For the most part, humans understand how to modulate tone: someone coaching an adult to pursue an exercise regimen might effectively use a tough-love tone. But a child could require gentler encouragement—akin to what a grandparent can offer. Humans can make those adjustments, and for AI to be most useful, that flexibility and nuance in tone must be replicated somehow. Tone isn’t just a matter of how to reach different generations, either. Cultures around the world have different hot buttons and parameters for good manners—and many of these parameters are driven by tone.

One of the next frontiers of designing for trust is to adapt AI-based voice products for different people—from different age groups, from different cultures—a process otherwise known as AI localization. By training AI with local data, AI localization ensures that a product like Alexa will use the right tone in a specific market.

To be clear, we’re not just talking about translation. Language translation is part of AI localization, but equally important are the localized experiences that resonate with a particular demographic. Anyone who’s traveled abroad understands that engaging with people in a culture different from your own means being sensitive to the different ways people communicate—verbally, certainly, but also through gesture and tone. One study comparing North American English speakers and Chinese participants found that the Chinese individuals were more strongly influenced by vocal cues unrelated to the task. Even volume can take on different meanings: British English speakers use volume to connote anger, for example, while Indian English speakers ratchet up volume simply to command attention, as noted in this article.

All these factors must be taken into consideration as AI designers work to establish trust between humans and technology in areas beyond getting that tablespoon question answered. It cannot be overstated: AI needs humans to ensure the AI experience feels good.

How To Get Started

Tools exist to help brands make that human/technology connection a positive one. Consider the Mindful AI Canvas, a collaboration tool that product designers use to examine what people want from AI. One dimension of the AI Canvas — Emotions, Intentions, and Value — focuses on three tentpole issues:

  • What feelings do we hope to evoke with AI?
  • What is the ideal relationship that might exist between a human audience and an AI experience?
  • How does the AI experience align with and promote a brand’s objectives?

Voice assistants — with the help of human counterparts and tools like the Mindful AI Canvas — are already making the leap to more empathic, intuitive services. How might AI fit into your brand strategy going forward? Contact us if you’d like an outside perspective — we regularly help brands figure out how to succeed with ambient technologies such as voice. It’s what we love to do.