Amazon Alexa Teachable AI

  • Thread starter: Bapun
  • Replies: 0
  • Views: 585

When we misunderstand someone, we’re able to quickly course-correct by picking up on nuances in how the person responds or by asking clarifying questions.

Alexa uses similar self-learning to automatically correct her mistakes by learning from customer feedback signals. These signals include vocal frustration, such as “Alexa, that’s wrong,” or a direct interruption, such as “Alexa, stop.” Once Alexa determines that a particular action was unsatisfactory, she automatically corrects herself.
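The article doesn't describe how these feedback signals are detected, but the idea can be illustrated with a minimal sketch. Everything below (the phrase list and the `is_negative_feedback` helper) is hypothetical, not Amazon's actual implementation, which presumably uses learned models rather than a fixed phrase list.

```python
# Toy illustration: flag explicit correction phrases as negative feedback
# signals that a self-learning system could use to revise a past action.
NEGATIVE_FEEDBACK = {"that's wrong", "that is wrong", "stop", "no, stop"}

def is_negative_feedback(utterance: str) -> bool:
    """Return True if the utterance looks like an explicit correction signal."""
    text = utterance.lower().removeprefix("alexa,").strip()
    return text in NEGATIVE_FEEDBACK
```

A real system would combine many weaker signals (prosody, rephrasing, barge-in timing) instead of exact string matches, but the principle is the same: certain user reactions mark the previous action as unsatisfactory.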

Today, we’re taking this self-learning a step further by enabling customers to directly teach Alexa. This new capability helps Alexa get smarter by asking questions to fill gaps in her understanding—just like we do.

Teachable AI is an exciting step forward for all AI services.
As Alexa continues to become more natural to interact with, we wanted to take it a step further. When we speak to another person, we get both verbal and non-verbal cues, and we adapt our responses accordingly. Later this year, Alexa will adapt her responses based on the context of the conversation by adjusting her tone, stressing certain words, and adding pauses and even breaths.
Two years ago, we took a step toward making Alexa more conversational with Follow-Up Mode, which lets customers make back-to-back requests to Alexa without needing to repeat the wake word. Today, we’re introducing “natural turn taking,” the ability for customers to speak to Alexa without using a wake word during the course of a conversation—even when multiple people are talking.

To make this experience a reality, we needed to solve several challenges. For example, are people speaking to each other? Or, should Alexa join the conversation? And, if she does, who should she respond to? To address these challenges, we had to go beyond voice-only understanding to multi-sensory artificial intelligence. Specifically, Alexa uses a combination of acoustic, linguistic, and visual cues with deep-learning-based models running locally on the Echo device. Once Alexa determines that a particular interaction turn is directed at her, she uses the context of the conversation to decide how to respond, or which specific action to take.
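The decision described above—fusing acoustic, linguistic, and visual cues to decide whether a turn is directed at the device—can be sketched as a simple weighted fusion. This is purely illustrative: the cue names, weights, and threshold below are assumptions, and the real system uses on-device deep-learning models rather than hand-set weights.

```python
from dataclasses import dataclass

@dataclass
class TurnCues:
    """Per-turn cue scores in [0, 1]; all names and semantics are illustrative."""
    acoustic: float    # e.g., speech energy directed toward the device
    linguistic: float  # e.g., how request-like the phrasing is
    visual: float      # e.g., whether the speaker is facing the device

def is_device_directed(cues: TurnCues, threshold: float = 0.5) -> bool:
    """Fuse cue scores with fixed illustrative weights and compare to a threshold."""
    score = 0.4 * cues.acoustic + 0.35 * cues.linguistic + 0.25 * cues.visual
    return score >= threshold
```

For example, a turn with strong scores on all three cues would be treated as directed at the device, while background chatter with weak scores would be ignored. In practice the fusion is learned end to end, but the multi-sensory principle is the same.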

Natural turn taking is a major step forward in conversational AI, enabling customers to interact with Alexa at their own pace. We look forward to bringing this transformative capability to our customers next year.

Amazon Devices & Services news—September 2020
 