It's fair to say that when Amazon introduced the first Echo speaker in the fall of 2014, most people weren't quite sure what to make of it. In the years since, Echo and the broader universe of Alexa-enabled devices have gone from curiosity to ubiquity. But while you can find Alexa in just about everything – including, yes, a microwave – the real advances Amazon's voice assistant made in 2018 came not from breadth but from depth.
Which isn't to say it hasn't made gains in scale. Amazon's voice assistant has doubled the number of countries where it's available, learning to speak French and Spanish along the way. More than 28,000 smart home devices work with Alexa now, six times as many as at the beginning of the year. And the roster of third-party products with Alexa built in keeps growing, too.
It's how Alexa has evolved under the hood, though, that has defined this year – and that will determine how it measures up to its full potential in the years to come. Alexa has gotten smarter, in ways so subtle you may not even have noticed.
Because many voice assistant improvements aim to reduce friction, they're almost invisible by design. Over the past year, Alexa has learned to carry context from one query to the next, and to field follow-up questions without requiring you to repeat the wake word. You can ask Alexa to do more than one thing in the same request, and summon a skill – Alexa's version of an app – without having to know its exact name.
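Carrying context between turns can be pictured as a small slot store that fills in whatever a follow-up question leaves implicit. The sketch below is purely illustrative – the class and slot names are invented, not Amazon's implementation.

```python
# Hypothetical sketch of context carryover in a multi-turn assistant.
# All names here are illustrative, not Amazon's actual design.

class DialogContext:
    def __init__(self):
        self.slots = {}  # remembered entities, e.g. {"city": "Seattle"}

    def update(self, slots):
        self.slots.update(slots)

    def resolve(self, query_slots):
        # Fill any slot the follow-up question leaves implicit
        # from what the previous turn established.
        merged = dict(self.slots)
        merged.update(query_slots)
        return merged

ctx = DialogContext()
ctx.update({"city": "Seattle"})              # "What's the weather in Seattle?"
resolved = ctx.resolve({"day": "tomorrow"})  # follow-up: "And tomorrow?"
print(resolved)  # {'city': 'Seattle', 'day': 'tomorrow'}
```

The point is that the follow-up never has to restate "Seattle" – the prior turn's slots persist until overwritten.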
These may sound like small tweaks, but cumulatively they represent major strides toward a more conversational voice assistant, one that solves problems rather than introducing new frustrations. You can talk to Alexa in a far more natural way than you could a year ago, with a reasonable expectation that it will understand what you're saying.
Those gains have come, unsurprisingly, through the continued introduction and refinement of machine learning techniques. So-called active learning, in which the system identifies areas where it needs help from a human expert, has significantly reduced Alexa's error rates. "It's fed into all parts of our pipeline, including speech recognition and natural language understanding," says Rohit Prasad, vice president and head scientist of Amazon Alexa. "It makes all of our machine learning models look better."
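The core idea of active learning is uncertainty sampling: route only the inputs the model is least confident about to human annotators. Here is a minimal, self-contained sketch of that selection step – the scoring function is a stand-in, not Alexa's actual recognizer.

```python
import random

# Toy sketch of active learning via uncertainty sampling: flag the
# utterances the system is least confident about, and send only those
# to human experts. The confidence function is a stand-in.

def confidence(utterance):
    # Pretend well-known phrases are recognized confidently
    # and unfamiliar ones are not.
    known = {"play music", "set a timer", "what's the weather"}
    return 0.95 if utterance in known else random.uniform(0.1, 0.6)

def select_for_annotation(utterances, threshold=0.7):
    # Only low-confidence utterances go to human annotators.
    return [u for u in utterances if confidence(u) < threshold]

queries = ["play music", "turn on the doohickey", "set a timer"]
print(select_for_annotation(queries))  # ['turn on the doohickey']
```

Human labels for those flagged utterances are then fed back into training, which is how the technique reduces error rates without annotating everything.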
More recently, Amazon brought so-called transfer learning to Alexa. Prasad gives the example of trying to build a recipe skill from scratch, as anyone can now do thanks to Amazon's recently introduced skill "blueprints." Developers can potentially draw on everything Alexa already knows about restaurants, food, or groceries to cut down on the grunt work they would otherwise face. "Essentially, with deep learning we can model a large number of domains and transfer that learning to a new domain or skill," says Prasad.
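In its simplest form, transfer learning means freezing a feature extractor trained on broad data and fitting only a small new head for the new domain. The sketch below illustrates that pattern with a fixed random projection standing in for learned embeddings; nothing here reflects Amazon's actual architecture.

```python
import numpy as np

# Sketch of transfer learning: reuse a "pretrained" feature extractor
# (here, a frozen random projection standing in for learned embeddings)
# and train only a tiny new classifier head for the new skill.

rng = np.random.default_rng(0)
PRETRAINED_W = rng.normal(size=(10, 4))  # frozen "learned" features

def features(x):
    return np.tanh(x @ PRETRAINED_W)     # not updated during fine-tuning

def train_head(X, y, lr=0.5, steps=200):
    # Logistic-regression head trained on top of frozen features.
    F = features(X)
    w = np.zeros(F.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(F @ w)))
        w -= lr * F.T @ (p - y) / len(y)
    return w

X = rng.normal(size=(40, 10))
y = (X[:, 0] > 0).astype(float)          # toy "new domain" labels
w = train_head(X, y)
acc = ((1 / (1 + np.exp(-(features(X) @ w))) > 0.5) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

Because only the four head weights are trained, the new domain needs far less labeled data than learning everything from scratch – which is the "grunt work" savings Prasad describes.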
The benefits of improved machine learning show up across every part of Alexa, but the simplest argument for its impact is that the system has seen a 25 percent reduction in its error rate over the last year. That's a significant number of headaches Echo owners no longer have to deal with.
And more advances are on the way. Just this month, Alexa launched self-learning, which lets the system make corrections automatically based on contextual cues. Prasad again offers an example: Say you ask Echo to "play XM Chill," and the request fails because Alexa doesn't catalog the station under that name. If you follow up by saying "play Sirius channel 53" and keep listening, Alexa will learn that XM Chill and Sirius Channel 53 are one and the same – all on its own. "That's a big deal for AI systems," says Prasad. "This is where it learns from implicit feedback."
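One way to picture learning from implicit feedback: when a failed request is immediately followed by a successful rephrasing that the user sticks with, record the failed phrase as an alias for the one that worked. This is a minimal sketch of that idea – class names and thresholds are invented for illustration.

```python
# Sketch of self-learning from implicit feedback: a failed request,
# followed by a successful rephrasing the user keeps listening to,
# becomes an alias for the working phrase. Purely illustrative.

class AliasLearner:
    def __init__(self, min_listen_seconds=30):
        self.aliases = {}
        self.min_listen = min_listen_seconds
        self._last_failure = None

    def on_failure(self, utterance):
        self._last_failure = utterance

    def on_success(self, utterance, listened_seconds):
        # Only trust the correction if the user actually kept listening.
        if self._last_failure and listened_seconds >= self.min_listen:
            self.aliases[self._last_failure] = utterance
        self._last_failure = None

    def resolve(self, utterance):
        return self.aliases.get(utterance, utterance)

learner = AliasLearner()
learner.on_failure("play XM Chill")                # request not understood
learner.on_success("play Sirius channel 53", 120)  # rephrased, kept listening
print(learner.resolve("play XM Chill"))  # play Sirius channel 53
```

No explicit label is ever provided; the user's behavior alone supplies the training signal.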
The next frontier, though, gets a bit trickier. Amazon wants Alexa to become smarter, of course, and better at anticipating your needs at any given moment. But that also means Alexa understanding not just what you say, but how you say it.
"When two people talk, they're actually pretty good at reading sentiment. But these systems are pretty much clueless about it," says Alex Rudnicky, a speech recognition expert at Carnegie Mellon University. "People are trying to develop capabilities that make them a little more sophisticated, more human in their ability to understand how a conversation is going."
Amazon already made headlines this fall over a patent describing technology that would let Alexa recognize users' emotional states and respond accordingly. Those headlines were not glowing. A device that is always listening to you already strikes many as a step too far; one that learns how you feel escalates the discomfort dramatically.
Prasad says Alexa's ultimate goal is sustained, conversational dialogue. As part of that, it may eventually respond differently to a given question based on how you asked it. And while it's important to have those conversations now, it's worth noting that a voice assistant that truly understands the nuances of your intonation remains, for the most part, a ways off.
"If you look at the big five emotions," Rudnicky says, "the only one people have managed to detect is anger."
As the number of Alexa devices has exploded, so have its skills; Amazon now counts 70,000 of them in its stable, from quizzes to meditation guides and more. That's seven times as many as just under two years ago.
Here's where Alexa's room for improvement starts to show. Amazon has gotten better at predicting which skills people might want, but discovery remains a real problem. Not only do Alexa owners miss out on potential uses for their devices beyond a fancy kitchen timer; developers have less incentive to invest time in a platform where they may be invisible.
The answer can't come solely from deep learning, either. Algorithms can surface the most relevant skill at any given moment, but voice apps have so much potential beyond the immediately functional. Think of skills like The Magic Door, an interactive fantasy game on Alexa that launched in 2016. If all you've used Alexa for is listening to NPR and checking the weather, it's hard to see how an algorithm would ever alert you to its existence. And even simpler suggestions aren't necessarily always welcome.
"It can be an engaging experience if we introduce customers to new skills and new capabilities, if it's highly relevant to what they're doing," says Toni Reid, vice president of Alexa experience and Echo devices. "But you have to be really careful with those use cases, because it can become overload. It's the right time, the right moment, the right amount of content."
Amazon also has to figure out how to fend off Google, whose Google Assistant has closed the voice assistant gap significantly despite a late start. Research firm Canalys estimates that Amazon shipped 6.3 million Echo smart speakers in the third quarter of this year, just ahead of Google's 5.9 million smart speakers.
The race isn't as close as those numbers make it seem; they don't include third-party devices, an arena where Alexa dominates, and a three-month snapshot discounts the enormous installed base Amazon has built over the last four years. Still, Google has advantages that Amazon can't ignore.
"They've had many years of experience with AI, whereas Alexa was built from scratch," says Canalys analyst Vincent Thielke. "Because Google's AI was already so advanced, it was very easy to catch up." Likewise, by virtue of Android, Android Auto, and Wear OS, Google has several venues through which to push Google Assistant. With the spectacular failure of the Fire Phone, which launched in 2014, Amazon's mobile prospects are limited. The company has fared better in cars, but still trails Google and Apple in native integrations, which has prompted hardware add-ons like Echo Auto.
Alexa, though, shows no signs of slowing down. There's now Alexa Guard, to look after your home while you're away. There's Alexa Answers, a sort of voice-assistant hybrid of Quora and Wikipedia. There are Alexa Donations and Alexa Captions and Alexa Hunches and Alexa Routines.
That's a lot. But if you want to know where Alexa is headed next, well, you know who to ask.