According to one of the people in charge of Alexa, meaningfully improving the assistant's services will require going beyond an app installed on a mobile phone: the assistant needs a robot that can perceive its immediate environment.
Amazon is constantly working to make its assistant Alexa more useful. While the current version already provides plenty of services to its millions of users, the e-retailer wants to go much further.
Alexa, a robot to understand the world around it
Rohit Prasad, a researcher on the team responsible for Alexa, spoke at the EmTech Digital AI conference organized by MIT Technology Review in San Francisco. He explained that "the only way to make a personal assistant truly smart is to give it eyes and let it explore the world."
Some Alexa-enabled devices already have a camera and can see the immediate environment in which they are located. But if we follow Rohit Prasad's argument to its conclusion, the ultimate goal would be for Amazon to design a robot controlled by Alexa that is able to move around on its own.
A voice assistant that absorbs the subtleties of human language
The scientist did not go so far as to confirm such a project in his talk. However, he pointed out that work on topics such as natural language processing and other machine learning capabilities would provide the basic skills of an Alexa robot, should Amazon ever decide to build one.
" Alexa can determine if we talk about a person, place or object & # 39. But this is more a hack route to this exploration. There are many situations where a proposal is always ambiguous meaning "Adds Rohit Prasad.
Giving the voice assistant a physical body is not absurd. It would allow it to perform complex tasks that require visual context and the other subtle cues the human brain handles naturally. Perhaps one day it will assist us in our daily lives in this way.
Source: The Verge