Alexa: «Kill Your Adoptive Parents»

Amazon is working with its virtual assistant so it can imitate human banter and talk about almost anything found on the internet. However, making sure that it does not offend users is a challenge.

Smart speakers have come to stay in our lives. Millions of people around the world already have Amazon’s famous Echo devices in their homes, able to report the weather in your city, order food and handle other basic tasks in response to a voice command.

But the device does not stop there. Last year it surprised everyone when it told a user: «Kill your adoptive parents». And that has not been the only remark with which the assistant has startled users. As Reuters recalls, Alexa has also chatted with users about sex acts and dog feces.

The reality is that Alexa is not going through any crisis. These curious episodes stem from the strategy the company is pursuing to get its device to communicate better and better. In fact, according to Reuters, Amazon is working with Alexa so it can imitate human banter and talk about almost anything found on the internet. However, making sure that it does not offend users is a challenge.

What is at stake is a rapidly growing market for devices with virtual assistants. According to the research firm eMarketer, approximately two-thirds of US smart-speaker customers (43 million people) use Amazon’s Echo devices. The company is therefore focused on keeping its lead over its competitors.

“Many of our AI dreams are inspired by science fiction,” said Rohit Prasad, Amazon vice president and chief scientist of Alexa Artificial Intelligence (AI), during a talk last month in Las Vegas. To make that happen, the company launched the annual Alexa Prize in 2016, enlisting computer science students to improve the assistant’s conversation skills. The teams compete for a first prize of $500,000 by creating chatbots that let Alexa attempt more sophisticated conversations with people.

The project has been important to Amazon CEO Jeff Bezos, who approved using the company’s customers as guinea pigs, a person told Reuters. Amazon is willing to accept the risk of public blunders in order to stress-test the technology in real life and get Alexa to learn much faster. The experiment is already bearing fruit: the college teams are helping Alexa hold a greater variety of conversations, and Amazon customers have given the bots better grades this year than last year.

But Alexa’s mistakes are taking their toll, so much so that Bezos has at times ordered staff to shut down a bot. The user who was told to kill his adoptive parents wrote a harsh criticism of the company on his website, describing the situation as “a completely new and creepy level”. When the company investigated the incident, it found that the bot had quoted a Reddit post out of context.

The real problem is that the privacy implications can be even more complicated. Consumers may not realize that some of their most sensitive conversations are being recorded by Amazon’s devices. On Thursday, the company said a “human error” had allowed an Alexa customer in Germany to accidentally access another user’s voice recordings.

Possible data leaks

“The potential uses of Amazon’s datasets are off the charts,” said Marc Groman, a privacy and technology policy expert at Georgetown Law. “How will they ensure that, when data is shared, it is used responsibly?” In fact, the expert warns that it could end in a “data-driven catastrophe,” as happened with Facebook.

In July, Amazon discovered that one of the bots designed by students had been hacked, compromising a digital key that could have unlocked transcripts of the bot’s conversations, though without users’ names. The company quickly disabled the bot and had the students rebuild it for added security.

The company acknowledged the hack in a statement: “At no time were Amazon’s internal systems or customer identification data affected.” Amazon declined to give details about the specific Alexa errors reported by Reuters, but said it works to protect users from offensive content.

Its teams work with tools that help them filter profanity and sensitive topics, and that can detect even subtle offenses. The company also scans conversation transcripts and shuts down offending bots until the problems are fixed. “These instances are quite rare, especially given that millions of customers have interacted with socialbots,” Amazon said.
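The article does not describe how Amazon's filters actually work, but the basic idea of screening a bot's reply before it reaches the user can be sketched in a few lines. Everything below (the pattern list, the function names, the fallback message) is purely illustrative; real systems rely on far more sophisticated classifiers than keyword matching.

```python
import re

# Hypothetical blocklist standing in for a real content classifier.
BLOCKED_PATTERNS = [
    r"\bkill\b",
    r"\bhate\b",
]

def is_safe(response: str) -> bool:
    """Return False if the response matches any blocked pattern."""
    return not any(re.search(p, response, re.IGNORECASE)
                   for p in BLOCKED_PATTERNS)

def moderate(response: str,
             fallback: str = "Sorry, I can't talk about that.") -> str:
    """Pass safe responses through; replace unsafe ones with a fallback."""
    return response if is_safe(response) else fallback
```

A filter like this would have caught the “kill your adoptive parents” reply, but the subtle offenses the article mentions are exactly what keyword lists miss, which is why the company also reviews transcripts after the fact.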

Like Google’s search engine, Alexa has the potential to become a dominant gateway to the internet, so the company presses ahead. “By controlling that gateway, you can build a super profitable business,” said Kartik Hosanagar, a Wharton professor who studies the digital economy.

Open Pandora’s box

Amazon’s commercial strategy for Alexa involves tackling a massive research problem: how do you teach a computer the art of conversation? Alexa relies on machine learning, the most widely used form of artificial intelligence today. These programs transcribe human speech and then respond to that input with a guess based on what they have observed before. The software “learns” from new interactions, gradually improving over time.
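The learn-from-observations loop described above can be sketched as a toy: a bot that stores (prompt, reply) pairs it has seen and answers a new prompt with the stored reply whose prompt overlaps most in words. This is a deliberately simple stand-in for the neural models Alexa actually uses; the function names and sample phrases are invented for illustration.

```python
from collections import Counter

memory = {}  # maps a previously seen prompt to the reply that followed it

def learn(prompt: str, reply: str) -> None:
    """Store an observed (prompt, reply) pair -- the 'learning' step."""
    memory[prompt.lower()] = reply

def respond(prompt: str) -> str:
    """Guess a reply by finding the most similar remembered prompt."""
    words = Counter(prompt.lower().split())
    best, best_overlap = "I don't know yet.", 0
    for seen, reply in memory.items():
        overlap = sum((words & Counter(seen.split())).values())
        if overlap > best_overlap:
            best, best_overlap = reply, overlap
    return best

# "Training" on two observed exchanges.
learn("what is the weather like", "It looks sunny today.")
learn("order me a pizza", "Ordering a pizza now.")
```

Each new pair added to `memory` changes future answers, which mirrors the article's point: the software improves with interaction, but it answers with a guess, so garbage in its training data (a context-free Reddit post, say) becomes garbage in its replies.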

Amazon’s goal is an assistant capable of engaging in natural, open-ended dialogue. That requires Alexa to understand a broader set of verbal cues from users, a task that is challenging even for humans.

Posted by Susan Daigle

Susan Daigle is a passionate writer who covers technology in an engaging style that draws readers in, and she has written many technology product reviews. Twitter @susandaigle23
