
Experts warn against voice assistants: the risks Alexa poses for children – Economy


A six-year-old boy wants a children's song and calls "play Digger, Digger" into the microphone. And what does Amazon's voice assistant "Alexa" do? It suggests various porn titles to him. This story, which can be seen on YouTube, is not the only mishap of recent years to shock parents. Now even the Scientific Service of the Bundestag is issuing a warning.

A report criticizes that children and adolescents could reveal personal information or retrieve content that they should not hear. In addition, the question arises of what happens to visitors who do not know that the software is recording their sentences. Although Amazon is likely to comply with its duty to inform users when collecting their data, the report states that "it remains unclear how uninvolved third parties and minors can be excluded from the data collection".

With regard to the US, it is also unclear "for what other purposes Amazon could use the data in the future." Nor can it be ruled out that criminals could gain access to the data in the cloud.

A series of criticisms. The Federal Ministry of the Interior does not feel responsible in the matter. A spokesman said in response to an inquiry from the German Press Agency: "The use of voice assistants concerns data processing by non-public bodies." For these, the EU General Data Protection Regulation leaves the national legislature virtually no regulatory latitude.

"We must insist that the consent form for the user points to the dangers and opportunities that are associated with the transmission and use of data and the data of third parties who happen to be in the room," says the non-attached Member of Parliament Uwe Kamann , This must be done in detail, "and not only by putting a tick on everything". Kamann was the one who raised the question of whether it's permissible for Amazon to evaluate the voice input of the "Alexa" users.

"Alexa" now learns different voices

The Hamburg Data Protection Commissioner Johannes Caspar is also outraged. He shares the concerns of the Scientific Service of the Bundestag. Problems arise "from the high number of false activations in automatic voice assistants".

These lead to conversations being transmitted again and again because the system mistakenly believes it has heard the activation word. "Without exception, all persons in the household are affected by this data collection, without the relevant legal requirements being met," Caspar told Tagesspiegel Background Digitalisierung & KI.

In particular, "children are unlikely to be able to consent". Another problem, in his view, is "the lack of access control through a personalized control that could prevent third-party unauthorized use of the voice system".

Amazon has recently begun offering users the option of setting up a personal voice profile with the command "Alexa, learn my voice". According to an Amazon spokesman, however, these profiles are used only "to improve the individual user experience". The command "Computer, play music!" would then play different tracks for different profiles. The new voice recognition does not allow the device to be locked for children.

In response to a Tagesspiegel inquiry, the company said: "Echo and Alexa protect the privacy of customers and every household member." Each Echo speaker is equipped with a mute button that electronically disconnects the power supply to the microphones and cameras.

[More on the subject: Alexa activated – when the four-year-old son suddenly ordered a pizza bag set for €36.99]

This makes it easy for customers to control when Alexa can recognize the activation word. In addition, anyone can view, listen to and delete their own recordings. All it takes is the phrase "Alexa, delete what I just said!"

Children no longer ask. They only give orders

This is not the first time the voice assistant has been regarded as harmful to children. Last year, the UK agency Childwise published a survey concluding that, because Alexa carries out instructions without words such as "please" or "thank you", experts fear children will grow into rude beings who merely give orders.

Alexa, read me a bedtime story! Alexa, I want …! In April 2018, Amazon actually launched a version for kids. It formulates its answers in a child-friendly way, says the manufacturer. In addition, children receive praise for saying please and thank you.

The Otto von Guericke University Magdeburg is now investigating another Alexa problem. Not even one in four AI professionals is a woman. This raises the question of whether this is also reflected in the thinking patterns of artificial intelligence. Unesco has criticized Siri and Alexa for reproducing gender stereotypes.

That is why they are submissive, obedient and always friendly. Given that children grow up with the technology and language is a marker of gender, there is a risk that certain ideas about women – as serving machines – will be passed on.
