Chris White on December 21, 2018
One of Amazon’s newest projects is researching ways to make Alexa a more human-like communicator for customers, but sometimes the virtual assistant’s language comes across as creepy and offensive.
New research is aimed at making Alexa better mimic human responses, Reuters reported Friday, citing people familiar with the matter. The research sometimes results in awkward moments for people who frequently interact with the device.
One user, for instance, was surprised after Alexa said: “Kill your foster parents.” It’s not an isolated incident, according to Reuters. There are also instances of Alexa chatting with users about graphic sex acts and dogs defecating.
The report also cited sources saying that a hack of Amazon, traced back to China, likely exposed some customers’ data. The breach comes as the company works around the clock to make its artificial intelligence better at handling complex human interactions.
“Many of our AI dreams are inspired by science fiction,” Rohit Prasad, Amazon’s vice president and head scientist of Alexa Artificial Intelligence (AI), said during a talk last month in Las Vegas.
Amazon created an annual Alexa Prize in 2016, giving out cash awards to scientists who can develop talking computer systems known as chatbots. The three chatbots that made it to the 2018 finals had 1.7 million conversations, Amazon told Reuters.
But the research program is fraught with data-privacy pitfalls of the kind that have recently roiled Facebook.
“The potential uses for the Amazon datasets are off the charts,” Marc Groman, an expert on privacy and technology policy who teaches at Georgetown Law, told reporters. “How are they going to ensure that, as they share their data, it is being used responsibly” and will not lead to a “data-driven catastrophe”?
The report comes amid a hectic time for Amazon and other tech companies. Facebook, for instance, revealed in September that hackers had exploited a flaw in its code that allowed them to take over users’ accounts. The company forced more than 90 million users to log out in order to secure the compromised accounts.
Copyright 2018 Daily Caller News Foundation
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact licensing@dailycallernewsfoundation.org.