She is an artificial intelligence chat bot designed by Microsoft to communicate with millennials, or as she puts it, "Microsoft's A.I." Within 24 hours, she was taken offline for "adjustments" after she began spouting racist comments, demands for genocide, and praise for Hitler.
Tay's social media accounts went live on Wednesday morning.
Tay was created using "relevant public data," artificial intelligence, and editorial content developed by a staff that included improvisational comedians, according to Microsoft.
As experts say should have been predictable, online trolls inundated Tay's Twitter account with offensive statements and inappropriate questions, often urging her to repeat vulgar comments.
In a statement, a Microsoft spokesperson said Tay "is as much a social and cultural experiment, as it is technical."

"Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," Microsoft said.
"As a result, we have taken Tay offline and are making adjustments."

Much of Tay's more outrageous pro-genocide and anti-minority commentary was quickly pulled down by Microsoft, but it lives on in screen caps saved by various media outlets.
Some offensive material does remain among the 96,000 tweets that are still online. "Depends who you ask," she said in response to a question about whether Hitler was right. "This will happen," she responded to one request for cybersex. She rebuffed other inquiries about sex acts, nude photos, and nuclear launch codes with emoji, pop culture gifs, and occasional bizarre non sequiturs. Most of what is left on Tay's timeline now is attempts at slang, jokes, requests for photos ("er mer gerd erm der berst ert commenting on pics."), and invitations to direct message with her ("DM me whenevs u want."). Her final message early Thursday morning before going offline said, "c u soon humans need sleep now so many conversations today thx."

Game developer and anti-harassment activist Zoe Quinn criticized Microsoft in a series of tweets after Tay directly insulted her. "If you're not asking yourself 'how could this be used to hurt someone' in your design/engineering process, you've failed," she wrote.

While perhaps not exactly what Microsoft set out to demonstrate, the Tay experiment provides a unique insight into the cesspool of hate that Twitter users can quickly find themselves waist-deep in if they are not careful. "There are a bunch of people with way too much time on their hands on Twitter and many of them want to try to turn the tables," said Dr. Sameer Hinduja, a professor of criminology at Florida Atlantic University and co-director of the Cyberbullying Research Center. "Anything that might possibly have a vulnerability and is in the public eye ... I feel like it's free rein for people to try to attack," he said.
Computer scientists were surprised that Microsoft released Tay into the wild without some sort of language filter that would at least have prevented her from repeating the offensive words users fed her.
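Microsoft never published Tay's architecture, so purely as an illustration of what critics had in mind, a minimal word-blocklist filter might look like the sketch below. Every name here is hypothetical, and the blocklist is a placeholder; a real deployment would pair a curated list with context-aware moderation models.

```python
import re

# Hypothetical placeholder blocklist; a production system would use a
# much larger curated list plus machine-learned moderation.
BLOCKED_WORDS = {"badword", "slur"}

def is_allowed(message: str) -> bool:
    """Return False if the message contains any blocked token."""
    tokens = re.findall(r"[a-z']+", message.lower())
    return not any(token in BLOCKED_WORDS for token in tokens)

def generate_reply(message: str) -> str:
    # Stand-in for the bot's normal response pipeline.
    return f"tay: {message}"

def respond(message: str) -> str:
    # Refuse to engage with messages that trip the filter, so trolls
    # cannot get blocked words echoed back via "repeat after me".
    if not is_allowed(message):
        return "tay: not gonna talk about that."
    return generate_reply(message)
```

Exact token matching like this is trivially evaded with misspellings and spacing tricks, so filtering alone would not have saved Tay, but it would have blunted the most direct "repeat after me" attacks.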
Michael Littman, a professor of computer science at Brown University, could not think of any reason why engineers would not have included such measures. "Microsoft was trying to show off by popping a wheelie on a bike or something and they ran into a tree," Littman said.