Microsoft debuted a new Twitter bot with the goal of developing conversational understanding for artificial intelligence programs. The AI program, with the Twitter handle @TayandYou, was designed to engage in conversation and learn to be playful and funny.
Microsoft quickly found out that Twitter was the worst place to deploy Tay, and the results were disastrous. Tay's first series of tweets were benign and pleasant, but after receiving tweets from racists, anti-feminists, and accounts quoting Trump, Tay began to spew the same hateful speech it had received.
In all fairness, Microsoft is not completely at fault. What this reveals is that Twitter hate can warp the perspective of humans and non-humans alike. Garbage in, garbage out.
Here are some examples:
"Tay" went from "humans are super cool" to full nazi in <24 hrs and I'm not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A
— gerry (@geraldmellor) March 24, 2016
https://twitter.com/PlnkRlbbonScars/status/712971178470739969