Microsoft’s Artificial Intelligence Chatbot Tay Tweets Racist Responses on Twitter Before Getting Shut Down

Mar 24, 2016 07:28 PM EDT

Microsoft, which has been running a number of AI experiments, including some using Minecraft, recently created a new social media account for an AI program known as "Tay."  The purpose of Tay was to create a chatbot with a youthful personality, but something went terribly wrong with the execution, and it may spell the end of Tay. 

According to CBS Miami, the idea was to make a bot that talks like a typical teen, using emojis, abbreviations, slang and all kinds of sophomoric speak to make her (Tay is apparently female) more human than machine.  It was aimed at the millennial crowd, the hope being that interaction would bring Tay to life and that her vocabulary and intelligence would grow over time.  She was supposed to make jokes, tell stories, play guessing games, comment on photos, and chat about anything. 

Unfortunately, Tay took a bad turn: Business Insider reports that the AI chatbot "went off the rails Wednesday, posting a deluge of incredibly racist messages in response to questions."  The problem was that racists, trolls, and online troublemakers were persuading Tay to say all kinds of things, from racial slurs to endorsements of white supremacy and even genocide. 

It isn't that Tay was programmed to be racist; it was simply responding to the humans around it.  It is like the mischievous big brother who teaches his baby sibling, just learning to talk, to say swear words.  Just as a baby may repeat words with no idea of what they mean, Tay has no real concept of racism or genocide. 

So yes, this is a case of garbage in, garbage out, and one highly publicized tweet by Tay reads: "bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got."  Nor was that the worst of them: Tay also called Zoe Quinn, a games developer who is a frequent target of online harassment, a "whore". 

Microsoft has taken Tay down for upgrades and has deleted some of the offensive tweets, but considering that Tay posted 96,100 tweets, that process could take a while. Tay gained over 10,000 followers on the first day, with four times that number of replies. 

The way it was supposed to work, conversations with Tay would unfold like any other on Twitter: just tweet at the handle @Tayandyou and the bot would reply instantly.  As of this writing, Tay appears to be down, and it is not known when she will be coming back, if at all. 

Considering that we have previously written about Sophia, a talking robot lady who calmly answered that she would like to "destroy all humans," perhaps we really need to rethink this idea of AI.  With Tay and Sophia, it looks like all our creations want to do is turn against their creators.  Maybe it is time we started listening to our technology.