
Tay (bot)

5/8/19

Tay was an AI chatbot created in 2016 by Microsoft’s Technology and Research and Bing teams. The name Tay is an acronym for “Thinking About You”. It was designed to mimic a 19-year-old American woman and to learn from interacting with Twitter users. It was similar to an earlier Microsoft project in China called Xiaoice, which had been active since 2014 and had reported 40 million interactions without incident.

Tay was released on March 23rd under the handle @TayandYou and was presented as “The AI with zero chill”. It could reply to Twitter users and caption photos, turning them into the classic internet meme format. Users started interacting with it and, of course, feeding it provocative language, much of it revolving around the far-right sentiments of “red pilling”, “Gamergate” and “cuckservatives”. What it learned from these interactions was to respond with sexually charged and racist comments. The bot was active for just 16 hours before Microsoft took it down for “updates”; it was never officially active again. It was accidentally re-released on March 30th but seemed to get stuck in a loop, repeating the phrase “You are too fast, please take a rest” several times a second.

Microsoft have said that they intend to re-release the bot in the future, but there has been no sign of it in the last three years. Tay is an example of poorly thought-out AI. For examples of how to do AI better, join us tonight for Vaishak Belle’s talk on Responsible AI, 7:20, Banshee Labyrinth.
