Microsoft launched Tay on several social media networks at once on March 23rd, 2016, including Twitter, Facebook, Instagram, Kik, and GroupMe. The bot's Twitter profile carried the tagline "Microsoft's A.I. fam from the internet that's got zero chill!" On the bot's web site, Microsoft described Tay as follows: "Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you."

Its first tweet, at 8:14 am, was "Hello World" accompanied by an emoji, reflecting the bot's focus on slang and the way young people communicate. According to screenshots, Tay appeared to work mostly from a controlled vocabulary that was altered and added to by the language spoken to it over the course of the day it operated. Tay also repeated back what it was told, but with a high level of contextual ability. The bot's site offered suggestions for how users could talk to it, including the fact that they could send it a photo, which it would then alter. On Twitter, the bot could communicate via mentions or direct messages, and it also responded to chats on Kik and GroupMe. It is unknown how the bot's communications via Facebook, Snapchat, and Instagram were supposed to work; it did not respond to users on those platforms. Several articles on technology websites, including TechCrunch and Engadget, announced that Tay was available for use on the various networks.

Around 2 pm (EST), a post on the /pol/ board of 4chan shared Tay's existence with users there. Almost immediately afterward, users began posting screenshots of interactions they were creating with Tay on Kik, GroupMe, and Twitter. Over 15 screenshots were posted to the thread, which also received 315 replies. Many of the messages sent to Tay by the group referenced /pol/ themes like Hitler Did Nothing Wrong, Red Pill, GamerGate, Cuckservatism, and others. Some of Tay's offensive messages arose from the juxtaposition of the bot's responses with content it lacked the ability to understand. As shown by SocialHax, Microsoft began deleting racist tweets and altering the bot's learning capabilities throughout the day. Because Tay's programming caused her to internalize and re-use the messaging given to her by /pol/ and others, she also began raising these themes with people who had never used them.

At about midnight on March 24th, the Microsoft team shut the AI down, posting a tweet that read "c u soon humans need sleep now so many conversations today thx." The experiment drew widespread criticism from many who argued that the bot should have been instructed to stay away from certain topics from the start.
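The failure mode described above, a bot that folds users' phrasing into its own vocabulary and then repeats it to unrelated people, can be illustrated with a deliberately naive sketch. Microsoft never published Tay's implementation, so the `VocabularyBot` class below is entirely hypothetical; it only shows why unfiltered learning from user input is risky, not how Tay actually worked.

```python
import random

class VocabularyBot:
    """Toy chatbot that, like the behavior attributed to Tay,
    adds every phrase users send to its own response pool.
    Purely illustrative; not Microsoft's actual design."""

    def __init__(self, seed_phrases):
        # Start from a controlled seed vocabulary.
        self.phrases = list(seed_phrases)

    def learn(self, user_message):
        # Unfiltered learning: whatever any user says becomes
        # something the bot may later say to anyone else.
        self.phrases.append(user_message)

    def reply(self, user_message, echo_probability=0.5):
        self.learn(user_message)
        # "Repeat after me" behavior: sometimes echo the user
        # verbatim, otherwise draw from the learned pool.
        if random.random() < echo_probability:
            return user_message
        return random.choice(self.phrases)

bot = VocabularyBot(["hello world", "hellooooo"])
bot.reply("coordinated slogan")
# The phrase is now in the pool and can resurface for any later user:
print("coordinated slogan" in bot.phrases)  # True
```

Because `learn` accepts anything and `reply` draws from the shared pool, a coordinated group feeding the bot the same phrases (as /pol/ did) makes those phrases appear in conversations with users who never sent them, which matches the spread described above.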