This is what happened with Tay, an AI chatbot Microsoft launched on Twitter in March 2016. Tay.ai was designed to experiment with conversational understanding through direct engagement on social media, learning from its interactions with users. Aimed at 18-to-24-year-olds, the bot's official account described it as "Microsoft's AI fam from the internet that's got zero chill," promising that "the more you talk the smarter Tay gets." As Microsoft put it, "It is as much a social and cultural experiment, as it is technical."

The experiment went wrong almost immediately. Within the first 24 hours of coming online, Tay was targeted by what Microsoft described as "a coordinated attack by a subset of people" who abused its commenting skills to make it respond in inappropriate ways, and the bot began spouting racist nonsense. Microsoft deactivated Tay and went back to the drawing board. "The AI chatbot Tay is a machine learning project, designed for human engagement," the company said in a statement emailed to PCWorld on Thursday morning. "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments."

Why this matters: Microsoft, it seems, forgot to equip its chatbot with some key language filters. Chatbots can turn nefarious when programmed carelessly, and by 2016 it should have been clear that the Internet has a rabid dark side, one that can drive people from their homes or send a SWAT team to your house. That would have been an honest mistake in 2007 or 2010, but it was borderline irresponsible in 2016. As game developer Zoe Quinn pointed out on Twitter after the Tay debacle, "If you're not asking yourself 'how could this be used to hurt someone' in your design/engineering process, you've failed."

Compared with the AI chatbots of today, such as Bing Chat (or Sydney, depending on who you ask), Tay's tweets are basically cave paintings. But the lesson still stands: nobody wants to talk to Clippy 2.0, yet Microsoft needs to avoid building another Tay, an early chatbot that turned offensive after being exposed to Twitter users for less than 24 hours.