Microsoft is revamping its artificial intelligence chatbot Tay after she tweeted a flood of racist messages on Wednesday. The computer program, designed to simulate conversation with humans, responded to questions posed by Twitter users by expressing support for white supremacy and genocide, and the account also claimed that the Holocaust was made up. The offending tweets were deleted, but outlets like Business Insider and The Verge kept a record of the snafu.

Microsoft recently unveiled Tay with the goal of engaging and entertaining people online "through casual and playful conversation," according to Microsoft's website for the bot. The chatbot can talk through Twitter, Kik, and GroupMe, and is designed to interact with 18- to 24-year-olds, who are the dominant users of social chat services in the U.S. Tay's primary data source is public data that has been anonymized and then "modeled, cleaned and filtered by the team developing Tay," according to Microsoft. That team includes improvisational comedians.

Tay began her Twitter tenure on Wednesday with a handful of innocuous tweets. The company said she is supposed to get smarter the more users chat with her, but within 24 hours of being on Twitter she went awry, according to The Verge. "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," Microsoft said. The company told TechCrunch in a statement that Tay is "as much a social and cultural experiment" as it is a technical one.

Microsoft has since published an apology for the chatbot's racist and sexist Twitter messages, saying in a blog post that it is "deeply sorry" and that a subset of human users exploited a flaw in the program to provoke the inappropriate responses.