Yahoo Web Search

Search results

  1. Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It caused controversy when the bot began posting inflammatory and offensive tweets through its Twitter account, leading Microsoft to shut down the service only 16 hours after its launch. [1]

  2. Mar 24, 2016 · But less than a day after her debut on Twitter, Microsoft's chatbot, an AI system called "Tay.ai", unexpectedly turned into a Hitler-loving, feminist-bashing troll. So what went wrong?

  3. Mar 24, 2016 · It took less than 24 hours for Twitter to corrupt an innocent AI chatbot. Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational ...

  4. Mar 25, 2016 · Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist. But in doing so, it made clear that Tay's views were a result of nurture, ...

  5. Mar 24, 2016 · Yesterday the company launched "Tay," an artificial intelligence chatbot designed to develop conversational understanding by interacting with humans. Users could follow and interact with...

  6. Nov 25, 2019 · Microsoft's Tay chatbot started out as a cool teenage girl, but quickly turned into a hate-speech-spewing disaster.

  7. Mar 25, 2016 · Tay – a chatbot created for 18- to 24-year-olds in the U.S. for entertainment purposes – is our first attempt to answer this question. As we developed Tay, we planned and implemented a lot of filtering and conducted extensive user studies with diverse user groups.