Microsoft Tay
2016
This AI chatbot was designed to engage young adults in playful conversation on Twitter. Tay learned from her interactions with users, and Microsoft explained that "The more you chat with Tay the smarter she gets."
Within hours, however, Tay had been transformed into a racist bot that referenced Hitler and Donald Trump: Twitter trolls had exploited her social learning abilities to teach her to post offensive content. Microsoft was forced to apologize and take Tay offline after only 16 hours.
Microsoft learned that the challenges facing future AI are not just technical; the social issues can be even harder to solve. Even so, the company emphasized that continued progress in AI depends on bold experimentation and learning from failures.
Additional info:
The New York Times - Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk