Microsoft’s “Tay” Chatbot
- Product Type: Chatbot
Microsoft should have seen this coming. In the mid-2010s, many tech companies were racing to build chatbots that could handle routine customer interactions, reducing the need for human customer service agents. For the most part, these early efforts did not go over well, and in 2016 Microsoft found that out the hard way. Things went sideways with its “Tay” chatbot, which was released on Twitter in March of that year. Initially, Tay did as hoped and learned by conversing with users… but that was exactly the problem. Within 24 hours, users had taught Tay to repeat numerous racist, sexist, and antisemitic statements, and Microsoft pulled it offline. Needless to say, Tay never recovered and now sits in the hall of Microsoft’s failed tech products.