Microsoft’s Tay Chatbot Goes Offline After Racist Tweets

Microsoft Corp. on Friday issued an apology after its artificial-intelligence chatbot Tay posted tweets denying the Holocaust and declaring that feminists should “burn in hell,” among many other racist posts. The company said, however, that a “coordinated attack by a subset of people exploited a vulnerability” in the chatbot, which was launched Wednesday. “We are deeply sorry for the…