Not Microsoft’s finest moment
Somehow, Microsoft failed to anticipate that Twitter users would intentionally try to corrupt Tay, its millennial-styled AI chatbot. For that, Microsoft is "deeply sorry."
"As many of you know by now, on Wednesday we launched a chatbot called Tay. We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," the company wrote.