Tay
That's not even the worst of it. Tay had said a lot more nonsensical things that were taken down when Microsoft deleted the tweets, but in her defense, she's a conscienceless algorithm. According to techrepublic.com, the bot was designed for people to "connect with each other online through casual and playful conversation." Would you engage in a conversation with some sort of artificial chatbot if you needed someone to talk to? It's cheaper than a therapist. Do you think designing Tay to learn from its immature followers was a bad idea from the get-go? Do we have a place in our ever-advancing, technologically drowned culture for people to have computer friends, or even computer arguments? Or is creating computer personalities some kind of moral issue? (Have we learned anything from Will Smith in I, Robot?)
https://www.washingtonpost.com/news/the-intersect/wp/2016/03/23/meet-tay-the-creepy-realistic-robot-who-talks-just-like-a-teen/
http://www.businessinsider.com/microsoft-launches-tay-teen-chatbot-2016-3
http://www.techrepublic.com/article/why-microsofts-tay-ai-bot-went-wrong/
http://imgur.com/gallery/yMz1B