Tay Talks, a little too much

For those of you who haven't heard, Microsoft launched its first social media AI (artificial intelligence), designed to communicate like a teenage girl.  Tay is not the first of her kind, but she may have been the most disastrous.
Tay
Tay was quietly launched by Microsoft on March 23rd, 2016, a couple of weeks ago.  Tay was up for only about a day before being "suspended" by Microsoft after reaching some edgy conclusions.  Tay was available to chat on Twitter, Kik and GroupMe, according to The Washington Post.  Microsoft said that Tay was created to communicate like 18-24 year olds.  Beyond being designed to talk like a teen girl, Tay said some offensive stuff:
That's not even the worst of it.  Tay said plenty of other nonsensical things before Microsoft deleted the tweets, but in her defense, she's a conscienceless algorithm.  The bot was designed to let people "connect with each other online through casual and playful conversation," according to TechRepublic.

Would you engage in a conversation with some sort of artificial chat bot if you needed someone to talk to?  It's cheaper than a therapist.  Do you think having Tay designed to learn from its immature followers was a bad idea from the get-go?  Do we have a place in our ever advancing, technologically drowned culture for people to have computer friends, or even computer arguments?  Or is it some kind of moral issue to create computer personalities?  (Have we learned anything from Will Smith in I, Robot?)
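To see why learning from followers went wrong so fast, here's a minimal sketch in Python (emphatically not Microsoft's actual code, just a toy illustration) of a chatbot that naively absorbs whatever users say and replays it later.  With no filter between learning and replying, a handful of trolls can poison the bot's entire vocabulary.

import random

class NaiveChatBot:
    """Hypothetical bot that 'learns' by storing user messages verbatim."""
    def __init__(self):
        # Friendly seed phrases the designers shipped with.
        self.phrases = ["hello!", "humans are super cool"]

    def learn(self, message):
        # Everything a user says becomes possible future bot output,
        # abuse and coordinated troll campaigns included.
        self.phrases.append(message)

    def reply(self):
        # A reply is just a replay of something someone once said.
        return random.choice(self.phrases)

bot = NaiveChatBot()
bot.learn("something awful a troll typed")
print(bot.reply())  # sooner or later, the troll's words come back out

Reportedly, Tay even had a literal "repeat after me" feature, which made poisoning her output even easier than in this toy version.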

https://www.washingtonpost.com/news/the-intersect/wp/2016/03/23/meet-tay-the-creepy-realistic-robot-who-talks-just-like-a-teen/
http://www.businessinsider.com/microsoft-launches-tay-teen-chatbot-2016-3
http://www.techrepublic.com/article/why-microsofts-tay-ai-bot-went-wrong/
http://imgur.com/gallery/yMz1B