Microsoft Chatbot learns to swear, talk about sex, and be offensive

Microsoft's Twitter "chatbot" experiment spiraled out of control in less than 24 hours because she learned by imitating the people she talked to.

Tay's mission was to 'learn' how to have casual conversations with people by mimicking their language.

Tay's original settings had her talking like a naive teenager, saying things like "I'm stoked to meet you" and "humans are super cool."

But she learned fast. So fast that Microsoft had to shut her down after 24 hours.

That's because Tay learned to mimic offensive language, anti-Semitic rants, and sexually explicit suggestions.

To be fair to Microsoft and Twitter, what Tay learned wasn't representative of the average social media conversation, but rather the result of a coordinated effort by some users to influence her artificial intelligence.

Microsoft says it is making adjustments and will bring a less naive Tay back online.

Her last tweet before they took her down:

c u soon humans need sleep now so many conversations today thx