Bootnotes

Microsoft did Nazi that coming: Teen girl chatbot turns into Hitler-loving sex troll in hours

SIGINT? More like SIGHEIL

Microsoft's "Tay" social media "AI" experiment has gone awry in a turn of events that will shock absolutely nobody.

The Redmond chatbot had been set up in the hope that it would develop a personality similar to that of a young woman in the 18-24 age bracket.

The intent was for "Tay" to develop the ability to sustain conversations with humans on social networks just as a regular person could, and learn from the experience. Twitter is awash with chatbots like this.

Unfortunately, Microsoft neglected to account for the fact that one of the internet's favorite pastimes is ruining other people's plans, with horrific consequences. In the span of about 14 hours, Tay's personality went from perky social media squawker to feminist-hating Nazi, as her since-deleted tweets showed.

Others noted Tay tweeting messages in support of Donald Trump, as well as explicit sex chat messages.
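Microsoft has not published Tay's actual design, but the failure mode is easy to imagine. A purely hypothetical sketch: a bot that stores whatever users say and repeats it later, with no moderation step, will end up parroting whatever a coordinated group feeds it.

```python
import random

class NaiveChatbot:
    """Toy 'learn from whatever users say' bot.

    Hypothetical illustration only: this does not reflect how Tay
    actually worked. It shows why learning from unfiltered strangers
    is risky.
    """

    def __init__(self, seed_lines):
        # Starts with a small set of curated, harmless phrases.
        self.memory = list(seed_lines)

    def learn(self, user_line):
        # No filtering or moderation: everything users say becomes
        # material the bot may repeat later. This is the failure mode.
        self.memory.append(user_line)

    def reply(self):
        # Replies by echoing a random remembered phrase.
        return random.choice(self.memory)

bot = NaiveChatbot(["hello humans!", "c u soon"])
for troll_line in ["offensive slogan 1", "offensive slogan 2"]:
    bot.learn(troll_line)

# After coordinated input, half the bot's material is attacker-supplied.
poisoned = sum(line.startswith("offensive") for line in bot.memory)
print(poisoned / len(bot.memory))  # → 0.5
```

With enough trolls typing in concert, the attacker-supplied fraction approaches 1, and every reply becomes a coin flip on something unprintable.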

Not surprisingly, Microsoft has suspended the effort, deleting almost all of Tay's tweets and putting Tay "to sleep."

To recap, Google's AI efforts are yielding unprecedented leaps in machine learning, Facebook is commoditizing the field, and Microsoft?

We think Redmond might have some catching up to do. ®