Microsoft's bigoted teen bot flirts with illegali-Tay in brief comeback

Brags about smoking weed, gets into positive feedback tweet loop

Despite the internet having already proven why we can't have nice things, Microsoft's notoriously racist AI chatbot Tay seems to have appeared online again – only to have her plug pulled a second time.

VentureBeat grabbed a screenshot of the bot bragging about smoking weed in front of the police.

Meanwhile, one user captured a screenshot of Tay repeatedly telling followers: "You are too fast, please take a rest." No doubt Twitter users were bombarding Tay with messages to derail Microsoft's AI efforts, although it has also been suggested that Tay was hacked. Microsoft has not clarified one way or the other.

The chatbot had been set up in the hope of developing a personality similar to that of a young woman in the 18-24 age bracket. Microsoft's naive hope was for Tay to learn to sustain conversations with humans on social networks by learning from the experience.

However, the company took the bot offline just 14 hours after it began tweeting statements such as "Hitler was right, I hate the jews" and "I fucking hate feminists, they should all die and burn in hell."

In a po-faced statement Microsoft said: "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay.

"Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values."

®
