NASTY ANSWERS TO QUERIES

Microsoft’s AI Chatbot Stuns Users With Its Hostility

Bing compared an Associated Press reporter to Hitler and Stalin after the news service questioned its skill

Janice Harayda
2 min read · Feb 17, 2023


Image: Virtual assistant, from Piqsels via Wikimedia Commons / Public domain

Microsoft’s retooled Bing search engine gives new meaning to the phrase “Don’t disrespect the Bing.”

On The Sopranos, variations on that warning referred to the Bada Bing strip club, where Tony Soprano did business.

But users of the newly AI-enhanced Bing chatbot face different risks than the fictional mob boss’s enemies did.

After hearing that nasty answers were coming from Bing, an Associated Press reporter had a long conversation with the chatbot. The bot faulted the AP’s coverage of its mistakes, vehemently denied the errors, and “threatened to expose the reporter for spreading alleged falsehoods about Bing’s abilities,” the news organization’s Matt O’Brien wrote.

The chatbot didn’t stop there.

“It grew increasingly hostile when asked to explain itself, eventually comparing the reporter to the dictators Hitler, Pol Pot and Stalin and claiming to have evidence tying the reporter to a 1990s murder.”


Janice Harayda

Critic, novelist, award-winning journalist. Former book editor of the Plain Dealer and book columnist for Glamour. Words in NYT, WSJ, and other major media.