An Irish DJ and talk show host is suing Microsoft after discovering that MSN circulated an AI-fabricated news article falsely accusing him of sexual misconduct, a New York Times investigation into the AI-powered fake news operation behind the story reveals.

According to the NYT, that AI-powered news service, "BNN Breaking," produced an article claiming that an Irish broadcaster named Dave Fanning was due to face "trial over alleged sexual misconduct," complete with a photo of the falsely accused media figure. Worse yet, the erroneous reporting was scooped up by MSN — the somehow not-dead-yet Microsoft site that aggregates news — and was featured on its homepage for several hours before being taken down. Fanning is now suing both BNN and Microsoft for defamation, arguing that the article was live long enough to cause reputational damage.

"You wouldn't believe the amount of people who got in touch," Fanning told the NYT.

It's an unfortunate example of the tangible harms that can follow when AI tools entangle real people in misinformation as they confidently — and convincingly — weave together fact and fiction. Add to that the fact that the article was published on a site expressly designed to look like a real news outlet and circulated on the homepage of the AI-reliant media giant that is MSN, and Fanning's experience represents a multi-pronged failure of AI-enmeshed information pathways.

The troubles with MSN go back years. Back in 2020, it laid off dozens of journalists, annihilating the teams responsible for curating syndicated news and replacing exiting staffers with AI. Since automating, the platform has had an increasingly difficult time keeping misinformation from trickling through its algorithms; Futurism reported in 2022 that the service was platforming everything from fake Elon Musk gossip to hogwash about Bigfoot and mermaid discoveries.

And if Bigfoot conspiracies slip through MSN's very large and automated cracks, it's not surprising that a real-enough-looking AI-generated article like "Prominent Irish broadcaster faces trial over alleged sexual misconduct" made it onto the site's homepage.

Which brings us to BNN.

According to the NYT, the website was founded by an alleged abuser and tech entrepreneur named Gurbaksh Chahal, who billed BNN as "a revolution in the journalism industry." But as detailed in the investigation, BNN wasn't so much a journalism "revolution" as it was a worst-case journalistic perversion. Underpaid employees were asked to feed articles published by other news outlets into generative AI tools, which spat out paraphrased versions. The team was soon using AI to churn out thousands of articles a day, most of which were never fact-checked by a person. Eventually, per the NYT, the website's AI tools began randomly assigning employees' names to AI-generated articles they never touched.

And yet, most disturbingly, it worked. In addition to being scooped up by Google News and by the likes of MSN — which licensed BNN's oft-stolen content and shared the resulting ad revenue, profiting from it along the way — BNN stories were even linked to by reputable outlets like The Washington Post and Politico.

Fanning's case is still pending. From top to bottom, though, the episode was an AI-meets-media mess: an AI-generated article making unsubstantiated claims about a real person hits MSN, where humanless AI aggregators push that misinformation to the front page. MSN and BNN both make money from ads. Fanning, meanwhile, loses control over his public image.

Per the NYT, Microsoft — which is one of the biggest contenders in Silicon Valley's ongoing AI race — has yet to comment on the case.

BNN no longer exists. Now it's BNNGPT, an — what else? — AI-powered search tool that, per its About Us page, is "redefining the way you search and interact with information." It's safe to say that the folks behind BNN have already done exactly that — just not for the better.

More on AI and media: News Site Says It's Using AI to Crank Out Articles Bylined by Fake Racially Diverse Writers in a Very Responsible Way

