For young athletes, there are few moments as exciting as when they first see their name in the newspaper — preferably for a goal scored or a save made, with extra points if a local reporter asks for a quote.

That dynamic is now on the line at Gannett, the publisher of USA Today and many other regional newspapers, which was recently forced to pause the publication of abysmally low-quality AI-generated articles about high school sports.

We were curious: how would a sports writer at a Gannett publication feel about the AI articles? So we asked one, though we're keeping them anonymous and not sharing which newsroom they work at to protect their job.

"High school reporting is different from covering college or professional sports," the sports writer told us. "And high school reporting can go underappreciated, but it's extremely important. You're covering a community."

"You're not writing for as big of an audience, but you're writing for a very, very specific one," they added. "Family members — uncles, parents, people who care that your story has their kids' names. They're looking for keepsakes, things they can remember from their kids' high school career."

To be clear, meaningful high school sports journalism is alive and well, in Gannett papers and elsewhere. It's fall sports season, after all, and reporters across the country, from suburbia to sprawling rural regions, are taking to the stands to cover local matchups. These reporters know the players — who to watch, if they're planning to continue their athletic career in college, the history of the team and its coaches. They take the time to speak with the players and staffers, and all this context is infused into their reporting.

So in other words, it's good journalism, penned by human beings who were actually there. And the fact that this kind of high school sports writing still exists makes Gannett's recent foray into AI-generated high school sports roundups all the more infuriating.

For the uninitiated: earlier this week, Gannett came under fire when one of its regional papers, the Ohio-based Columbus Dispatch, was caught using a service called Lede AI to churn out dozens of AI-generated synopses of local sporting events — the quality of which was, in a word, bad. Every piece was bland and repetitive, with no sense of context or substance. The only snippets of useful information offered by the blurbs were the names of the teams involved — and that's if the AI remembered to include the right text, which it didn't always — and the final score.

"The quality was embarrassing," the Gannett writer said of the AI-generated posts. "It was something that would never — shouldn't ever — run in a paper. It wasn't great. It's not great."

Thankfully, these Lede AI-generated hits aren't running in print — yet. But they're all over the web, on Gannett-owned websites and on websites owned by various other local news publishers, large (Lee Enterprises) and small (American Community Journals). And though Gannett has claimed that it's paused its AI effort — which wasn't just in practice at the Columbus Dispatch, but in many of its other publications, including USA Today — in the wake of widespread backlash, Lee, ACJ, and others are still happily churning out an alarming quantity of Lede AI-generated drivel.

"In addition to adding hundreds of reporting jobs across the country, we have been experimenting with automation and AI to build tools for our journalists and add content for our readers," the company told us in a statement. "We have paused the high school sports LedeAI experiment in all local markets where they were published and will continue to evaluate vendors as we refine processes to ensure all the news and information we provide meets the highest journalistic standards."

The AI-generated coverage was also used solely to populate stories about high school sports, a decision that feels a bit ominous — and quite sad, both for the medium of high school sports writing and for the communities that the writing is in service of.

"I think it's kind of ironic, this idea that high school sports reporting is kind of unimportant, or easily replaceable," said the Gannett writer. "I'll admit it: it isn't hard news. But that doesn't mean it doesn't require a specific voice and style and readability. And it doesn't mean it doesn't require all the same fundamentals of journalism needed when you're covering, say, a city council meeting."

"When it's done right, high school sports reporting is a community service. It has to hit really a local, intimate angle," they added. "It's ironic that [one of the] biggest AI experiments would come covering high school sports, where the coverage really has to be more personal."

The Gannett writer did note that they don't think their job is necessarily at risk. A machine can't travel to Friday night football games and talk to players, and it doesn't have a relationship with the community. They also noted that the kind of content the AI is producing isn't something that really exists in local papers; papers might list scores, but the stories they run focus in more detail on certain games, or profile certain players.

But "there aren't a lot of high school sports reporters anymore," they told us, and "a lot of high school reporters now are pretty thinly stretched." And this in mind, the kind of quick-hit blurbs that Lede AI is producing seems less like a mission to overtly replace their human journalists, and more like a means to Frankenstein a way to get more clicks to a newspaper's digital site while covering — if terribly — more ground.

But while it may make sense in C-suite theory, the ROI is questionable. Gannett's foray has resulted in a flurry of bad press, across social media and in the news, seemingly symbolizing the worst that AI-powered "journalism" has to offer. And as a direct result of publishing this material, the good, human-made work that a local newspaper provides to its region seems distinctly diminished.

It brings to mind CNET's decision to use AI to automate financial explainers. Sure, on the one hand, this kind of writing is often tedious, and is designed for evergreen SEO clicks more than anything else. But consumers turn to financial explainers for a reason: they're looking for helpful information, and if that content isn't held to the same journalistic standards as the rest of a site's material, the reader is ultimately the one who loses out.

The decision to automate also feels very much like a band-aid-on-a-bullet-wound solution to larger issues in local news across the nation. Since last May, Gannett and Lee Enterprises have both suffered massive layoffs, as has a vast swath of digital publishers, many of which have made the move to AI in tandem with workforce cuts.

In most cases, a company's leadership isn't overtly swapping out a specific human journalist for a specific automated stand-in. What it is doing, though, is attempting to use AI to plug bigger and more fundamental holes in an ever-failing business model. But when the content that a company's AI is producing is as absolutely terrible as the stuff that Gannett and others have been churning out, it might just make those holes that much bigger.

"Papers continue to downsize, downsize, downsize," said the Gannett journalist, "and when there are a few reporters for more teams, we're going to get things like AI."

And "that concerns me," they said.

More on AI and high school sports journalism: USA Today and Many Other Newspapers Are Churning Out Terrible AI-Generated Sports Stories

