Google's foray into AI-generated journalism — and maybe-possibly destroying the already-teetering media industry along the way — continues.

Adweek reports that Google is paying a select group of publishers to quietly test a secretive generative AI platform designed to produce news articles. Per the agreement, Adweek writes, "the publishers are expected to use the suite of tools to produce a fixed volume of content for 12 months" in exchange for a "monthly stipend amounting to a five-figure sum annually."

According to Adweek, the in-beta AI tool allows "under-resourced publishers" to "create aggregated content more efficiently by indexing recently published reports generated by other organizations, like government agencies and neighboring news outlets, and then summarizing and publishing them as a new article."

In other words, it sounds an awful lot like the AI program is explicitly designed to vacuum up the work of other news providers and repackage it as new material for publication, with Google paying struggling publishers chump change to promote it.

In a statement to Adweek, Google claimed that the effort is still in its "early stages" and defended the AI as a way to "help" news organizations, "especially smaller publishers." But set against the backdrop of Google's efforts to integrate journalism-regurgitating AI into Search, whose results are simultaneously eroding as poor-quality AI-generated content floods the web, the tool feels much less like a means of helping the limping journalism industry along, and much more like yet another AI-generated nail in its coffin.

According to Adweek's report, Google specifically asked publishers participating in the AI test to provide a list of government agencies and fellow news organizations that produce news deemed relevant to a given participant's readership. When sites on the list publish something new, it'll automatically appear in the program; the AI will then paraphrase that material, "altering the language and style of the report to read like a news story," according to Adweek.

Notably, the agencies and publications actually doing the legwork to create that source material never consented, nor are they directly compensated. That's an important detail, considering that the general lack of a compensation model for journalists and publications whose work appears in Google's AI-generated outputs remains a common criticism of the company's AI search efforts.

The tool also reportedly doesn't require publishers to mark content as AI-generated, meaning that content created by the program may have already been published sans AI disclosures.

But Google, for its part, has denied the claim that its tool rips off the work of other journalists.

"This speculation about this tool being used to re-publish other outlets' work is inaccurate," a Google spokesperson told Adweek in a statement. "The experimental tool is being responsibly designed to help small, local publishers produce high quality journalism using factual content from public data sources — like a local government's public information office or health authority. These tools are not intended to, and cannot, replace the essential role journalists have in reporting, creating and fact-checking their articles."

Again, though, this mostly feels like lip service. Google already has tools that can alert journalists to content relevant to their reporting, like Google Alerts — tools that, importantly, don't function by paraphrasing others' published work. And based on Adweek's summary of the tool, the secretive AI service seems to sit firmly in the Sam Altman Line of Creativity Reasoning, or the belief that creative human labor is basically just algorithm-like mental remixing of the creativity we've already consumed. Or, in short: we're all just doing word and picture math with the absorbed data clunking around in our feeble mortal brains!

Adweek's report didn't name any of the publications testing the product. But it's not surprising to see Google focus its attention on local news publishers, undeniably important institutions that have unfortunately been struggling for some time now. That said, this wouldn't be the first time that small, local outlets have turned to AI as a way to skirt labor costs while generating more material.

If local journalism's path forward in a dwindling market really is using algorithms to churn out cheap content at the expense of other news providers, that's a deeply depressing vision for the institution's future.

More on AI and local news: Gannett Sports Writer on Botched AI-Generated Sports Articles: "Embarrassing"
