Google is testing a tool that uses AI to write news stories and has begun pitching it to news publications, according to a new report from The New York Times. The tech giant has demonstrated the tool to The New York Times, The Washington Post and News Corp, the owner of The Wall Street Journal.
The tool, internally code-named "Genesis," can take in information and generate news copy. Google believes it can act as a personal assistant for journalists, automating some tasks to free up time for others, and sees it as a form of "responsible technology."
The New York Times reports that some executives who saw the pitch found it "unsettling," noting that it seemed to take for granted the effort that goes into producing accurate news stories.
“In partnership with news publishers, particularly smaller publishers, we are in the earliest stages of exploring ideas to provide AI-enabled tools to help journalists with their work,” a Google spokesperson said in a statement to TechCrunch.
“For example, AI-enabled tools could help journalists with options for headlines or different writing styles,” the spokesperson said. “Our goal is to give journalists the choice to use these emerging technologies in a way that improves their work and productivity, just as we’re providing helpful tools for people in Gmail and Google Docs. Simply put, these tools are not intended to, and cannot, replace the essential role journalists play in reporting, creating and vetting their articles.”
Some news organizations, including The Associated Press, have long used AI to generate stories on topics like corporate earnings, but these pieces make up only a small fraction of their output, the vast majority of which is written by journalists.
Google’s new tool is likely to raise concerns, since AI-generated articles that aren’t properly vetted or edited could spread misinformation.
Earlier this year, American media website CNET quietly began producing articles with generative AI, a move that ended up backfiring on the company. CNET had to issue corrections on more than half of the AI-generated articles: some contained factual errors, while others may have included plagiarized material. Some of the website’s articles now carry an editor’s note reading, “The AI engine contributed to an earlier version of this article. This version has been substantially updated by a staff writer.”