“The town of Arlington is practically a news desert,” said Winston Chen, a resident and software developer. So he teamed up with David Trilling, a fellow Arlington resident and veteran foreign correspondent, to start Nano Media, a nonprofit company building a digital toolkit to automate field reporting. If the system proves its worth in Arlington, Chen and Trilling plan to bring their technology to news-hungry communities across the country at a low cost.
“We want to make the technology so scalable that anyone can raise their hand and say, 'I want to start this kind of news organization in my town,'” Chen said. “We call it local news in a box.”
This is an unsettling thought for human journalists, who have already seen digital technology ravage the industry. Since 2005, about 2,500 U.S. newspapers have closed, most of them in suburbs and small towns, according to Northwestern University's Medill School of Journalism. The Pew Research Center estimates that 57% of newspaper newsroom jobs disappeared between 2008 and 2020, largely because newspapers lost billions of dollars in advertising revenue to Google and other online giants.
But it's unclear whether computer-generated news stories will cost newspapers more jobs. After all, this technology is not new.
For years, the news agencies Reuters and the Associated Press have been publishing large volumes of computer-generated news, mostly routine stories such as high school and college sports results and corporate earnings reports. The agencies use the technology chiefly to free up human journalists to cover more complex and demanding stories.
These programs typically work by ingesting raw data from companies and sports teams, such as box scores and earnings press releases. They do not rely on generative AI programs such as ChatGPT to write the stories. Instead, the systems slot the statistics into text templates written in advance by humans, producing readable copy.
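To make the template approach concrete, here is a minimal sketch in Python of how a box score might be slotted into pre-written sentences. The data fields, team names, and wording are hypothetical illustrations, not any news agency's actual system.

```python
# Minimal sketch of template-based sports-recap generation (illustrative only;
# the field names and templates are hypothetical, not a vendor's real product).

BOX_SCORE = {
    "home_team": "Arlington High",
    "away_team": "Medford High",
    "home_score": 21,
    "away_score": 14,
    "sport": "football",
}

# Pre-written human templates; the software only picks one and fills in numbers.
TEMPLATES = {
    "home_win": "{home_team} defeated {away_team} {home_score}-{away_score} in {sport} on {date}.",
    "away_win": "{away_team} beat {home_team} {away_score}-{home_score} in {sport} on {date}.",
    "tie": "{home_team} and {away_team} played to a {home_score}-{away_score} {sport} tie on {date}.",
}

def write_recap(score: dict, date: str) -> str:
    """Pick a pre-written template based on the outcome and fill in the statistics."""
    if score["home_score"] > score["away_score"]:
        key = "home_win"
    elif score["home_score"] < score["away_score"]:
        key = "away_win"
    else:
        key = "tie"
    return TEMPLATES[key].format(date=date, **score)

print(write_recap(BOX_SCORE, "Friday night"))
```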
It doesn't always work out. Last month, newspapers in several cities that rely on Lede AI, an Ohio company whose software produces local sports stories, published game recaps that repeated the same canned phrases, describing games as “a close encounter of the athletic kind.” The stories became a national laughingstock, and the Gannett newspaper chain, which had run a number of the bizarre articles, announced it had “suspended” its use of the Lede AI system.
“As with any new technological advancement, some glitches may occur,” said a statement released by Lede AI. “We immediately began working around the clock to fix the issue and made the appropriate changes.”
Inside Arlington takes the technology a step further. It tracks town news with software that regularly scans the local government's website for newly posted reports, and it monitors the town's YouTube channel for video recordings of public meetings. Audio-to-text transcription software then generates a written record of each meeting. Suddenly, there's no need to send a reporter to the school board meeting, because a computer can cover it.
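For readers curious about the mechanics, here is a rough sketch of that kind of pipeline. The URLs are placeholders, and the tool choices (requests and BeautifulSoup for scraping, yt-dlp for pulling meeting audio, and the open-source Whisper model for transcription) are assumptions; the article does not say which software Inside Arlington actually uses.

```python
# Hypothetical sketch of the pipeline the article describes: watch a town website,
# download a meeting recording from YouTube, and transcribe it to text.

import requests
from bs4 import BeautifulSoup
import yt_dlp
import whisper

TOWN_REPORTS_URL = "https://www.example-town.gov/reports"        # placeholder
MEETING_VIDEO_URL = "https://www.youtube.com/watch?v=EXAMPLE"    # placeholder

def find_new_report_links() -> list[str]:
    """Scrape the town website for links to newly posted documents."""
    html = requests.get(TOWN_REPORTS_URL, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.select("a[href$='.pdf']")]

def download_meeting_audio(video_url: str) -> str:
    """Download one public-meeting recording from YouTube as an audio file."""
    opts = {"format": "bestaudio/best", "outtmpl": "meeting.%(ext)s"}
    with yt_dlp.YoutubeDL(opts) as ydl:
        info = ydl.extract_info(video_url, download=True)
    return f"meeting.{info['ext']}"

def transcribe(audio_path: str) -> str:
    """Turn the meeting recording into a written record."""
    model = whisper.load_model("base")
    return model.transcribe(audio_path)["text"]

if __name__ == "__main__":
    print(find_new_report_links())
    transcript = transcribe(download_meeting_audio(MEETING_VIDEO_URL))
    print(transcript[:500])
```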
“I don’t think much of this would have been possible without COVID,” Trilling said. It was the pandemic, after all, that led many cities and towns to put their public meetings online, where computers can listen in.
Once the transcript is captured, Inside Arlington runs it through ChatGPT to produce a summary, and a human double-checks the result to catch obvious errors. “All the editor has to do is go in, read it, make sure there's nothing embarrassing and press 'publish,'” Trilling said.
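A summarization step like the one Trilling describes might look roughly like this, using OpenAI's Python client. The prompt, model name, and length cap are illustrative assumptions, not Inside Arlington's actual configuration.

```python
# Hypothetical sketch of the summarization step: send a meeting transcript to an
# OpenAI chat model and get back a draft for a human editor to review.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_transcript(transcript: str) -> str:
    """Ask the model for a plain-language news brief about a public meeting."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice
        messages=[
            {"role": "system",
             "content": "Summarize this local government meeting transcript as a "
                        "short news brief. Stick strictly to what was said."},
            {"role": "user", "content": transcript[:100_000]},  # crude length cap
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    with open("meeting_transcript.txt") as f:
        draft = summarize_transcript(f.read())
    print(draft)  # an editor reads this, fixes anything embarrassing, then publishes
```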
But is the resulting article really journalism? Former journalism professor Jeff Jarvis has his doubts. Jarvis, a co-founder of Entertainment Weekly magazine and author of The Gutenberg Parenthesis, warns that AI software often generates assertions with no connection to reality. The technology news site CNET, for example, had to post corrections earlier this year to dozens of AI-generated articles that contained obvious factual errors.
Jarvis warned that AI systems have no concept of truth. “It would be irresponsible to link that to an activity where facts and truth are expected.”
Ethan Zuckerman, director of the Digital Public Infrastructure Initiative at the University of Massachusetts Amherst, warns that there are even harder questions. A competent human journalist understands the subject matter; an AI program does not. It simply uses statistics to string words together, producing stories devoid of context or human insight.
“Trying to make meaningful journalism out of this situation is one of the most difficult challenges we can imagine,” Zuckerman said.
For example, if a city official says something shocking or controversial during a public hearing, a human reporter will highlight it in the article. The AI won't notice, and will just serve up a bland meeting summary, like “a pile of unseasoned broccoli,” Zuckerman said.
Sure enough, Inside Arlington's stories are litanies of condensed facts, with no trace of insightful analysis, historical context, or wit. In short, the site makes for deadly boring reading.
Still, it provides Arlington residents with useful information about their town government, information that is otherwise very difficult to obtain.
Chen and Trilling say they recognize the limitations of AI. Serious journalism requires in-depth research and human interaction that go beyond the capabilities of computers. “At the end of the day, we will still need full-time journalists,” Chen said.
But probably not as many.
Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him @GlobeTechLab.