By now, the horror stories about news publishers mass-producing error-filled AI-generated content are well known. Other uses of AI are helping journalists make headlines for the right reasons – such as The New York Times using computer vision on satellite imagery to count bomb craters in Gaza.
But beyond the most high-profile wins and faceplants, a modest but impactful set of generative artificial intelligence applications is quietly transforming newsrooms around the world. Generative AI presents new opportunities to enhance the reporting process and the storytelling itself. From assisting with copy editing to surfacing insights buried under vast mountains of data, the latest breed of AI can give – and in some cases already has given – journalists powerful tools to improve their craft. And as the technology evolves, it promises to reshape journalism and the news business, with the potential to enhance accuracy, efficiency, and depth of coverage.
A 2023 survey of 105 news organizations in 46 countries found that the majority see potential benefits for journalism in generative AI tools like ChatGPT. In the survey, conducted by JournalismAI, a global initiative to keep news organizations informed about AI, nearly three-quarters of respondents said such AI applications would bring new opportunities to the field. Additionally, 85% of respondents – including journalists, technologists, and newsroom managers – had experimented with using AI for tasks such as creating images or generating article summaries.
Semafor, launched six weeks before ChatGPT, is one such experimental newsroom. The media startup uses two AI tools: an internal editing bot and MISO. The latter helps power Signal, a breaking news feed curated by journalists with the help of AI that highlights news from publications around the world. Semafor's approach is to have a clear picture of AI's current capabilities and to limit its use to those areas, said Gina Chua, Semafor's executive editor.
“Essentially, it's an English major that can do a lot of things,” Chua said.
Beyond those tools, she has been experimenting with what is known as retrieval-augmented generation, or RAG. This technique lets an AI chatbot pull relevant information on the fly from a library of data that Chua provides and prioritize it when generating responses. The approach helps combat so-called hallucinations – instances where generative AI models make things up.
Chua built one RAG tool using the Department of Justice's definition of a hate crime and fed the chatbot specific scenarios, such as a white man attacking an Asian woman. The chatbot could label a scenario as a hate crime or not and explain the reasoning behind that call. She gave a second model the Trans Journalists Association's style and reporting guidelines, then asked the AI to provide feedback on published articles and the level of misinformation about trans issues they contained. Both worked well, and Chua is considering building RAG models to help Semafor's journalists navigate how to report on sensitive subjects and quickly understand the substance of a story.
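To make the mechanics concrete, here is a minimal sketch of how a retrieval-augmented setup along these lines can be wired together, assuming a TF-IDF retriever from scikit-learn and the OpenAI chat API; the guidelines.txt file, model name, and prompt are illustrative placeholders rather than Semafor's actual tooling.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Assumptions: scikit-learn for retrieval, the OpenAI chat API for generation,
# and a local "guidelines.txt" holding the reference material (for example, a
# style guide or a legal definition), one passage per blank-line-separated block.
from pathlib import Path

from openai import OpenAI
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Load and index the reference passages so relevant ones can be retrieved per question.
passages = [p.strip() for p in Path("guidelines.txt").read_text().split("\n\n") if p.strip()]
vectorizer = TfidfVectorizer(stop_words="english")
passage_vectors = vectorizer.fit_transform(passages)

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def answer(question: str, k: int = 3) -> str:
    """Answer a question using only the k most relevant reference passages."""
    # Retrieve the passages most similar to the question.
    scores = cosine_similarity(vectorizer.transform([question]), passage_vectors)[0]
    context = "\n\n".join(passages[i] for i in scores.argsort()[::-1][:k])

    # Ask the model to stick to the retrieved context, which is what
    # helps keep it from making things up.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": "Answer using only the reference material provided. "
                           "If the material does not cover the question, say so.",
            },
            {"role": "user", "content": f"Reference material:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content


print(answer("A white man attacks an Asian woman. Does this meet the definition of a hate crime?"))
```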
Chua built these chatbots in her spare time using easy-to-use tools. She said most journalists should experiment with AI rather than dismiss it because it isn't better than a human at certain tasks.
“They're probably not as good as humans in some ways,” she said. “It's a mistake to say, ‘I want this to be a human.’ The trick is to say, ‘I want this to be as good a tool as possible – how do I complement it, and how does it complement me?’”
Chatbot drives subscriptions
The most common way people interact with generative AI is through chatbots such as OpenAI's ChatGPT or Google Gemini. Newsrooms are also building chatbots for readers, including Skift, a business news website covering the travel industry.
Jason Clampet, Skift's co-founder and president, recalled that when ChatGPT was only a few weeks old, Skift CEO Rafat Ali told an audience of travel experts, “This is going to be big, and we need to start working on it now.”
Clampet took that advice to heart, and Skift engineers quickly developed an AI assistant called Ask Skift. The chatbot, trained on Skift's 30,000 news articles and reports, answers users' travel questions and suggests existing Skift articles for further reading.
“This is a great way for us to find ways to cover stories,” Clampet said. “Another news site might just write about something as a knowledgeable observer. We can think, ‘Oh, I see what the issue is here.’”
Ask Skift currently answers thousands of questions each week. The chatbot works much like Skift's paywall, letting users ask three questions for free before prompting them to become paid subscribers. That prompt has already translated into annual subscriptions, which run about $365, Clampet said.
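The metering itself is simple to implement. A toy sketch of a three-question limit followed by a subscription prompt, assuming an in-memory counter and a hypothetical generate_answer backend, might look like this:

```python
# Toy sketch of a metered chatbot paywall: three free questions, then a
# subscription prompt. The in-memory counter and generate_answer() stub are
# hypothetical stand-ins for a real user store and chatbot backend.
FREE_QUESTION_LIMIT = 3

question_counts: dict[str, int] = {}


def generate_answer(question: str) -> str:
    # Placeholder for the retrieval-plus-generation step (see the RAG sketch above).
    return f"(answer to: {question})"


def handle_question(user_id: str, question: str) -> str:
    """Answer if the user has free questions left; otherwise prompt them to subscribe."""
    used = question_counts.get(user_id, 0)
    if used >= FREE_QUESTION_LIMIT:
        return "You've reached your free limit. Subscribe to keep asking questions."
    question_counts[user_id] = used + 1
    return generate_answer(question)


# The fourth question triggers the subscription prompt.
for _ in range(4):
    print(handle_question("reader-123", "Why are flights so expensive right now?"))
```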
He said Skift monitors reader questions to identify trends and potential story ideas, which has led to new coverage such as an article on why travel costs have risen so much.
“It's pretty much the same way you study Google Trends to find out what people are up to and what they're looking for,” Clampet said. “This is how we know this is what people want, and here's how we can take this theme further.”
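In that spirit, here is a small illustrative sketch of mining a log of reader questions for recurring terms; the sample questions, stopword list, and keyword counting are assumptions for the example, not Skift's actual pipeline.

```python
# Illustrative sketch of spotting trends in logged reader questions.
import re
from collections import Counter

question_log = [
    "Why have flight prices gone up so much?",
    "Why are hotel prices so high this summer?",
    "Are flight prices going to drop?",
]

STOPWORDS = {"why", "are", "so", "the", "to", "have", "this", "going", "much", "up"}


def top_terms(questions: list[str], n: int = 5) -> list[tuple[str, int]]:
    """Count the most frequent non-stopword terms across logged questions."""
    words = []
    for q in questions:
        words += [w for w in re.findall(r"[a-z]+", q.lower()) if w not in STOPWORDS]
    return Counter(words).most_common(n)


print(top_terms(question_log))  # e.g. [('prices', 3), ('flight', 2), ...]
```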
Ask Skift is just the company's first generative AI experiment. The engineering team has also released an app that lets users ask it questions in Slack, the widely used office chat platform. Clampet said the technology is becoming easier to use every day and more ideas are under consideration.
“A lot of it is just trial and error and figuring out how to get a little bit better each time,” he said.
“Impatient to get ahead”
The high-profile failure at Sports Illustrated is just the tip of the iceberg. Nearly a dozen other news organizations have published AI-generated articles containing errors, an issue extensively documented by the tech site Futurism.
Felix M. Simon, a communications researcher and doctoral student at the Oxford Internet Institute, says failures like these appear to stem from companies rushing to deploy generative AI features first, without a thorough vetting process.
“What you see from the outside is that in each case, there was a rush to be proactive and implement AI as quickly as possible,” he said.
Simon said journalist involvement is essential during production and before publication. The need for human oversight is also one of the reasons journalists must learn how to work with AI tools.
“There's going to be a learning curve,” Simon said. “If you want to work with these systems in the first place and identify their strengths as well as their weaknesses, you need to get used to working with these systems.”
“AI is a tool. It's just a tool.”
While human oversight remains important, Simon cautioned against overstating the threat of job losses to AI; it is unlikely to be the reason journalists are replaced in the short term. Deeper systemic issues, such as pressure to cut costs and the lack of sustainable business models, were driving mass layoffs of journalists long before the advent of generative AI.
However, as AI technology advances, it may eliminate the need for some human assistance, and even its current capabilities could reduce staffing needs. Simon said management could also use AI to justify further job cuts.
Some worry that relying on big tech companies will exacerbate news organizations' already dire woes. As Rodney Gibbs, senior director of strategy and innovation at the Atlanta Journal-Constitution, wrote for Nieman Lab late last year, conversations about ChatGPT carry echoes of the “pivot to video” era of digital media, when many newsrooms lost their way chasing Facebook.
Nikita Roy, director of the AI Journalism Lab at the Craig Newmark Graduate School of Journalism, has heard this concern but doesn't put much stock in it. This time, newsrooms aren't relying on technology companies to build their audiences, she says. Instead, newsrooms are the customers buying the product, which gives news organizations more power in the relationship.
Importantly, Roy said, AI tools are easy enough to use that even small newsrooms can benefit. Used correctly, AI could be a boon to an industry that is perpetually struggling.
“AI is a tool,” she said. “It's just a tool. But it's a tool that helps us do more work and reach a larger audience, and those are the two things we need to do.”