Despite the rise of ChatGPT and its ability to generate text, journalists remain indispensable. Prove me wrong – without using an AI-driven content generator.
I've already adapted once before: switching from radio and newspaper reporting to online media, from reporting for local audiences to writing for international ones, from thinking up metaphors for headlines to figuring out how to please Google's algorithm.
All I see in the ChatGPT era is hope for journalism: with misinformation growing exponentially, the need to verify information and determine which media outlets are trustworthy is only growing with it.
But, having spent nearly two decades in journalism, I am inevitably biased. So I turned to a few experts in different fields and asked them for their opinions on the topic.
Get your facts straight
Many experts have sounded alarm bells about the scope of misinformation in the generative AI era. I've witnessed more than a few media outlets grabbing unverified images or other content from Twitter and Telegram and simply reproducing them for bigger audiences.
While those articles might carry disclaimers that the newsroom wasn't able to independently verify the information, many people read only headlines, meaning what's inside the article doesn't really matter to them.
Some hoaxes, like the Balenciaga pope images showing the pontiff in a puffer jacket, are just there for laughs. But bad actors behind the scenes might also have a political agenda, making fact-checking even more crucial for journalists.
“It’s not just journalists falling for fake news, it’s everyone. It’s a huge issue that I hope will be a positive outcome from the world adapting to AI,” Jenny Bloom of Jive PR + Digital told Cybernews.
Bloom believes there’s an opportunity here – the media can adapt these tools to cross-reference information and verify accuracy.
“However, I have no doubt that journalists will always need to connect with reliable sources on a human level for gathering facts, quotes, and intimate details only one can share from their own experience when storytelling,” she added.
Journalism isn’t just about writing
When I was applying to study journalism, I remember many other applicants saying “I’ve always loved writing” when asked why they wanted to study journalism.
ChatGPT and similar tools are now threatening to replace those writers who just write for the sake of writing.
The packaging of the story, no matter the medium – written, audio, video, or multimedia – is, of course, very important, as it "sells" the story. Luckily, it's not at the heart of journalism.

The story itself is.
“While language models will be able to generate written content, including news reports, blogs, and summaries, it will not take the place of humans who are on the ground gathering facts and interviewing sources,” Richard Gardner, CEO at cybersecurity company Modulus, told Cybernews.
AI is not sentient (yet). It can't look a politician in the eye and ask uncomfortable questions. It can't hold governments and companies accountable for their actions. Journalists can.
I've been relying on digital tools throughout my career. I no longer transcribe my interviews myself, I use spell-check, and I've toyed with Grammarly a lot. I don't use ChatGPT to write my articles – but what if I did?
I do love writing, but for some, the task is just as mundane as transcribing interviews is to me. So why not hand those tasks to "robots" and focus on what's really important?
Whom do we really trust?
People under 30 now trust information from social media almost as much as information from national news outlets, according to research by the Pew Research Center. In the US, only 26% of adults trust news media most of the time.
Well, you know what? I don't trust the media in general either. But I do trust specific outlets and journalists who've built their authority over time by taking the time to fact-check information and, most importantly, admitting their mistakes.
“Who would you trust more as a whistleblower – an authoritative investigative journalist with years’ worth of proven integrity, or a talking digital black box?” Denis Khoronenko, PR manager at EoT Labs, an open-source development company, told Cybernews.
Gardner shares my view that AI tools will free journalists from mundane daily tasks and give them time to expand investigative journalism, which requires "critical thinking, creativity, and relationship networks, among professionals."
“The success of personality-driven projects like Joe Rogan’s podcast as well as all the household names across the media spectrum show that humans like personality, a human touch to a story,” Khoronenko said. “They relate to other humans, their unique experiences and deep emotions. It’s hard to see a digital model, trained to do believable chatter, delivering on that front.”
This reinforces my hope that journalists can thrive in an AI world. Of course, that depends on them embracing the AI revolution rather than dreading its arrival.
Finally, in flagrant disregard of my own challenge at the top of this article, I turned to ChatGPT for a concise answer on this difficult topic. Here's what it gave me.
“Yes, we will still need journalists in the ChatGPT era for their critical thinking, creativity, and ability to provide independent, investigative, and diverse coverage,” the machine told Cybernews. “While AI can assist with some tasks, journalists offer valuable context, analysis, and human interest perspectives that AI cannot. Therefore, while some aspects of journalism may be automated, journalists will continue to play a crucial role in informing and engaging audiences.”