Published: 08:57 AM, 10 October 2025

How ChatGPT Is Reshaping Journalism and Raising Alarms

Pranto Chatterjee

It has been less than three years since ChatGPT entered the public consciousness, yet its influence on journalism has been seismic. Newsrooms across the world are wrestling with a new reality: a machine that can write, translate, summarize, and even imitate the cadence of professional reporters. At first glance, this looks like a godsend. Imagine cutting newsroom costs, automating tedious briefs, and pushing out stories in record time. Reuters, for instance, has been using automation to publish thousands of corporate earnings reports each year, while The Associated Press has leaned on AI for transcription and data-heavy news since 2014. But here is the paradox. What seems like a revolution in efficiency could easily morph into journalism’s slow decline. Because if the very foundation of journalism rests on human accountability and judgment, how do you trust a black-box algorithm that cannot explain why it wrote what it wrote?

Let us start with the obvious advantages. ChatGPT can accelerate newsroom workflows in ways no intern or junior reporter could dream of. According to a study by Sidorenko and colleagues (2024), between December 2022 and late 2023, more than 18,000 news items globally referenced ChatGPT, reflecting not just industry buzz but a measurable reshaping of media routines. Data-heavy tasks such as parsing court transcripts or analyzing climate studies, once reserved for hours of human labor, can now be done in minutes. Local papers with shrinking staff can suddenly cover town hall meetings with AI-generated summaries. Translation is another game changer. BBC and Al Jazeera have experimented with using generative AI to translate and localize stories into dozens of languages, opening access to audiences who were previously excluded by linguistic barriers. In regions with limited media resources, this can feel like democratization at scale.

The economic case is equally compelling. Media is a business. Cost efficiency is no small thing when advertising revenues are shrinking and readers are reluctant to pay for subscriptions. Some outlets report savings of up to 30 percent on social media and newsletter production when using AI. For nonprofit outlets, that money can be redirected into investigative journalism, the kind that still requires shoe leather and FOIA requests. Data journalism too is gaining muscle. From Reuters visualizing climate trends in real time to the Texas Tribune mapping funding disparities across schools, AI is amplifying journalistic capacity to parse complexity.

But if you feel excited, pause for a moment. Ask yourself this: when was the last time you read a piece and wondered if a human actually wrote it? What happens when an entire industry begins to outsource not just data crunching but storytelling itself? Journalism is not merely about producing words. It is about responsibility. It is about explaining why a story matters, who was consulted, and what was left out. An AI can generate 1,000 words on a protest within seconds, but can it tell you whether it smelled the tear gas or noticed the trembling in a mother’s voice as she described losing her son? The truth is, no. Empathy and accountability cannot be outsourced.

Accuracy is another fault line. CNET tried publishing AI-generated finance articles, only to retract dozens when errors—some laughable, some serious—surfaced. AI “hallucinations,” the polite term for making things up, remain rampant. Studies confirm that ChatGPT struggles with real-time data, numbers, and citations. It might confidently assert that a law exists when it doesn’t. It might attach a respected journalist’s name to a fabricated piece. In journalism, where credibility is currency, such flaws are lethal. A factual error from a reporter can be traced back to sources, to notes, to a process. A factual error from an AI? Good luck demanding an explanation from a black box.

Bias is an equally insidious threat. Since generative AI is trained on oceans of internet text, it reflects the prejudices embedded in that data. Gender stereotypes, racial biases, political leanings—all of it seeps into the model’s outputs. When Bloomberg tested AI on gender pay gap stories, the system downplayed structural inequality, mirroring corporate narratives. Left unchecked, this risks cementing systemic distortions as neutral “truths.” If journalism’s role is to challenge power, how can it fulfill that duty when the tool itself parrots bias from entrenched powers?

And then comes the existential threat. Journalism thrives on originality, context, and analysis—qualities AI currently lacks. As the Financial Times’ first AI editor, Madhumita Murgia, bluntly put it, “It is not original. It is not breaking anything new.” Machines can remix the past, but only humans can unearth the future. Investigations that topple governments or expose corruption will not emerge from predictive text engines. Yet news companies tempted by cost savings may prioritize quick AI content over deep human reporting. Already, layoffs at BuzzFeed and Insider have coincided with AI adoption. Do we risk a world where algorithms replace thousands of journalists, and in doing so, erase the very guardians of democracy?

This is not just about jobs. It is about the integrity of the information ecosystem. Think of the flood of AI-generated content already populating the internet. Some experts predict that by 2026, 90 percent of online content could be machine-generated. How will the public distinguish authentic reporting from synthetic spin? If journalism becomes indistinguishable from algorithmic chatter, the cries of “fake news” from the past decade will seem almost quaint. Without credibility, news institutions collapse. Without institutions, democracy itself teeters.

So what is the path forward? The most thoughtful voices argue for a hybrid model. AI should serve as assistant, not author. The Associated Press, for example, insists all AI outputs be reviewed by humans before publication. NBC has built frameworks for responsible use, ensuring transparency when AI is involved. UNESCO is drafting global ethical guidelines for AI in media. Some outlets already disclose when a bot contributed to a story. These steps matter because they preserve accountability—the lifeblood of journalism.

Yet, regulation and newsroom policies alone may not be enough. Perhaps we must return to first principles. Journalism is not just the craft of writing. It is an act of trust. Readers grant journalists legitimacy not because they write beautifully, but because they can be held accountable. A journalist can be questioned about sources, motives, and omissions. A machine cannot. That single fact should give us pause. If something did not come from a human mind, can we still call it journalism?

Of course, one could argue that journalism has always adapted to new technologies. The printing press disrupted scribes. The radio rattled newspapers. The internet dismantled classifieds. Each time, the field survived by reasserting its human value: context, analysis, and judgment. Maybe this is just another chapter in that story. Or maybe, just maybe, ChatGPT represents a rupture unlike any before. After all, never before has technology produced words—our very raw material—so convincingly human.

Consider this carefully. Will we embrace ChatGPT as a tool that strengthens journalism, or will we allow it to erode the profession until it becomes indistinguishable from content marketing and propaganda? The answer lies not in the machine, but in us. If we demand transparency, accountability, and originality, AI will remain an assistant. If we chase efficiency at all costs, we risk hollowing out the soul of journalism itself.

ChatGPT is not the end of journalism. But it could be the end of journalism as we know it. The stakes could not be higher. In a world drowning in information, the last thing we can afford is to lose the human voice that helps us make sense of it all.


Pranto Chatterjee is an electrical engineer currently pursuing an MSc in Autonomous Vehicle Engineering at the University of Naples Federico II in Italy.


