The value of journalism in the AI era – Mongabay

The arrival of generative artificial intelligence has unsettled journalism in familiar ways. Traffic models look fragile. Copyright is contested. The marginal cost of producing words has collapsed. It is tempting to conclude that news is becoming less valuable just as machines become more fluent.
So far for Mongabay, the opposite is happening.
AI systems are adept at rearranging existing information. They summarize, paraphrase, and answer questions at speed. What they cannot do is observe the world directly, make accountable judgments, or create new facts. Those limits are not incidental. They are built in. And they place journalism in a more central position than before.
This has become clearer to me in an unexpected way. At Mongabay, a nonprofit newsroom focused on environmental reporting, we chose not to block generative AI systems from accessing our work. Many publishers have done the opposite, citing copyright concerns, energy use, or fear of disintermediation. Those concerns are understandable, especially for commercial outlets whose business models rely on restricting access. But as a nonprofit focused on impact, our calculus is different. We already allow other outlets to republish our reporting as part of our impact strategy. If AI tools were going to answer questions about forests, fisheries, conservation, or biodiversity, it seemed better that those answers be informed by reported journalism than by thinner sources.
I assumed this would reduce traffic. If people could get what they needed from an AI interface, why would they click through?
That is not what happened. According to our analytics, ChatGPT has become one of our largest non-Google sources of traffic, reflecting users who click through from the chatbot interface to read the underlying stories. More strikingly, readers who arrive via chatbots – including ChatGPT, Perplexity.ai, Google Gemini, Microsoft Copilot, and Claude.ai – spend significantly more time with our articles than those coming from other platforms. They appear to be checking the source, reading closely, and staying.
This is not because AI has suddenly learned to value journalism. It may be because users have. When answers are plentiful and frictionless, provenance starts to matter. Readers seem to want to know where claims come from and whether someone is accountable for them.
That accountability is the core asset of journalism in the AI era. Newsrooms verify events, document evidence, and attach names and institutions to what they publish. In a world of synthetic text, images, audio, and video, verification is no longer just an internal process. It is what readers are actually looking for. As the quick visual and stylistic checks that once helped people spot AI-only content break down, that desire for accountability pushes them toward trusted institutions and clear sourcing.
Judgment also becomes more visible: AI systems optimize for likelihood and relevance, drawing on patterns in past data, while journalism decides what deserves attention in the first place. It chooses which harms to investigate, which voices to elevate, and which silences are unacceptable. These choices reflect values and public responsibility. Human judgment is imperfect, but it is accountable, anchored in ethics, transparency, and editorial review that models struggle to replicate.
Our experience may not generalize to every newsroom, but it hints at a broader shift in how people navigate information. If fully machine-generated material keeps improving, transparency and editorial oversight will matter even more, not as abstract ideals but as practical ways for audiences to evaluate what they are seeing.
Original reporting matters for the same reason. Investigations, field reporting, and beat expertise generate information that did not previously exist. AI systems depend on such work, even as they obscure that dependence. Without a steady supply of new reporting, the models have little new to learn from other than synthetic data.
Trust, meanwhile, grows scarcer. Credible news organizations explain how they know what they know and correct errors in public. AI systems are largely opaque. As uncertainty rises, transparency becomes a competitive advantage.
None of this means journalism is insulated from disruption. AI will reshape workflows, distribution, and revenue. But so far it seems to be helping clarify the role of news rather than erasing it.
When language is abundant, truth becomes harder to locate. That makes journalism—and the institutions that practice it—more necessary, not less. The task then is to keep producing reporting that earns that trust, and to make its value visible in a landscape crowded with low-quality AI outputs.
This is an expanded version of a piece I published on my blog on December 14, 2025.
Header image: Background photo of a rainforest in Brunei with an icon superimposed over it. Photo by Rhett Ayers Butler.