
AI in Journalism and Democracy: Can We Rely on It?

GenAI tools are reshaping the information environment in ways most audiences never see. From the data that trains them to the labour that maintains them, their inner workings raise urgent questions for journalism and democratic accountability.

by Dr Jake Goldenfein, University of Melbourne; Dr Fan Yang, University of Melbourne; and Daniel Angus, Professor at the Queensland University of Technology and Director of its Digital Media Research Centre
November 26, 2025
in AI & MACHINE LEARNING, Politics & Foreign Affairs, TECH

Our world is in the midst of a disruption triggered by the development of Artificial Intelligence (AI). Companies selling AI tools have become the most valuable corporations in modern times, worth trillions of dollars — more than the GDPs of most countries. They are becoming a pervasive influence on social, commercial, and political life, and shaking up industries.

The media industry is among those facing new kinds of challenges due to the rise of AI. The practice and delivery of journalism, a vital component of functioning, healthy democracies, is changing in ways that are not obvious to its consumers.

Understanding the impact of AI on our information environment, and its political consequences, requires a basic grasp of what generative AI (GenAI) is and how it works. We need to “lift the bonnet” on the systems that will increasingly power the information we receive and consume.

Data: The engine powering generative AI

The development of GenAI begins with collecting vast amounts of data — including text, images, videos, and sounds — by crawling and scraping the internet. Everything from journalism and academic outputs to the public web and text chats is gathered as data. This is bolstered by compilations of literature accessed, not always legally, through commercial licensing arrangements with media repositories.
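
To make the scraping step above concrete, here is a minimal sketch in Python that fetches a couple of public pages and keeps only their visible text. The URLs and the record format are placeholders invented for this illustration; real crawls cover billions of pages and are far more elaborate than this.

```python
# Minimal sketch of web text collection, reduced to a toy example.
# The URLs below are placeholders; real crawls cover billions of pages.
import requests
from bs4 import BeautifulSoup

SEED_URLS = [
    "https://example.com/news/article-1",   # hypothetical page
    "https://example.com/blog/post-2",      # hypothetical page
]

def scrape_page(url: str) -> dict:
    """Download one page and keep only its readable text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Drop scripts, styles and navigation chrome; keep the visible text.
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    return {"url": url, "text": soup.get_text(separator=" ", strip=True)}

if __name__ == "__main__":
    corpus = [scrape_page(url) for url in SEED_URLS]
    print(f"Collected {len(corpus)} documents")
```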

The legitimacy of these forms of data collection is still unclear and has led to high-profile copyright and privacy litigation around the world. It has also triggered policy and regulatory debates about the legal conditions for accessing data, and loud complaints from creatives whose labour has become the basis of the vast revenues of the new multinational AI tech firms.

For these AI technologies, access to data itself is not enough. The data has to be converted into training datasets, a step that involves a range of computational processes and a great deal of human labour. To make data meaningful in AI training, data workers have to label, clean, tag, annotate and process images and text, creating semantic links that enable GenAI models to produce meaningful responses to user “prompts.”
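
As a rough feel for what that data work involves, the toy sketch below turns one raw scraped passage into a cleaned, labelled training record. The field names and labels are invented for this example; in practice this labelling happens at enormous scale and is largely done by human annotators.

```python
# Toy illustration of cleaning and labelling one passage for a training set.
# Field names and labels are invented for this example.
import re

def clean(text: str) -> str:
    """Normalise whitespace and strip leftover HTML entities."""
    text = re.sub(r"&[a-z]+;", " ", text)   # crude entity removal
    return re.sub(r"\s+", " ", text).strip()

def annotate(raw_text: str, topic: str, language: str) -> dict:
    """Package one passage as a labelled training example."""
    return {
        "text": clean(raw_text),
        "topic": topic,              # label assigned by a human annotator
        "language": language,
        "safe_for_training": True,
    }

example = annotate(
    "Inflation&nbsp;rose to 4.1% in the   June quarter, the bureau said.",
    topic="economy",
    language="en",
)
print(example)
```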

Much of this data work is outsourced to lower-cost countries such as Kenya, India and China, where workers are paid low wages and face poor labour standards. Those datasets are then used to train AI models through the process of machine learning.

Lifting the veil on how generative AI works

Machines don’t learn like humans do. What we call “machine learning” is essentially a process of statistical pattern recognition. While there are many differing approaches to model training, in most cases it involves successive adjustments to vast numbers of internal values (the model’s parameters, or weights). This process is iterative, meaning the training repeats until the predictions are sufficiently close to the expected results.
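
A stripped-down sketch of that loop is below: a model with a single internal value is nudged repeatedly until its predictions sit close to some made-up training pairs. Real systems adjust billions of such values at once, but the iterative shape is the same.

```python
# Minimal sketch of "machine learning" as iterative statistical fitting:
# repeatedly adjust an internal value until predictions match the data.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]   # made-up (input, target) pairs

weight = 0.0          # the model's single internal value (real models have billions)
learning_rate = 0.01

for step in range(1000):                # repeat until predictions are close enough
    error_gradient = 0.0
    for x, target in data:
        prediction = weight * x         # the model's guess
        error_gradient += 2 * (prediction - target) * x
    weight -= learning_rate * error_gradient / len(data)   # small correction

print(f"learned weight is roughly {weight:.2f}")  # close to 2, the pattern in the data
```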

Once trained, models like those that power ChatGPT can, when prompted to, for example, “write a short news story on inflation figures,” generate a sequence of tokens (word fragments) that statistically resembles similar stories seen during model training.

Critically, systems such as ChatGPT do not understand the world they depict or describe. They do not possess semantic knowledge, meaning they can’t understand facts or concepts such as what “inflation” means or what a “street protest” looks like. Instead, the machines are pattern-modelling engines that predict what content would most plausibly complete or correspond to a given prompt. In sum, the AI output is simply a function of scale and training data — not comprehension.
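
One way to see what “predict the most plausible continuation” means, without any comprehension involved, is a toy next-word model built purely from counted word pairs. The miniature corpus below is invented for illustration, and the method (a simple bigram table) is a stand-in for the far larger neural networks behind tools like ChatGPT, not a description of how those models work internally.

```python
# Toy next-word predictor: pure pattern counting, no understanding of meaning.
from collections import Counter, defaultdict
import random

corpus = (
    "inflation figures rose sharply . inflation figures eased slightly . "
    "the protest filled the street . the street protest grew larger ."
).split()

# Count which word follows which (a bigram table).
followers = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    followers[current_word][next_word] += 1

def continue_text(prompt_word: str, length: int = 5) -> str:
    """Extend a prompt by repeatedly picking a statistically plausible next word."""
    words = [prompt_word]
    for _ in range(length):
        options = followers.get(words[-1])
        if not options:
            break
        choices, counts = zip(*options.items())
        words.append(random.choices(choices, weights=counts)[0])
    return " ".join(words)

print(continue_text("inflation"))  # plausible-sounding, but nothing is "known" or verified
```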


What does generative AI mean for journalism?

The predictive capacity that makes generative AI powerful also makes it unreliable. Prediction is not verification. These systems fill gaps with what sounds or looks right, not necessarily with what is right.

A generative AI model can, in seconds, write with fluency, summarise lengthy reports, or rephrase complex passages. It can produce images of events that appear photorealistic. But those outputs are the products of predictive tools, not of verification. When AI is trained on biased or incomplete data, it is known to “hallucinate” content that looks and sounds right but is inaccurate or unreliable.

That distinction matters profoundly for journalism, which depends on truth and verification rather than plausibility.

For journalists and audiences alike, the risk lies in not being able to verify AI-generated content. As ever more AI-generated content is pushed into the information ecosystem without clear labelling or context, it contributes to a media environment where the difference between reporting and simulation, and between fact and fabrication, becomes increasingly difficult to discern.

Journalism’s future will depend on whether institutions can adapt to, and meaningfully govern, the use of AI. That means not only developing new editorial standards and verification practices, but also putting much greater effort into ensuring that the data, labour, and energy sustaining these systems are made visible and accountable.
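
As one illustration of what clearer labelling and verification could look like inside an editorial workflow, the sketch below attaches a simple provenance flag to each content item and blocks publication of machine-generated items that no human has checked. The field names and the rule itself are hypothetical; they are not an existing standard or any newsroom’s actual policy.

```python
# Hypothetical editorial gate: machine-generated content must be labelled
# and human-verified before publication. Field names are invented for this sketch.
from dataclasses import dataclass

@dataclass
class ContentItem:
    headline: str
    body: str
    ai_generated: bool           # provenance label set when the item is created
    human_verified: bool = False

def can_publish(item: ContentItem) -> bool:
    """Allow publication only if AI-generated content has been human-verified."""
    if item.ai_generated and not item.human_verified:
        return False
    return True

draft = ContentItem("Inflation eases in June quarter", "Draft body text.", ai_generated=True)
print(can_publish(draft))        # False until a human checks and verifies it
draft.human_verified = True
print(can_publish(draft))        # True
```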

The question is not whether AI will reshape journalism. It already has. The question is whether it is possible for democratic societies to prevent AI from undermining trust in public institutions.

For those of us concerned about where our information and journalism comes from (its provenance), our ability as humans to check and verify information cannot match the lightning speed at which chatbots spit out dodgy text, data and images. Unless we humans can develop protocols and methods to regain control, oversight and checks before sharing machine outputs, we face the further erosion of the bedrock of society – agreed facts that allow rational thinking and consequent behaviours.


This article was originally published by 360info™.


Editor’s Note: The opinions expressed here by the authors are their own, not those of Impakter.com — Cover Photo Credit: Fardad sepandar

Tags: AI, artificial intelligence, ChatGPT, Democracy, GenAI, Generative AI, Information Environment, Journalism, media industry

© 2025 Impakter.com owned by Klimado GmbH