AI’s Threat to Humanity Rivals Pandemics and Nuclear War, Industry Leaders Warn


It is a truism: AI has immense potential but comes with significant risks. What is AI really capable of and what are governments doing to mitigate the risks?

by Alina Liebholz
June 2, 2023
in AI & Machine Learning, Society, Tech

The Center for AI Safety released the following statement on its webpage: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

We’ve released a statement on the risk of extinction from AI.

Signatories include:
– Three Turing Award winners
– Authors of the standard textbooks on AI/DL/RL
– CEOs and Execs from OpenAI, Microsoft, Google, Google DeepMind, Anthropic
– Many more https://t.co/mkJWhCRVwB

— Center for AI Safety (@ai_risks) May 30, 2023

Among the signatories of the statement are Sam Altman, chief executive of ChatGPT-maker OpenAI; Demis Hassabis, chief executive of Google DeepMind; Dario Amodei of Anthropic; and the so-called godfathers of AI, Dr Geoffrey Hinton and Yoshua Bengio.

According to the Center for AI Safety, some of the most significant risks posed by AI include the weaponisation of AI technology, power-seeking behaviour, human dependence on machines (as depicted in the Disney-Pixar film WALL-E) and the spread of misinformation.

In a recent blog post, OpenAI proposed that the regulation of superintelligence should be similar to that of nuclear energy. “We are likely to eventually need something like an IAEA [International Atomic Energy Agency] for superintelligence efforts,” the firm wrote.

In March, an open letter signed by Elon Musk, Apple co-founder Steve Wozniak and a handful of other big names in tech called for a six-month pause on AI development, citing fears that the technology could become a threat to humanity.

The letter, which was published by the Future of Life Institute, received over 31,000 signatures, although some of these are said to have been forged.


Related Articles: Who Is Liable if AI Violates Your Human Rights? | ChatGPT and Me: On the Benefits and Risks of Artificial Intelligence | Artificial Intelligence: How Worried Should We Be? | ‘I am a Machine, With no Soul or Heart’: An Interview With Artificial Intelligence

Furthermore, in a Senate hearing on the oversight of AI earlier this month, OpenAI CEO Sam Altman said: “I think if this technology goes wrong, it can go quite wrong. And we want to be vocal about that. We want to work with the government to prevent that from happening, but we try to be very clear-eyed about what the downside case is and the work that we have to do to mitigate that.”

A Distraction From Imminent Risks of AI?

Other AI scientists and experts, however, see these statements as overblown. Some even say they distract from more imminent problems AI poses, such as algorithmic bias, the spread of misinformation and invasions of privacy.

In fact, “current AI is nowhere near capable enough for these risks to materialise. As a result, it’s distracted attention away from the near-term harms of AI,” Arvind Narayanan, a computer scientist at Princeton University, told the BBC.

This is absolutely correct.
The most common reaction by AI researchers to these prophecies of doom is face palming. https://t.co/2561GwUvmh

— Yann LeCun (@ylecun) May 4, 2023

New AI products are constantly being released due to the ongoing advancements in the field. Ultimately, it’s crucial to address both potential and current harms.

“Addressing some of the issues today can be useful for addressing many of the later risks tomorrow,” said Dan Hendrycks, director of the Center for AI Safety.

In April 2021, the European Union (EU) proposed a bill on rules for artificial intelligence. The bill, expected to be finalised in June 2023, will introduce new transparency and risk-management rules for AI systems while supporting innovation and protecting citizens. 

In a press release regarding the new AI law, the EU stated: “AI systems with an unacceptable level of risk to people’s safety would be strictly prohibited, including systems that deploy subliminal or purposefully manipulative techniques, exploit people’s vulnerabilities or are used for social scoring (classifying people based on their social behaviour, socio-economic status, personal characteristics).”

“We are on the verge of putting in place landmark legislation that must resist the challenge of time,” said Brando Benifei, Member of the European Parliament, following the vote on the new regulation.

The US, Canada, the UK, South Korea and many other countries have also produced bills and white papers on AI regulation. Furthermore, the G7 have established a working group on the challenges of AI technology, with their first meeting having taken place on May 30. 


Editor’s Note: The opinions expressed here by the authors are their own, not those of Impakter.com — In the Featured Photo: White Robot. Featured Photo Credit: Possessed Photography.

Tags: AI, AI Governance, AI Legislation, Artificial Intelligence, ChatGPT, Risks of AI