Impakter

Legal Loopholes Don’t Help Victims of Sexualised Deepfakes Abuse

Sexual deepfake abuse silences women and causes lasting harm, yet the laws meant to protect them remain inconsistent

By Anastasia Powell, Professor of Family & Sexual Violence in Criminology and Justice Studies at RMIT University; Asia Eaton, Professor of Psychology at Florida International University; and 2 others
April 2, 2024
in AI & MACHINE LEARNING, TECH

In early 2024, pop megastar Taylor Swift became the centre of a disturbing controversy. Millions of sexually explicit deepfake images of her flooded social media, raising concerns about the misuse of this Artificial Intelligence (AI) technology. Only after one image had been viewed more than 47 million times did the social media platform X (formerly Twitter) remove the content.

Swift’s case provided a wake-up call to how easy it is for people to take advantage of generative AI technology to create fake pornographic content without consent, leaving victims with few legal options and experiencing psychological, social, physical, economic, and existential trauma.

The trend began in 2017, when a Reddit user uploaded realistic, but entirely fabricated, sexual imagery of female celebrities superimposed onto the bodies of pornography actors.

Seven years on, nudify apps are readily accessible and advertised freely on people’s social media feeds, including Instagram and X. In Australia, a Google search of “free deepnude apps” brings up about 712,000 results.

A 2019 survey conducted across the UK, Australia and New Zealand found 14.1% of respondents aged between 16 and 84 had experienced someone creating, distributing or threatening to distribute a digitally altered image representing them in a sexualised way. People with disabilities, Indigenous Australians and LGBTQI+ respondents, as well as younger people between 16 and 29, were among the most victimised.

Sensity AI has been monitoring online sexualised deepfake video content since 2018 and has consistently found that around 90% of this non-consensual video content featured women.

What happened to Swift is sadly nothing new, as there have been numerous reports of sexualised deepfakes being created and shared involving women celebrities, young women and teenage girls.

Legal ambiguities

These digitally manipulated images pose significant ethical and legal challenges, prompting a reevaluation of existing laws and responses to such abuses.

This is complicated on platforms with encrypted content, such as WhatsApp, where deepfakes may be shared without fear of detection or moderation.

This has been recognised in a variety of forums, including the public campaigning of victim-survivors following deepfakes of Swift, comments from the Australian Federal communications minister and the US House of Representatives’ March 12 hearing on the harms of sexualised deepfake abuse.

Australia has led the way in criminalising image-based abuse and addressing its harms. It also provides alternative avenues, such as the image-based abuse victim reporting portal run by the eSafety Commissioner, who has legal powers to compel individuals, platforms and websites to remove sexualised deepfake content.

Except for the state of Tasmania, the distribution or threat to distribute sexualised deepfakes of an adult without their consent is captured under Australia’s existing image-based abuse laws.

However, the non-consensual production or creation of a sexualised deepfake of an adult is not specifically captured under Australian law, except in the state of Victoria.

Elsewhere in Australia, there is much ambiguity as to whether non-consensually creating or producing a sexualised deepfake of another adult is a crime, and whether possession of such non-consensual content is a crime.

It’s the same around the world.

In the UK, the Online Safety Act 2023 criminalises the non-consensual sharing or threat to non-consensually share a sexualised deepfake of an adult, but it does not include the production or creation of sexualised deepfake imagery.

In the US, there is currently no national law criminalising either the creation or distribution of sexualised deepfake imagery of an adult without consent. However, much like Australian states and territories, some states have criminalised the non-consensual distribution of sexualised deepfake imagery of adults, and at least three states — Hawaii, Louisiana and Texas — have amended laws to include the non-consensual creation of sexualised deepfake imagery.

A UK review ranked deepfakes as the most serious social and criminal threat involving AI. With claims that open-source technology capable of producing deepfakes "impossible to detect as fake" will soon be freely accessible to all, there is a pressing need to improve legal and other responses.

A new Australian Research Council study seeks to do just this, exploring sexualised deepfake abuse, including the number of victims and perpetrators, the consequences, predictors and harms across Australia, the UK and the US, with a primary focus on improving responses, interventions and prevention.



The ambiguity around the illegality of creating, producing and possessing non-consensual sexualised deepfake imagery of adults suggests that further legal change is required to provide more appropriate responses to sexualised deepfake abuse.

It may also go some way towards curbing the accessibility of sexualised deepfake technologies. If creating or producing non-consensual deepfake imagery were illegal, the scope for such technologies, like the nudify apps, to be openly advertised would likely shrink.

It is important that any new or amended laws are introduced alongside other responses which incorporate regulatory and corporate responsibility, education and prevention campaigns, training for those tasked with investigating and responding to sexualised deepfake abuse, and technical solutions that seek to disrupt and prevent the abuse.

Responsibility should also be placed onto technology developers, digital platforms and websites who host and/or create the tools to develop deepfake content to ensure safety by design and to put people above profit.

Outside of sexualised deepfake abuse, there is a pressing need for guidelines around the responsible creation of deepfake content, whether to avoid the spread of disinformation or to avoid gendered or racial bias, as was the case with the sexually altered image of Victorian MP Georgie Purcell.

Italian Prime Minister Giorgia Meloni is seeking €100,000 in damages from two men after deepfake pornographic images using her face were circulated online. According to Meloni’s lawyer, the money was “symbolic” and the demand for compensation was intended to empower women to not be afraid to press charges.

As identified elsewhere, a key problem with AI image manipulation is that these tools rely on the biased social norms and information that our human society has generated.

As the use of AI and digital image manipulation becomes more mainstream, such as the controversial British royal family edited photo, there should be ethical guidelines around how deepfake content is created, shared and discussed.

There are long-standing legal precedents in many countries for regulating deceitful, harmful expressions that others perceive as true. Guidelines and regulations pertaining to the ethical and responsible creation of deepfake content could sit within a similar framework.

Given the truly transnational nature of this challenge, it is important that global action, collaboration and responses are facilitated, which focus on preventing harm and ensuring responsible and ethical content development.

A global approach is vital if society truly wants to address and prevent the harms of sexualised deepfake abuse.


This article was originally published by 360info™.


Editor’s Note: The opinions expressed here by the authors are their own, not those of Impakter.com — Cover Photo Credit: Freepik.

Tags: AI, artificial intelligence, deepfake images, deepfakes, deepfakes abuse, generative AI, Taylor Swift
Impakter is a publication identified by International Standard Serial Number (ISSN) 2515-9569 (print) and 2515-9577 (online).


© 2026 IMPAKTER. All rights reserved.
