Impakter
Why Government Involvement in Content Moderation Could Be Problematic

by Meghan Dawson
February 26, 2020
in Tech

U.K. to Penalize Platforms for Content Violations

The U.K. government announced last week that it plans to enact legislation penalizing social media platforms that fail to remove or block harmful content on their sites. The government specified that posts containing child abuse, cyberbullying, and terrorist propaganda will come under scrutiny.

The new regulations target platforms with user-generated content, such as YouTube, Facebook, and Twitter. Details of the penalties and the exact content guidelines have yet to be specified.

To some, these plans sound like a good approach to improving safety for social media users. To others, however, imposing such laws may be a slippery slope towards censorship.

With billions of users posting every day, social media giants like Facebook and Twitter bear a colossal responsibility to moderate content and ensure that users aren’t subjected to spam, harassment, or obscene posts. The platforms use a combination of human contractors and artificial intelligence algorithms to scour new posts for offensive material to remove. While illegal content such as child pornography is always deleted, other content can be legal but objectionable, posing a dilemma for moderators.
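
The automated side of this pipeline can be pictured, in grossly simplified form, as a triage step that deletes clearly illegal material and routes borderline posts to a human reviewer. The keyword lists, threshold logic, and function name below are hypothetical illustrations, not any platform's actual system; real platforms rely on machine-learning classifiers rather than keyword matching:

```python
# Toy sketch of an automated moderation pre-filter. Posts matching
# blocklisted markers are removed outright; posts matching watchlisted
# terms are legal but objectionable, so they go to a human moderator.

BLOCKLIST = {"terror_propaganda", "csam_link"}  # hypothetical illegal-content markers
WATCHLIST = {"kill", "bomb", "attack"}          # legal but potentially harmful terms

def triage_post(text: str) -> str:
    """Return 'remove', 'human_review', or 'allow' for a post."""
    words = set(text.lower().split())
    if words & BLOCKLIST:
        return "remove"        # clearly illegal: delete automatically
    if words & WATCHLIST:
        return "human_review"  # the judgement call the article describes
    return "allow"

print(triage_post("nice gymnastics routine"))   # allow
print(triage_post("i will bomb the stadium"))   # human_review
```

The interesting cases are exactly the ones this rule-based sketch cannot settle: everything routed to `human_review` still requires a person to decide, which is where the dilemmas discussed below arise.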

Moderators Face Difficult Judgement Calls

In 2016, criticisms surfaced when Facebook blocked the “Napalm Girl” photo for violating its child nudity policy. Although the historical photo does depict a naked nine-year-old girl, it is globally recognized as an important documentation of tragedy during the Vietnam War.

Meanwhile, in 2019 YouTube faced advertiser boycotts for failing to block comments that were indecent but did not explicitly violate its community guidelines, and thus went undetected by its policing algorithms. The problem arose in the comment sections of innocent videos of children’s gymnastics, which pedophiles were swarming with sexually suggestive emojis and timestamps.

Not only were YouTube moderators overlooking the pedophilic nature of the comments, but the platform’s recommendation algorithm was facilitating the congregation of pedophile communities and monetizing their activity. YouTube swiftly responded to the backlash by disabling comments on millions of children’s videos and deleting over 400 channels that had left lewd comments on videos of children.

These two anecdotes illustrate the difficulty moderators face in judging whether a post is acceptable. Once the U.K. government is involved in penalizing harmful content, it will have to dedicate expensive resources to evaluating problematic content and making innumerable judgement calls, some of which are likely to provoke public outcry.

Penalties Enable Monopolizing

Another concern is that the U.K.’s new regulations may encourage social media monopolies. Because most social platforms lack financial resources on the scale of Facebook’s, Twitter’s, or Google’s, penalties may force smaller platforms to shut down.

Nine out of ten smartphones run an operating system made by Apple or Google. Facebook had 2.5 billion monthly active users as of January 2020. The enormous success of quasi-monopolies Facebook and Google can be, and often is, perceived as a threat to innovation in the tech market.

The last thing we as consumers want is to discourage small competitors by penalizing them for content violations. Budding competitors would struggle even more to compete, and would either be forced out of the market or bought up by the incumbents, perpetuating a vicious cycle.


Fairness and Bias in Execution of Law

There is also concern that heavy content moderation can silence legitimate voices, such as those of marginalized groups. In 2016, Facebook began collaborating with the Israeli government to tackle “incitement” on the platform.

This means that the Israeli government is now free to surveil and censor its Palestinian political opponents, restricting Palestinian assembly and silencing journalists. Consequently, Israeli posts containing extremist rhetoric are censored less, while extremist Palestinian posts are censored more.

We now know that Facebook can be compelled by governments to subvert minorities, which is bad news for those of us who would like to use these sites to communicate about social injustices or assemble a demonstration.

Government involvement in content moderation may mute some of our voices while amplifying others. With over 72.6 million followers, U.S. President Donald Trump has been the single most infamous politician on Twitter, tweeting about everything from political conspiracies to smears of his critics.

Although his tweets constantly raise eyebrows, people were especially shocked by Trump’s multiple tweets threatening North Korea with nuclear war. Though Twitter’s policies clearly state that users may not threaten violence against any group of people, none of the tweets were ever blocked. This raises the question: does Trump get a pass because of his high status?

North Korean Leader Kim Jong Un just stated that the “Nuclear Button is on his desk at all times.” Will someone from his depleted and food starved regime please inform him that I too have a Nuclear Button, but it is a much bigger & more powerful one than his, and my Button works!

— Donald J. Trump (@realDonaldTrump) January 3, 2018

If the U.K. chooses to fine platforms, will it have the strength to stand up to celebrities or political leaders who break the rules? After all, Trump’s threats could be interpreted as the most egregious violation of Twitter’s policy in the platform’s history. If the government gets involved in moderation, citizens should expect every post to be evaluated equally, regardless of the poster’s status.

Content Laws and Shifting Political Powers

Initially, the internet was envisaged as a safe public sphere in which to discuss our opinions about news, religion, and politics. However, citizens of some countries do not enjoy access to unrestrained discourse.

For example, China is notorious for operating a strict and sophisticated surveillance apparatus that censors every network within its reach, whether social media, news outlets, television, or even private messages. In doing so, the Chinese government reserves the right to suppress any publication that threatens its authority.

Like the U.K.’s new legislation, China’s online censorship began with laws written in parliament buildings. And like the U.K.’s new regulation, China’s censorship mandates began with vague definitions, such as content deemed “harmful.”

How each country interprets these vague terms depends on the political climate at the time. To some leaders, critiquing the government is itself harmful. While the U.K. ranks high on the democracy index today, we never know what kind of leadership could befall the country. By enacting laws that give the government control over our online voices, we could be handing authoritarians the tools to silence us.

Consider This

Content moderation is essential to a safe social media experience. At the same time, those who control our content have the power to restrain our freedom to share ideas. The question is: do we trust our governments to regulate our online social discourse? Does the U.K., or any government for that matter, have the time, resources, or capacity to execute these policies consistently and fairly?


Editor’s Note: The opinions expressed here by Impakter.com columnists are their own, not those of Impakter.com. In the Featured Photo: Social Media Apps. Photo Credit: Pixelkult
Tags: censorship, content moderation, Facebook, Google, tech, Technology, Twitter, U.K.
© 2025 Impakter.com owned by Klimado GmbH
