Europe is once again leading the charge against the international tech industry, whose monopoly-creating business practices have come under increased scrutiny in recent years. For over a year, the EU has been pushing two important acts through the legislative process, the Digital Services Act and the Digital Markets Act.
The Digital Services Act (DSA) intends to strengthen safeguards for users by imposing more stringent rules on content moderation, privacy and targeted advertising. It provides specific protections for children, and aims to create a culture of transparency with regard to big tech’s use of user data and its removal of content.
The Digital Markets Act (DMA) focuses on creating opportunities for smaller services to participate in a digital market dominated by a handful of the largest providers, which the legislation refers to as “gatekeepers.” It aims to inhibit the creation of monopolies, making space for the small and medium-sized platforms that account for 90% of the 10,000 platforms operating in the EU.
Both Acts attempt to harmonise the law between member states, in keeping with the EU philosophy of shared regulations. They are also designed to target larger platforms specifically, including the “Big Five”: Google (owned by Alphabet), Apple, Meta (formerly Facebook), Amazon and Microsoft.
The DSA directs attention to what it defines as “Very Large Online Platforms,” or VLOPs: platforms with over 45 million monthly EU users.
The DMA will focus on “gatekeeper” platforms: those that, along with having a user base of 45 million monthly EU users, also operate in at least three EU countries and have an annual turnover of at least €6.5 billion.
The DSA has progressed at a slower pace than the DMA, which saw a plenary vote last week in which the European Parliament approved the legislation in its current form. The next step for the DMA is to be presented to the Council of the European Union, which represents EU governments.
The DSA, in comparison, has only passed a key committee vote, and is expected to face a plenary vote in January 2022. The difference in pace is perhaps indicative of the fact that support for regulating the market and business activity of tech giants does not always coincide with support for regulating content and privacy.
Many commentators have praised the separation of the two regulatory aims, suggesting that packaging them together, as the American legislature has previously done, creates a convoluted and impractical legal framework.
MEP Christel Schaldemose, one of the main drivers behind the DSA proposal, described the legislation as “opening up the black box of algorithms.” The DSA will attempt to bring much-sought transparency to the web of algorithms behind popular platforms. These algorithms tailor user experiences, but often have the effect of boosting extremist content and fostering unhealthy habits in vulnerable people.
Importantly, Schaldemose referred in the same speech to Frances Haugen, the whistleblower who exposed Facebook’s concerning inability to curb the ill effects of its algorithms, which allegedly prioritise growth and money over public safety.
The documents leaked by Haugen provided insight into the way Facebook has been used to proliferate hate speech. In India, for example, anti-Muslim rhetoric has flourished online, with violent consequences. Similarly inflammatory content on Facebook has repeatedly been linked to violence across the world, fuelling genocide in Myanmar and an insurrection in America: the notorious January 6 assault on the Capitol.
Europe has also found itself the victim of online disinformation, swamped with conspiratorial coronavirus news, part of state-backed campaigns from Russia and China attempting to secure their interests in the region.
The documents also revealed Facebook’s own awareness of Instagram’s detrimental effect on the mental health of teenage girls.
As some commentators argue, the focus on algorithm transparency might be a more practical way of dealing with problematic content, since it takes the focus away from murky questions of freedom of speech.
Instead, it refocuses attention on how the platforms themselves amplify erroneous and harmful content that is more likely to garner engagement, tackling the overexposure of vulnerable people to that content.
Years of Big Tech lobbying have had a sizeable effect on EU policies, but have so far been unable to derail the DMA
EU efforts to regulate the impact and practices of big tech have long been met with aggressive lobbying, with rampant use of deceptive astroturfing techniques. This strategy involves funding smaller advocacy groups that parrot talking points criticising legislation seeking to restrict the activity of tech giants, giving the false impression of grassroots opposition.
MEP Alexandra Geese called attention to the continued use of astroturfing last month, tweeting that “a Google-funded, Washington DC-headquartered body just sent out an e-mail with a big study to European lawmakers pretending to represent SMEs [small and medium enterprises] and claiming the #DMA will hurt SMEs.”
“The Connected Commerce Council, for example, is a Washington-based nonprofit that bills itself as a voice for small businesses. But it counts Amazon, Facebook and Google as ‘partners.’”
— Alexandra Geese (@AlexandraGeese) November 15, 2021
The specific organisation Geese called out was the Connected Commerce Council, but many other organisations providing similar lobbying services exist, including Digital Europe. Geese has previously called out the constant presence of Google representatives in Brussels; quoted in the New Statesman earlier this year, she said Big Tech companies “have so many staff that they can attend, basically, every single meeting happening in Brussels.”
The net effect of big tech’s presence in Brussels is the crowding out of civil society and genuine citizen influence: compared to multinational tech companies, citizen groups are vastly underfunded and under-resourced. Ironically, European citizens are not afforded the same access to EU spheres of influence as American platforms like Microsoft, Google and Amazon.
With regard to the DMA at least, with which Google and Amazon in particular seem to have taken issue, lobbying does not appear to have taken the steam out of the legislation. The DMA received overwhelming support in the European Parliament, with 642 votes for and 8 against.
The effect of lobbying still warrants attention, however, as the DMA enters its next stage: negotiations between Parliament and member-state governments. As Geese told the New Statesman, “often member-state governments, and especially their representatives in Brussels, are a lot more permeable to lobbying than parliament.”
This may be due in part to the considerable leverage big tech companies hold over certain countries, which fear economic repercussions should those companies cease operations there.
The proposed legislation is important since laws originating in Europe have implications for the behaviour of tech giants worldwide
The EU’s focus this year on laws countering internet “pollution” coincides with legislation emerging in the UK: the Online Safety Bill. This bill, similarly to the DSA, places emphasis on content moderation, specifically aiming to protect children from harmful material, including pornography and the glorification of self-harm and suicide.
A previous piece of UK legislation focused on children’s privacy, which came into force in September, demonstrates the influence of European law. Dubbed the “Age Appropriate Design Code,” it also mirrors much of the DSA’s privacy-protection philosophy.
The Age Appropriate Design Code requires that large online platforms either prove that their service is not used by children, or ensure that all users under 18 are treated with special care, limiting the data collected on them. The DSA similarly aims to restrict targeted adverts for children, which rely on a profile compiled for each user from their tracked internet behaviour.
The measures have had an effect on the practices of tech platforms globally, pushing them to adopt measures such as setting the maximum privacy protection available as the default option for child users. Although the laws only required platforms to alter their behaviour in the UK, as Shira Ovide of the New York Times points out, “practically and philosophically that might have been [a] bad choice.”
Refusing to update their practices in countries where the law did not apply would suggest that the core principles of big tech companies are guided by external pressures rather than a commitment to user safety.
The American legislature is particularly influenced by European laws: because their effect already reins in Big Tech’s behaviour, it becomes easier to codify laws that companies can feasibly comply with, without putting extra pressure on them.
This suggests that the international tech world, and those concerned with the moderation of online user experiences worldwide, should pay attention to what is occurring in Brussels. The progress of the DSA and DMA may indicate that the power of Big Tech’s lobbying is waning, making way for a culture of greater transparency and accountability.
Editor’s Note: The opinions expressed here by Impakter.com columnists are their own, not those of Impakter.com. — Featured Photo: The DSA and DMA attempt to protect EU citizens from privacy violations and give them the autonomy to choose a wider range of services. Photo Credit: markusspiske