On August 25, the European Union (EU)’s Digital Services Act (DSA) came into force for many of the largest tech companies operating in Europe. Initially, the DSA applies to Very Large Online Platforms (VLOPs) such as Facebook, LinkedIn, Twitter, Amazon, and Apple’s App Store, and to Very Large Online Search Engines (VLOSEs) such as Google and Bing, all services with more than forty-five million monthly users in the EU. Complying with the DSA means these companies now face “specific rules that tackle the particular risks such large services pose to Europeans and society when it comes to illegal content, and their impact on fundamental rights, public security, and wellbeing.”
The landmark piece of legislation, active across the EU’s twenty-seven member states, has been dubbed a new “constitution of the internet”. It’s an effort to shape the future of the online world with an understanding of the power large platforms have accrued—and an effort to ratchet up their obligations on aspects such as content moderation, transparency, accountability and tech design. US lawmakers will be watching the act’s implementation carefully.
Margrethe Vestager, Executive Vice President of the European Commission for A Europe Fit for the Digital Age and European Commissioner for Competition, has said her ultimate goal is to make technology “a tool for us to build better societies”, not “for us just to be a small data point, something to be exploited from a capitalistic point of view”. The cost of not changing the internet’s rules, Vestager added, would put Europe’s democracies at risk. Recently, I spoke to Suzanne Vergnolle, a professor of technology law at the National Conservatory of Arts and Crafts in Paris, about the DSA, its aims and criticisms, and whether it will usher in a new age of accountability for tech platforms globally. Our conversation has been edited for clarity and length.
JB: Can you start by explaining what the act is and what it hopes to achieve?
SV: The DSA aims to regulate digital services. It replaces the e-Commerce Directive, which was adopted in the early 2000s. That was a piece of legislation inspired by Section 230 [of the 1996 Communications Decency Act] in the US, which established a non-liability principle for platforms and intermediary services. Basically, at the time these services were starting to exist, we wanted them to build more things, be comfortable creating new services, and be as innovative as possible. To do that we had to create an environment of trust, which is where the non-liability principle came from: intermediary services are not liable for content published on their platform, except if they have knowledge of illegal content and don’t act to remove it. That’s where we started in the 2000s. From that point, a lot of services emerged and people trusted doing business online.
Now, we have hate speech, misinformation, child sexual abuse material, terrorist content—a lot of illegal or problematic content being put on these platforms. What the EU is trying to do with this legislation is better regulate online services. Why the EU adopted these regulations—as is very often the case when the EU intervenes—is because member states had actually adopted their own laws to regulate these problems. The EU wanted to harmonize this area of law.
You mentioned Section 230 of the Communications Decency Act in the US, which shields tech platforms from liability for user-generated content. But the DSA isn’t seeking to make platforms liable; instead, it aims to hold them more accountable when content breaks their own rules or is illegal. Can you explain that third way, which keeps liability as it is but strengthens moderation in other ways?
The DSA, exactly as you explain it, preserves the non-liability principle that already existed, so those rules are pretty much unchanged. But it builds more accountable and responsible principles for platforms and online services. You have the same idea—non-liability—until you have awareness of illegal content; then you need to act. So basically, those principles stay as they were, and you add other principles, including compliance obligations, that differ depending on the nature and size of your service. The bigger the service, the more risk, and therefore the more rules it will have to follow. Very Large Online Platforms have far more obligations than ordinary intermediary services. You have basic principles that apply to every intermediary service, including transparency of terms of service and transparency reporting, and then further compliance obligations such as transparency of online advertising, protection of minors [from targeted advertising], compliance by design, risk assessment and mitigation measures, and so on.
As you said earlier, one goal of the DSA is to harmonize different regulatory regimes across the EU’s twenty-seven member states, on things like content moderation, transparency, design, oversight. But what about further afield—what do you think the impact will be on regulation around the globe and in the US from the DSA?
Do you mean the influence of the DSA on other pieces of legislation?
I guess I mean two things. One, will other legislative bodies around the world see what’s being done in Europe and be influenced by that? And two, will the changing behavior of tech platforms be confined to Europe, or will we likely see changes in how they operate across different jurisdictions?
To answer your first question, [Columbia Law professor] Anu Bradford has been working on something called the Brussels Effect, the idea that the EU has a lot of [global] influence because it’s such a big market, because it’s a democracy, because legislative procedures are transparent, are well-designed, so other countries can follow how the law was built. It’s something you can rely on. Looking at the DSA, can this piece of legislation inspire other regulators and legislators? Absolutely. Because a lot of legislators are looking into reform of the regimes of the early 2000s. Everyone agrees platforms are not the same as twenty years ago. They present more risks, they influence campaign politics and other things, so lots of legislators are discussing reforming former regimes, including in the US. The DSA can be a source of inspiration for other legislators for all of these reasons.
To answer the second question, a very small number of platforms, when the GDPR went into force [in 2018], decided to leave the EU market. I think for the DSA it’s going to be quite similar. The bigger platforms in particular, the ones used by ordinary EU consumers, will comply with it and probably put in place measures that make their platforms safer—and probably, I hope, they will extend those measures to other parts of the world, as they did with the GDPR. I think it’s going to be platform by platform.
Another interesting thing is how the DSA will compel tech platforms to release data to independent researchers, watchdog organizations and so on to let us see under the hood of these companies. Can you say more about this and what it might mean for journalism as well?
Article 40, and its provisions regarding [data] access for researchers and CSOs [civil society organizations], is only one of many aspects of the role of civil society organizations. The DSA is a very interesting piece of legislation on so many levels. One level is that it tries to move beyond the bilateral relationship between regulator and platform and to involve multiple third parties—researchers, trusted flaggers, national regulators, consumer organizations—a lot of other parties that can interact to help the implementation, monitoring, and enforcement of the DSA.
Is journalism a part of that?
Yeah, I think investigative journalists will have a say, and journalists reporting on the DSA more generally have a big role in its implementation, because the more we talk about it, the more pressure is put on platforms.
To stay on this theme, the DSA has not been without some criticisms, and you’ve made the case that the European Commission, the EU’s executive power, should involve more civil society groups in its enforcement of the DSA. Can you elaborate on that?
It’s a great opportunity for the regulator to think about ways of working with civil society. It’s going to be interesting to look at what the Commission is doing, because it really can’t do it alone. That’s why they created the European Centre for Algorithmic Transparency, and are trying to hire more technical people to work with researchers and so on. One way to build that relationship with civil society is by establishing an expert group, at the EU level, to help the Commission in enforcement. And when I say enforcement, I also mean deciding whom to investigate, whom to monitor, whom to spend time working on. Civil society organizations have grounded, evidence-based knowledge that they can bring to the Commission. And inside that expert group, there should be CSOs, researchers, and academics who can help shape that dynamic. Of course, journalism is another aspect of the success of the DSA. Investigative journalists can work with researchers in trying to figure out how misinformation circulates, and these kinds of topics where they already do that work. Article 40, in that sense, is a unique opportunity to look more closely and have direct access to information [from platforms].
Another criticism of the DSA is that the definition of a ‘systemic risk’ is left up to the tech platforms themselves. Do you see this as a problem, and can you say more about it?
We are in the very early stages of the DSA and its implementation. Of course, a lot of people, including myself, are worried about how these definitions are actually going to be interpreted by platforms, by regulators, by the Court of Justice, by tribunals, by the courts in general. ‘Systemic risk’ is one of the key issues, because it targets Very Large Online Platforms. Even though very few of all the platforms regulated by the DSA [are VLOPs], they’re where a lot of people are getting their news and information. That’s why these few words dedicated to the biggest platforms are actually very important. There was a public consultation on this topic, which I think closed a few weeks ago, and it’s also why there’s a distinct need to have these conversations with parties other than just the platforms themselves.
To zoom out, in your view will the DSA—alongside other legislation like the EU’s GDPR and the Digital Markets Act—have a significant impact on reducing harms? And is this, as some people have said, the end of the online ‘wild west’?
This expression was used by a politician. I’m not a politician, I’m a researcher. I think it’s a step forward to a safer environment online, that’s for sure. But like any other piece of legislation, it’s going to be only as good as its enforcement. As a researcher, I’m curious to know who is actually being hired by the Commission to enforce this legislation. Is it litigation lawyers, or is it more policy people? What are the next steps the Commission is taking? As I understand it, the people hired recently are not experts in procedural law; they’re more policy people. Which means, in my view, that the next few months are going to be about working hand in hand with platforms on doing the right thing, more than issuing big fines. And that’s fine, that’s not a problem per se, but it’s going to take longer than everything becoming safer in one second.
My former Tow Center colleague Gabby Miller wrote recently for Tech Policy Press that it will likely take years before we can judge whether the DSA works and what areas need improvement. But are we likely to see, in the next year or so, significant changes to how platforms are operating?
I think yes, we will see improvements and we will see evolution. In 2024, there are hundreds of elections around the globe. Here, the DSA will be put into action. There are a lot of elections in Europe, for instance for the European Parliament, so it will be interesting to see how the DSA operates. Maybe it’s going to be better than before, maybe it’s going to be worse, maybe it’s going to be less bad than if the DSA had not been adopted. We cannot really know that. But what we will be able to know, thanks to the DSA, is how the platforms are actually working on this topic, because of all of the transparency obligations built into it.