Positioning & Advocacy
22.11.2022

Digital Services Act: can the DSA hold back consumer harm online?

Passed in July 2022 and expected to take effect next year, the Digital Services Act (DSA) and Digital Markets Act (DMA) mark a significant milestone in EU digital regulation. We consider the DSA’s main features and compare it to efforts in other jurisdictions.

The Digital Services Act and the Digital Markets Act work in tandem towards the overall objective of loosening the grip that the largest digital companies have on service provision – whether through their role as powerful intermediaries and platforms or through their control of vast swathes of consumer data.

The DSA partly replaces outdated e-commerce rules from over 20 years ago with much stricter intermediary responsibilities for platforms dealing with harmful content, goods and services, while the DMA focuses on competition and the market share (and data share) of large tech players.

This blog takes a closer look at the content of the DSA and considers how other jurisdictions are approaching similar issues. You can read our take on the DMA here.

Digital Services Act: new rules for platform intermediaries

The DSA sets out the first Europe-wide set of rules on platform intermediaries’ obligations and accountability. As with its sister legislation the DMA, the DSA’s provisions start with a definition of, and criteria for, the large digital companies that facilitate so much of consumers’ and smaller traders’ interactions online.

A key difference is that whereas the DMA’s rules apply only to those meeting the definition of gatekeeper, the DSA will apply to all online intermediaries but will set more stringent requirements in proportion to the number of users and the type of service.

The ‘very large online platforms’ (or VLOPs for short) and ‘very large online search engines’ (or VLOSEs) are classed as those with more than 45 million monthly active users in the EU.

VLOPs and VLOSEs face obligations on hate speech, harmful or illegal content and disinformation. The DSA obliges VLOPs to analyse the “systemic risks they create”. The European Commission’s Code of Practice on Disinformation is one tool that, if adhered to, can be used to demonstrate that they are addressing such risks.

Digital Services Act: tackling consumers’ platform problems

The DSA covers a wide range of services provided by platforms, including e-commerce. Here, we focus on e-commerce and how the Act is responding to changes in online shopping. The DSA replaces e-commerce rules dating from 2000 which were designed for a world of mostly direct B2C transactions. Today, a vast range of online platform marketplaces act as intermediaries, linking third party sellers to consumers while also selling their own products on the site.

Whilst many consumers purchase without problems, harms frequently found on online platform marketplaces include unsafe and poor quality products, poor post-transaction support, money and time spent getting problems put right, exposure to counterfeit goods, and insecure digital devices.

In such marketplaces to date, intermediary platforms have not been held liable for issues with third party sellers and products, although they do offer some services to rectify problems. Third party traders can also set up shop with little oversight or information provided to consumers.

Consumers are also exposed to manipulative choice editing, such as self-promotion of the platform’s own brand products, fake reviews and scams, all of which are addressed in the new legislation.

How the DSA will tackle consumer harms in very large online platforms

To close the responsibility gap, the DSA imposes a “duty of care” on VLOPs regarding third party sellers who sell products or services on their platforms – this includes specialist e-commerce platforms such as eBay but can also encompass platforms with a broader remit, for example social networks.

Regardless of the type of platform, the new duty of care puts the onus squarely on the intermediaries’ shoulders. Obligations include:

  • much stricter information and monitoring requirements
  • swift removal of illegal products and services from sale
  • asking for and verifying traders’ information before they can sell on the marketplace
  • random checks on the legality of products against official databases.

Children receive additional protections too: platforms are required to put special measures in place to ensure minors’ safety online, in particular when they are aware that a user is a minor. Platforms will also be prohibited from presenting targeted advertising based on the use of minors’ personal data.

The algorithms that VLOPs use on their sites will also be subject to much greater scrutiny by regulators at the EU and Member State level. These include the algorithms used in recommender systems, which use data-based profiling to determine what consumers see. Consumers will also be able to choose the option of seeing recommendations that are not based on profiling.

The DSA also includes a ban on the use of dark patterns on platforms (design practices that trick or manipulate users into making choices they may not otherwise have made).

Enforcing the Digital Services Act

National authorities will supervise smaller platforms, while the Commission will have exclusive competence over very large online platforms. The Commission appears to be taking the risk of under-resourced enforcement seriously, with the EU executive charging platforms a supervisory fee to finance this work.

Penalties for breaches of the DSA can reach up to 6% of global annual turnover, and consumers have been given new rights to seek compensation for damages caused by a non-compliant platform, for instance because it did not make best efforts to verify the trader’s identity.

However, based on the DSA alone, consumers will not have the right to be compensated by the marketplace if they suffer damages from an unsafe product bought via the platform – something that consumer groups called for as part of a wider liability regime for online marketplaces.

Platform intermediary rules in other jurisdictions

USA

The US does not yet have plans for the kind of ‘notice and takedown’ requirements common to other regimes tackling general and consumer harms online. Section 230 of the US Communications Decency Act of 1996 shields online platforms and ISPs from liability for content that users or third parties generate.

Originally designed to protect freedom of expression and to avoid putting platforms in the position of publisher or editor, it has evolved into a principle of broad immunity from liability for any user-generated content or activity.

Given its roots in free expression, and the reluctance of US-based platforms to step into a position where they carry liability, proposals to regulate problems on online consumer marketplaces face an uphill battle.

UK

The UK’s long awaited Online Safety Bill provides an interesting comparison with the DSA, particularly in terms of how consumer advocates have so far used the concept of ‘online harms’ to encompass much more than just hate speech or harmful content.

The EU and UK both base obligations on the size, reach and impact of the intermediary platform in question. Instead of VLOPs and VLOSEs, the UK has opted for the simpler naming of “Category 1”, which refers to services considered “high-risk and high-reach”. Categories 2a and 2b are reserved for those with less potential impact and lower usage levels.

Both the EU and UK also focus on regulating systems and processes rather than imposing rules on individual pieces of content to be removed quickly once they are up (ex-post obligations), as the Australian regime does. In contrast, the ‘ex-ante’ approach of the UK and EU means that platforms do not just play a monitor-and-takedown role, but must also assess and mitigate the risks of certain types of harmful content being present on the platform.

In terms of what is covered, consumer groups in the UK have successfully campaigned for online platforms to proactively tackle online scams. Under new duties introduced in a revised bill in March 2022, platforms in Category 1 and Category 2a must “take action to minimise the likelihood of fraudulent adverts being published on their service”.

Failing to do so can result in enforcement action, fines of up to £18m or 10% of global revenue and, in the most serious cases, being blocked.

Consumer groups have had less success with their calls for the inclusion of product safety monitoring and responsibility. The Online Safety Bill does not include requirements on tracing third party traders on a platform.

This is covered to an extent in the DSA, with VLOPs obliged to perform due diligence on products sold via traders on their platforms to check they comply with EU product safety labelling. The DSA also requires platforms to carry out random checks for illegal products and to inform consumers if they find they have bought such a product.

However, the slow progress of the UK’s bill and multiple revisions since 2019 have frustrated campaigners. More recent changes in national leadership have again left it on hold and there is no guarantee that all of the obligations will make it into law.

Expectations of Digital Services Act are high

The joint ambition of the DSA and its sister DMA is to tackle some of the challenges that the large platform model has thrown up. At the same time, the EU wants to maintain and grow vibrant, convenient digital markets that can serve large numbers of people at speed.

A broad coalition of civil society, consumer groups including Euroconsumers, and new market entrants will be keeping a close watch on whether the larger platforms play by the rules. And, as with the DMA, the expected sea change in how responsibility for content and products on platforms is managed will only come about with well-resourced, strong and bold enforcement.

We repeat the call, alongside consumer groups from all over Europe, to equip the European Commission with the right level of expertise and enforcement resources to make fair digital markets and services a reality.