The Digital Services Act Unwrapped: Initial Implications for Platforms, Brands and Consumers

Brand owners agree that, while more can and should be done, the Digital Services Act will ‘level-up’ the playing field between brands and major platforms. Read detailed analysis of the package of measures and their implications by Corsearch Senior Legal Counsel Mike Sweeney.

Introduction

The European Commission has published its much-anticipated Digital Services Act package of measures, in what may prove to be the most sweeping reform of tech regulation in Europe in more than two decades.

Two separate pieces of legislation (the Digital Services Act and the Digital Markets Act) published on 15 December 2020 appear to signal the arrival of root-and-branch reform of the Internet platform economy, assuming the draft texts are adopted by the European Parliament and enacted into law.

Executive summary

With these developments, the European Commission is sending in the cavalry, briefed with orders to assist brands and consumers long fatigued by their battle with Big Tech. If the draft legislation is enacted into law, brand owners can derive comfort from knowing that major Internet platforms – hitherto viewed through a lens of suspicion and regarded by many as a threat to revenue, reputation and consumer trust – are being asked to do much more and will face substantial financial penalties if they do not.

Streamlined, efficient ‘notice and action’ procedures, enhanced (and extensive) seller authentication and due diligence requirements and greatly enhanced transparency requirements aimed at holding platforms to account for the steps taken to remove illegal content are just some of the improvements to the digital economy which brands can now look forward to. 

More generally, there is consensus among brand owners that, while more can and should be done (for example in relation to the issue of repeat infringers), these developments, if enacted into law, will operate to ‘level-up’ the playing field between brands and major Internet platforms, which for the best part of 20 years has been weighted much too heavily in favor of the latter.

For brand owners who entrust their brand protection to external partners, the benefits are plain:

  • enhanced seller authentication equates logically to more efficient detection and enforcement and is especially beneficial for brands wishing to deploy network analysis technology in order to disrupt and neutralize the enterprise-scale networks of infringers which present the most potent threat;
  • the provisions enabling ‘Trusted Flaggers’ to benefit from expedited notice sending and prioritization will operate logically to drive up efficiencies in terms of both volume of enforcements and favorable customer outcomes; and
  • the provisions around repeat infringers anticipate an assessment of multiple ‘factors’ (including the number of ‘items’ of illegal content, the gravity of infringements and their consequences and the intention of the seller).  It follows that brand protection professionals – with the resources and capability to store, sort and filter these data points centrally – will be well placed to undertake that assessment and tackle repeat infringers.
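The DSA lists these factors but leaves the methodology of the assessment open. Purely as an illustration – and assuming a hypothetical internal data model rather than anything prescribed by the draft – a brand protection team might centralize enforcement records and combine the factors along the following lines. All class names, fields and thresholds are invented for the sketch:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record of a single enforcement against a seller; the field names
# are illustrative and not taken from the DSA text.
@dataclass
class Enforcement:
    listing_id: str
    items_of_illegal_content: int   # number of 'items' involved
    gravity: int                    # 1 (minor) to 5 (severe), an assumed internal scale
    intentional: bool               # assessed intention of the seller

@dataclass
class SellerRecord:
    seller_id: str
    enforcements: List[Enforcement] = field(default_factory=list)

    def is_repeat_infringer(self, min_enforcements: int = 3, min_gravity: int = 3) -> bool:
        """Illustrative test combining the draft's 'factors': frequency,
        volume of items, gravity and intention (thresholds are arbitrary)."""
        serious = [e for e in self.enforcements
                   if e.gravity >= min_gravity or e.intentional]
        return len(serious) >= min_enforcements

# Example: three serious enforcements against the same seller would flag them.
record = SellerRecord("seller-123", [
    Enforcement("listing-1", items_of_illegal_content=40, gravity=4, intentional=True),
    Enforcement("listing-2", items_of_illegal_content=5, gravity=3, intentional=False),
    Enforcement("listing-3", items_of_illegal_content=12, gravity=4, intentional=True),
])
print(record.is_repeat_infringer())  # True
```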

Background

Internet platforms in Europe have flourished as a result of relatively light regulation in the twenty or so years since the E-Commerce Directive[1] became law. In that time, technology has evolved to the point of bearing scant resemblance to the Internet of 20 years ago. Inter-connected social media, online marketplaces, app-based retail and 3D printing platforms are just some of the ways in which technology has morphed rapidly to form collectively what consumers know and love as today’s world wide web. 

While the technology has shapeshifted, spawning some of the world’s most powerful and influential companies such as Amazon, Alibaba and Apple, the law has stood largely still. Meanwhile, bad actors continue to carve out ever more creative opportunities, taking advantage of developments in technology and the light-touch regulation of major platforms to undermine brands and erode consumer trust.

With the arrival of the Digital Services Act package of measures, that may be about to change once and for all, with some commentators going as far as saying that the package represents the best opportunity in 20 years for brands to tackle the scourge of online counterfeiting. European Commission President Ursula von der Leyen has sought to deliver on her pledge to make Europe “fit for the digital age”, affording consumers the same levels of protection online as offline.

The detail and key provisions

The draft legislation comprises two discrete (substantial) instruments: the Digital Markets Act and the Digital Services Act. Note that while the legislation applies inside the Single Market, it will also apply to online intermediaries established outside of the European Union which offer services to consumers inside the Single Market (including Alibaba, Shopee and other major players in the Asia-Pacific region). The UK Government is also expected to announce parallel legislation, in order to afford comparable protection to UK consumers, following the UK’s formal withdrawal from the European Union. 

Taking each in turn – Digital Services Act (the “DSA”)

The DSA is intended (broadly) to regulate the way in which platforms handle illegal or harmful content in their capacity as intermediaries which connect consumers with goods, services and content. It is harmonized across the European Union and is directly applicable. 

For the avoidance of doubt, it does not replace the E-Commerce Directive or its national implementations, which remain in place in Member States. A central theme of the DSA is the notion that technology companies will now need to take much more responsibility for unlawful behavior on their platforms – and will face serious financial penalties if they do not. While the detail of the DSA – and its potential impact – is still to be evaluated in full, first impressions are that, consistent with similar legislative developments in other jurisdictions, brand owners cautiously welcome the legislation.

Legislative developments which favor enhanced platform accountability for the benefit of brands and their consumers are welcomed. The question, as noted below, is whether what is proposed goes far enough and/or whether there is more which can and should be done.

The main points to note from the DSA are as follows:

  • 1. Notice and Action Mechanisms: under the E-Commerce Directive, online platforms attract liability if they have knowledge of unlawful activity on their platform – for example the presence of listings offering counterfeit goods for sale – and fail to act. Knowledge is typically established through a brand owner (or its agent) issuing a ‘take-down’ notice. Where a platform removes listings in these circumstances, it will be protected from liability. This regime has meant that platforms need only ever be reactive, and it gives them a legal incentive to do very little by way of proactive content moderation. The DSA builds on this by imposing additional obligations on platforms. In particular, platforms will need to implement notice and takedown procedures for “information which the individual or entity considers to be illegal content” which should be “user friendly, easy to access and allow for the submission of notices exclusively by electronic means” (Article 14); an illustrative sketch of such a notice follows the Trusted Flaggers discussion below. Note that the DSA does not elaborate on the national or EU laws which specify what amounts to “illegal content”.
  • 2. Challenges to Platform Content Moderation Decisions and Transparency: where a platform, in response to being notified, elects to remove content (for example listings offering for sale counterfeit goods), the platform must inform the seller and provide a clear and specific statement of reasons in support of the decision, including the facts and circumstances relied on, a reference to the legal ground relied on and information on the redress possibilities open to the seller, including through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress (Articles 15, 17 and 18). Platforms must also publish at least once a year detailed reports on their activities relating to the removal of illegal content which include the number of disputes referred for out-of-court settlement, their outcomes and the average length of time taken to reach resolution (Article 23). These additional obligations are potentially onerous for platforms and require a much higher level of engagement with enforced sellers. It follows that they will also operate as an incentive for platforms to conduct seller due diligence to the fullest extent (see paragraph 5 below) in order to minimize the time and effort involved in engaging with an enforced seller.
  • 3. Trusted Flaggers: notices submitted by “trusted flaggers” are processed and adjudicated as a priority and without delay (Article 19). Trusted Flagger status is conferred (on application) upon entities established in Member States which:
    • 1. can demonstrate particular expertise and competence for the purpose of detecting, identifying and notifying illegal content;
    • 2. represent collective interests (and are independent from online platforms); and
    • 3. carry out activities for the purposes of submitting notices in a timely, diligent and objective manner.

This provision is clearly welcome news for brand owners who work with brand protection professionals and who, as a result, will enjoy priority and expedited notice processing. It remains to be seen whether the Commission might be prepared to extend “Trusted Flagger” status to individual brands (as many brands would wish), rather than exclusively to collective organizations.
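To make the mechanics concrete, below is a minimal, hypothetical sketch of the kind of structured payload an electronic notice channel might accept, loosely mirroring the elements listed in Article 14(2) of the draft (the notifier’s identity, the exact location of the content, an explanation of why it is considered illegal and a good-faith statement), together with an assumed trusted-flagger identifier for Article 19 prioritization. The field names, endpoint behavior and priority routing are illustrative assumptions, not anything specified in the DSA:

```python
import json
from dataclasses import dataclass, asdict
from typing import List, Optional

# Hypothetical notice structure; field names loosely mirror the elements
# listed in Article 14(2) of the draft DSA and are not an official schema.
@dataclass
class TakedownNotice:
    notifier_name: str
    notifier_email: str
    content_urls: List[str]     # exact electronic location(s) of the content
    explanation: str            # why the notifier considers the content illegal
    good_faith_statement: bool  # confirmation the allegations are accurate and complete
    trusted_flagger_id: Optional[str] = None  # assumed field for Article 19 prioritization

def submit_notice(notice: TakedownNotice) -> str:
    """Serialize the notice for submission to a platform's (hypothetical)
    electronic notice-and-action endpoint; trusted-flagger notices could be
    routed to a priority queue."""
    queue = "priority" if notice.trusted_flagger_id else "standard"
    payload = json.dumps(asdict(notice))
    # In practice this payload would be sent to the platform's own channel.
    return f"{queue}:{payload}"

notice = TakedownNotice(
    notifier_name="Example Brand Protection Ltd",
    notifier_email="notices@example.com",
    content_urls=["https://marketplace.example/listing/123"],
    explanation="Listing offers counterfeit goods bearing a registered trademark.",
    good_faith_statement=True,
    trusted_flagger_id="TF-0001",
)
print(submit_notice(notice))
```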

  • 4. Repeat Infringers: platforms are required to suspend, for “a reasonable period of time” and after having issued a prior warning, the provision of their services to recipients that frequently (repeatedly) provide manifestly illegal content (Article 20). No guidance is given on what amounts to a “reasonable period of time”, meaning that platforms are free to determine this according to their own commercial interests and interpretation.
  • 5. Seller Verification and Traceability: platforms must collect a broad range of seller authentication data prior to allowing a trader to offer goods and/or services, including names, addresses, telephone numbers, email addresses, copies of identity documents, bank account details, business registration details and written confirmation from the seller that it will only offer goods and/or services in accordance with applicable law (Article 22). Platforms must store this data securely for the duration of their contractual relationship with the seller (after which it must be deleted).

This data and the overall “Know Your Business Customer” (KYBC) approach is intended to act as a deterrent against bad actors and will be critical for brands wishing reliably to trace sellers and escalate offline action. These obligations are also consistent with proposed obligations in other jurisdictions, in particular the SHOP SAFE Act[2] and the INFORM Consumers Act[3], both of which anticipate enhanced seller authentication practices, especially where there is a health and safety risk to consumers.
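By way of illustration only, a platform’s KYBC store might resemble the following sketch, which paraphrases the data categories listed in Article 22 and deletes the record when the contractual relationship ends. The class and field names are assumptions made for the example, not an official schema:

```python
from dataclasses import dataclass

# Hypothetical KYBC record; field names paraphrase the categories listed in
# Article 22 of the draft DSA.
@dataclass
class TraderRecord:
    name: str
    address: str
    telephone: str
    email: str
    identity_document_ref: str     # e.g. a reference to a stored copy of an ID document
    bank_account: str
    business_registration_no: str
    self_certification: bool       # written confirmation to offer only compliant goods/services

class TraderRegistry:
    """Minimal sketch: hold records while the contractual relationship lasts,
    delete them when it ends (as Article 22 anticipates)."""
    def __init__(self):
        self._records = {}

    def onboard(self, trader_id: str, record: TraderRecord) -> bool:
        # A platform would be expected to make reasonable efforts to check
        # these details before allowing the trader to offer goods or services.
        complete = all([record.name, record.address, record.email,
                        record.identity_document_ref, record.self_certification])
        if complete:
            self._records[trader_id] = record
        return complete

    def end_relationship(self, trader_id: str) -> None:
        # Delete the data once the contractual relationship ends.
        self._records.pop(trader_id, None)

registry = TraderRegistry()
ok = registry.onboard("trader-42", TraderRecord(
    name="Example Trading Ltd", address="1 Example Street, Dublin",
    telephone="+353 1 000 0000", email="seller@example.com",
    identity_document_ref="doc-001", bank_account="IE00EXAMPLE0000",
    business_registration_no="IE123456", self_certification=True))
print(ok)  # True
```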

  • 6. Very Large Platforms: additional obligations apply to platforms deemed to be “very large”, defined as those (such as Amazon and eBay) which provide services to 45 million or more average monthly active users in the EU (Article 25). These obligations include:
    • identifying “significant systemic risks” stemming from the use of their services;
    • implementing content moderation systems; and
    • participating in annual audits, including by giving access to data (on request) necessary to monitor and assess compliance with the DSA (Article 31) and by designating a compliance officer.
  • 7. Financial Penalties: platforms found to be in breach of DSA obligations may face financial penalties of up to 6% of annual turnover, as well as “periodic penalty payments” of up to 5% of average daily turnover for ongoing infringements. For very large platforms these financial penalties could run to tens of billions of Euros, giving an indication of the seriousness which the European Commission attaches to compliance (an illustrative calculation follows this list).
  • 8. European Board for Digital Services: the DSA establishes this independent advisory body of Digital Services Coordinators with the purpose of supervising providers of intermediary services (Article 47).
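To give a sense of scale, the back-of-the-envelope calculation below applies the two penalty ceilings to a hypothetical platform with €300 billion in annual turnover (a figure chosen purely for illustration, not drawn from any company’s accounts):

```python
# Illustrative only: hypothetical turnover figures, not real financial data.
annual_turnover_eur = 300_000_000_000                  # assume €300bn annual turnover
average_daily_turnover_eur = annual_turnover_eur / 365

max_fine = 0.06 * annual_turnover_eur                  # up to 6% of annual turnover
max_daily_penalty = 0.05 * average_daily_turnover_eur  # up to 5% of average daily turnover

print(f"Maximum fine:             €{max_fine / 1e9:.1f}bn")            # ≈ €18.0bn
print(f"Maximum periodic penalty: €{max_daily_penalty / 1e6:.0f}m/day")  # ≈ €41m per day
```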

Digital Markets Act (the “DMA”)

In contrast to the DSA (which applies horizontally), the DMA applies to large online platforms which act as “Gatekeepers” in digital markets. It seeks to regulate their behavior, imposing restrictions and obligations designed to ensure that Gatekeeper platforms behave fairly online and that the markets in which they operate remain fair and competitive.

Gatekeepers are defined (Article 3) as platforms which provide “core platform services” by reference to whether they:

  • have a significant impact on the internal market;
  • operate a core platform service which serves as an important gateway for business users to reach end users; and
  • enjoy an entrenched and durable position in their operations (or it is foreseeable that they will enjoy such a position in the near future).

The DMA reflects concerns among law makers around the dominance of ‘Big Tech’. EU Commissioners Margrethe Vestager and Thierry Breton, both of whom are spearheading the new legislation, have been vocal in their criticisms of Big Tech, commenting that “the commercial and political interests of a handful of companies should not dictate our future”.  

Under the DMA, platforms could face enormous fines of up to 10% of annual turnover (Article 26) unless they comply with the new rules. As a result, very large platforms are likely to be vocal in lobbying against the “Gatekeeper” designation as the DMA proposals progress through the European Parliament.

Comments and next steps

The Digital Services Act package of measures reflects growing legal and political momentum in Europe and around the world towards enhanced online platform accountability. With these developments the European Commission is seeking to become the global leader on tech regulation, in much the same way it has with data privacy through the GDPR[4]. While brand owners will cautiously welcome the proposals, it is clear that more can and should be done to protect them and their consumers.

In particular:

  • the burden of policing and enforcing intellectual property rights on the Internet continues to rest predominantly with rights holders, in light of the reactive nature of the notice and action mechanisms. Many brands would have preferred the Commission to go further by imposing proactive monitoring and enforcement obligations on platforms, backed by the threat of sanctions for non-compliance, in order to alleviate that burden.
  • whilst it is encouraging that the Commission has sought to address the problematic issue of repeat infringers on online marketplaces, it has not done so in sufficient detail. The requirement to suspend repeat infringers “for a reasonable period of time” is vague. The Commission has also stopped short of imposing a firm “staydown” obligation in respect of identical or equivalent infringing content identified following enforcement – an obligation which many brands believe not only chimes with the notice and action procedures imposed by Article 14 DSA but is also critical if those procedures are to operate effectively.

The DSA and DMA are still a long way from being enacted into law. Amendments are currently being tabled by the Internal Market Committee, with a vote expected to take place in the European Parliament in December 2021.

In addition, Big Tech is expected to lobby robustly against the proposals. Karan Bhatia, Google’s Vice President of Government Affairs and Public Policy, has said that Google is “concerned that [the rules] appear specifically to target a handful of companies and make it harder to develop new products to support small businesses in Europe”[5], giving a sense of where the focus of Big Tech is expected to lie.

The proposals also come at a crucial time in the transatlantic relationship, with the Biden administration now in post. It will be interesting to see how the US reacts to the EU seeking to regulate some of its most powerful companies to this extent.

Download “Three Strikes and Out”

How e-commerce platforms can protect consumers from repeat offenders

Our white paper provides brands, e-commerce platforms and legislators with data on the proportion of repeat infringements undertaken by the same sellers, who use key global online marketplaces and social media platforms to infringe intellectual property.

Learn how platforms can protect brands and consumers by implementing strong seller verification and a ‘three-strikes-and-out’ policy.


References

[1] Directive 2000/31/EC (the E-Commerce Directive)

[2] Stopping Harmful Offenders on Platforms by Screening Against Fakes in Ecommerce Act (the “SHOP SAFE Act”)

[3] INFORM Consumers Act

[4] General Data Protection Regulation (EU) 2016/679 (applicable across the European Union from 25 May 2018)

[5] “Instant View: U.S. tech firms face new EU rules for business practices” (Reuters, December 2020): https://www.reuters.com/article/us-eu-tech-rules-instant-view-idINKBN28P2CQ