Q&A: Digital Services Act: compliance and enforcement

March 2024  |  SPECIAL REPORT: DATA PRIVACY & CYBER SECURITY

Financier Worldwide Magazine, March 2024 Issue


FW discusses compliance and enforcement under the Digital Services Act with Ana Jankov and Daniel Nunn at FTI Consulting, and Benjamin Docquir at Osborne Clarke.

FW: Could you outline the main drivers behind the European Union’s (EU’s) Digital Services Act (DSA)? What activities fall under its scope?

Nunn: The underlying objective of the Digital Services Act (DSA) is to give consumers the ability to trust that the information they read online is lawful, appropriate to the user and credible, and has not been created or doctored with the intent to cause harm or otherwise bias their opinion. The DSA aims to protect consumers from illegal content, disinformation, misinformation and other forms of manipulation. It also ensures that platforms show consumers appropriate content, and that they are transparent about advertising and restricted-content decisions – with penalties for platforms that do not comply. These rules will provide obvious consumer benefits around elections and will reduce consumer harm through platform accountability. While similar guidelines and regulations have existed in the past, the DSA provides a newer, clearer set of rules to abide by. This may reduce confrontations between the European Union (EU) and large technology organisations of the kind previously pursued under antitrust legislation such as the US Sherman Antitrust Act and other instruments.

Docquir: The new digital age has brought many benefits to our lives, across communication, culture, and the exchange and trade of goods, services and content, among others. The EU has adopted the DSA to ensure that whatever is illegal offline is also illegal online and that digital services cannot be misused. Furthermore, the EU wants to ensure the security of online users and is prioritising the protection of fundamental rights while maintaining a fair and open environment for online platforms. The DSA introduces rules for intermediary services provided to recipients in the EU. It also has a wide scope of application and will apply, for example, to internet access providers, hosting services such as cloud computing and web-hosting services, domain name registrars, online marketplaces, app stores, collaborative economy platforms, social networks, content-sharing platforms, and online travel and accommodation platforms, to name a few of the entities covered.

Jankov: The landmark DSA can be seen as a step toward harmonising efforts to tackle illegal activity and misinformation online by providing assurance and rights to users as the internet evolves. Under the DSA, organisations will need to enhance their scrutiny, as tackling illegal content is often a complex, multifaceted aspect of risk management. Specific actions organisations may need to take under the law include operationalising the ability to receive illegal content alerts, identify actual illegal content, understand who is behind it, remove illegal content and cooperate with authorities. Digital providers offering services in the EU, whether they are established in the EU or elsewhere, will be required to follow the new rules on content moderation.

The landmark DSA can be seen as a step toward harmonising efforts to tackle illegal activity and misinformation online by providing assurance and rights to users as the internet evolves.
— Ana Jankov

FW: To what extent do DSA requirements and restrictions differ depending on the nature of the digital service being provided?

Docquir: The DSA makes a distinction between types of intermediary service providers. Intermediary service providers that are not hosting providers or online platforms have the fewest compliance requirements; additional obligations apply where a provider qualifies as a hosting provider or online platform. Providers of hosting services, including online platforms, need to consider additional requirements, such as implementing notice and action mechanisms, providing a statement of reasons when illegal or forbidden content has been removed, and notifying the judicial authorities of suspicions of criminal offences.

Online platforms need to put in place an internal complaint-handling system, and recipients may also bring disputes before a certified, external out-of-court body with the authority to settle them. In addition, platforms must ensure that when a so-called ‘trusted flagger’ notifies them of illegal content, these notifications are given priority and handled without delay. They must suspend services to recipients who frequently upload illegal content. Lastly, additional transparency and design obligations apply, such as ensuring the online interface is not misleading, that advertising is clear and concise, and that appropriate and proportionate measures are in place to ensure a high level of privacy, safety and security when minors use the platform.

Online platforms that allow consumers to conclude distance contracts have additional information requirements. They must obtain additional information from traders, and design and organise their online interfaces in a way that enables traders to comply with their obligations under applicable law, such as pre-contractual information and product safety. Further, these online platforms must inform consumers when, for example, they have purchased illegal products or services via the platform. Providers of very large online platforms (VLOPs) and of very large online search engines (VLOSEs) need to consider additional risk assessment obligations and take proportionate and effective measures to mitigate the risks identified. On top of these obligations, they must undergo an annual independent audit to assess their compliance. These platforms also have additional transparency obligations when they serve advertisements.

Jankov: Although the original policy rationale of the DSA was to regulate Big Tech companies, given their ownership and dissemination of vast amounts of data and significant network effects, the legislation evolved to encompass a wider range of digital services. In the final iteration of the law, size is not necessarily a decisive factor in applicability. The regulation applies to providers of a variety of digital services which fall under the definition of ‘intermediary services’. All intermediary service providers are bound by a basic set of obligations. First, establishing a single point of contact for communication with supervisory authorities, as well as a single point of contact for communication with service recipients. Second, designating a legal representative in the EU for providers of intermediary services which do not have an establishment in the EU. Third, describing in the provider’s terms and conditions any content-related restrictions, including information on content moderation policies, procedures, measures and tools. Lastly, publishing an annual report on content moderation, including information such as the number of orders received from the authorities, and describing any content moderation carried out on the provider’s own initiative. The DSA then gradually adds cumulative obligations for each tier of intermediary services.

Nunn: The European Commission (EC) has deployed the DSA asymmetrically to make costs and commitments proportionate to the services being provided, using a tiered system defined by eligibility criteria. This means that the larger a business is, the more obligations it must comply with. Very small platforms will be exempt from most DSA obligations. There are also nuances between the nature of the service provider, their obligations and, subsequently, their liability to police the information being distributed. Broadly, requirements are applied in greater depth to organisations with the greatest reach of information or content, with most focus on large online platforms that publish data and, ultimately, the largest search engines that may provide easy access to potentially harmful content. This system will make the largest online providers subject to the highest levels of scrutiny, and arguably more ‘trustworthy’ than small platforms trying to compete. This forms an interesting juxtaposition with the Digital Markets Act, which is trying to level that field.
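To make the cumulative tiering concrete, here is a minimal sketch in Python. The ‘tier’ labels are commentators’ shorthand rather than terms from the regulation, the function name and service-type strings are invented for illustration, and only the 45-million-user threshold for VLOP designation comes from the DSA itself.

    # Informal sketch: each DSA tier inherits the obligations of the tiers below it.
    VLOP_THRESHOLD = 45_000_000  # average monthly active EU users for VLOP/VLOSE status

    def dsa_tier(service_type: str, monthly_active_eu_users: int) -> str:
        """Map a service to its (informal) DSA obligation tier."""
        if service_type in ("online platform", "search engine") and \
                monthly_active_eu_users >= VLOP_THRESHOLD:
            return "tier 4: VLOP/VLOSE (risk assessments, independent audits)"
        if service_type == "online platform":
            return "tier 3: online platform (complaint handling, trusted flaggers)"
        if service_type == "hosting":
            return "tier 2: hosting service (notice and action, statements of reasons)"
        return "tier 1: intermediary service (transparency reports, points of contact)"

    print(dsa_tier("online platform", 50_000_000))  # lands in tier 4
    print(dsa_tier("hosting", 1_000_000))           # lands in tier 2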

FW: What obligations does the DSA place on online platforms and search engines? What does this mean from a compliance perspective, and how do companies need to respond?

Nunn: Information providers that offer intermediary services or search engines must comply with the DSA if they operate within the EU and have a significant number of consumers using, or being targeted by, their services. These parameters are similar to those established by the General Data Protection Regulation (GDPR), but with equal obligations for intermediaries in non-EU locations. Many companies needing to comply with the DSA will be outside the EU. As an example, UK companies providing intermediary services in Europe will almost certainly need to comply. But these companies will also fall under the UK’s new Online Safety Act, meaning dual compliance and significantly more cost and complexity to stay on the right side of the law. Understanding all of the regulatory obligations that an organisation must comply with – and how the DSA fits into that – is a critical starting point. Also typical of modern, complex regulation is the lack of clarity surrounding some requirements. Defining what the ‘targeting of activities’ in EU jurisdictions may involve is one such example of an area which may be more opaque than either the regulator or regulated are comfortable with.

Jankov: Given that online platforms and large search engines are classified in the highest tiers of DSA obligations, they will have the most stringent set of requirements to fulfil. Those additional obligations include more detailed transparency reporting and putting in place a notice mechanism to allow individuals and entities to flag potentially illegal content. Wide-ranging risk assessments to identify potential systemic risks – such as negative effects on fundamental rights, including risks stemming from algorithmic systems – and independent annual audits are additional obligations of VLOPs and VLOSEs. Assessing progress along the DSA implementation journey, and how the law applies to the business, is the starting point.

Docquir: Several additional requirements apply depending on how a company qualifies. This means that online platforms and search engines must consider the more general obligations of the DSA, in addition to the specific online platform rules, to be compliant. Generally, reviewing client interfaces and certain contractual documentation will be required. In some situations, compliance with these obligations will require online platforms and search engines to rethink the design of their platform – a process that some of them have already started but which is certainly challenging. For example, one specific requirement for online platforms is to refrain from incorporating dark patterns into their online interfaces. Platforms can address this by designing and organising their online interface in a way that does not deceive or manipulate their service recipients or materially distort or impair their ability to make free and informed decisions.

Defining what the ‘targeting of activities’ in EU jurisdictions may involve is one such example of an area which may be more opaque than either the regulator or regulated are comfortable with.
— Daniel Nunn

FW: Could you explain how the DSA’s obligations and liability rules apply to intermediary services? What exemptions from liability or ‘safe harbours’ exist under the legislation?

Jankov: The DSA maintains the basic liability rules laid down in the e-Commerce Directive. Mere conduit, caching and hosting services are subject to different liability exemptions. For example, hosting services are not liable for the content hosted on their service as long as they either do not know the content is illegal, or they promptly remove or block access to that content once they become aware that it is illegal. The liability exemption will, however, not apply to online marketplaces if an item or information is presented in a way that would lead consumers to believe that the online marketplace itself is providing those services or products. The DSA clarifies that intermediary service providers will not lose the benefit of the liability exemption if they carry out their own monitoring aimed at detecting illegal content. However, if a provider identifies illegal content in this way, it must promptly remove the content in order to benefit from the liability exemption.

Docquir: Intermediary service providers are, in principle, not liable under the ‘mere conduit’ exemption if they transmit, in a communication network, information provided by a service recipient, or provide access to a communication network, provided that they do not initiate the transmission, select the recipient of the transmission, or select or modify the transmitted information. Under the ‘caching’ exemption, they are also not liable for the automatic, intermediate and temporary storage of transmitted information, performed for the sole purpose of making the information’s onward transmission to other service recipients more efficient or secure, provided that they do not modify the information, comply with conditions on access to it, keep it up to date, do not interfere with the lawful use of technology to obtain data on its use, and remove or disable access to it once the initial source has been removed or disabled or the authorities have ordered its removal or disablement. Under the ‘hosting’ exemption, they will not be liable for information stored at the request of a service recipient, provided that they have no knowledge of illegal activity or content and, upon obtaining such knowledge, act expeditiously to remove or disable access to the illegal content. It should also be noted that the DSA includes a ‘good Samaritan’ principle, meaning that intermediary service providers can still rely on these liability exemptions or safe harbours when conducting voluntary investigations, implementing measures to tackle illegal content, or taking measures necessary to comply with applicable legislation.

Nunn: One of the concepts embraced by the EC when drafting the DSA was to include what is known informally in the US as a ‘good Samaritan’ clause, alongside the conditional liability ‘safe harbours’ carried over from the 2000 e-Commerce Directive. These provide some exemption for providers that proactively seek to address harmful content and non-compliance but are unwittingly compromised by bad content or a malicious author. Specifically, providers will be exempt from liability when they purely transmit information from one point to another with no other involvement, when they have no knowledge of, or reason to suspect, that they are storing harmful data, and when they respond as quickly as possible to remove harmful data after being made aware of it. As with other parts of the DSA, it is as yet unclear how these exemptions will be applied in practice, but their introduction is an attempt to encourage good behaviour among online service operators.

FW: In the event of non-compliance, how significant are the penalties enshrined in the DSA?

Docquir: In the event of non-compliance with the obligations of the DSA, a fine of up to 6 percent of annual worldwide turnover in the preceding financial year can be imposed on the intermediary service provider. The responsibility for overseeing the rules outlined in the DSA will be divided between the EC and national authorities, which are required to designate competent authorities. The DSA empowers the EC with supervisory authority, primarily over VLOPs and VLOSEs, which includes investigatory powers and the ability to impose fines. Penalties may be imposed more expeditiously at a cross-border level.

Nunn: The EC can apply fines of up to 6 percent of an organisation’s worldwide annual turnover if there is evidence of a breach of DSA obligations, failure to comply with interim measures or a material breach of commitments. There is also the potential for a daily fine of 5 percent of the organisation’s average daily global revenue for continued non-compliance. To put these penalties in context, organisations can be fined up to 4 percent of global revenue for non-compliance with the GDPR, and fines reaching over $1bn were seen in 2023 for some large organisations. In reality, only an extremely serious breach would warrant the maximum punishment from the EC. Even so, with some of the world’s largest online providers reaching $1bn in daily global revenue, providers should not risk the consequences of non-compliance.
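To illustrate the scale of the exposure, here is a minimal back-of-the-envelope calculation in Python. The 6 percent and 5 percent ceilings are those cited above; the $100bn turnover figure is invented purely for illustration.

    # Hypothetical figures; the 6% and 5% ceilings are those set out in the DSA.
    annual_turnover = 100_000_000_000  # invented: $100bn worldwide annual turnover

    max_breach_fine = 0.06 * annual_turnover       # ceiling for a one-off fine
    avg_daily_turnover = annual_turnover / 365
    max_daily_penalty = 0.05 * avg_daily_turnover  # ceiling per day of continued breach

    print(f"Maximum one-off fine:  ${max_breach_fine:,.0f}")    # $6,000,000,000
    print(f"Maximum daily penalty: ${max_daily_penalty:,.0f}")  # about $13.7m per day
    print(f"30 days non-compliant: ${30 * max_daily_penalty:,.0f}")  # over $410m more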

Jankov: For intermediary services, the DSA has established a network of digital services coordinators – independent authorities responsible for supervision in their member states and for determining penalties under national laws. Apart from fines for non-compliance, companies in scope should not underestimate reputational risk. Although the reputational consequences of making missteps under the DSA are as yet unknown, it is expected that enforcement efforts and sanctions will attract widespread media coverage and attention.

FW: How important is it for companies to carry out a risk assessment to determine their potential exposure to the DSA?

Jankov: Developing an adequate compliance methodology is crucial, given the reach and potential penalties of violating this complex law. Periodic follow-up assessments are also crucial to adapt to changes that may occur over time. This may be the case if, for example, an organisation introduces new intermediary services triggering new or expanded obligations, or if the number of active platform users increases, causing a potential shift into the VLOP category with additional obligations.

Nunn: Identifying whether an organisation falls under the DSA is critical at an early stage. Article 1 provides an outline of the types of organisations that need to comply, but organisations should seek the advice of a legal specialist for assurance. Most importantly, organisations will need to assess the services they offer which may fall under the Act. Determining whether offerings will meet the definition of intermediary services – now or in the future – is key to determining DSA compliance requirements, in addition to clarifying elements such as subscription size and geographic coverage. However, there should be a clear delineation between a DSA compliance assessment and the risk assessment required of large operators to validate their processes and techniques under article 34 of the Act. Article 34 outlines a far more complex requirement, and applies only to the larger organisations in tier four.

Docquir: Considering the severe sanctions that can be imposed on an intermediary service provider, it is crucial for companies to verify whether they fall into this category and what compliance provisions apply. Further, as the DSA distinguishes between different categories, depending on the services offered, companies must assess whether one of their services would fall within the scope of another category and make sure all requirements of both categories are fulfilled. Even if at first glance a company’s main services are outside the scope, it could still be subject to DSA requirements to some extent, for example if one service, which is not minor or ancillary, falls within a specific DSA category.

Companies should start by verifying whether the DSA applies to them and to what extent.
— Benjamin Docquir

FW: What advice would you offer to companies on specific actions they may need to take to avoid penalties? What considerations do they need to make with regard to platform design, for example, including how advertisements are shown, and traders and consumers are dealt with?

Nunn: There are several key areas that organisations will need to consider in order to comply with the DSA. First is governance. The DSA requires a number of designated contacts, with processes and procedures to govern the operation of services falling in scope of the Act. Second is operations. Some organisations will need to consider how they have structured their operations, and possibly the design or functions of their services. The DSA clearly prohibits certain behaviours – such as dark patterns – and there will be no way of evading this ‘cornerstone’ principle. The third area is systemic risks. Critical service providers, or those with a significant market share, will also need to consider whether their services could present a systemic risk should they be misused. This will only affect the largest or more specialised online providers, but may be the hardest to resolve in the event that compliance is questioned.

Docquir: Companies should start by verifying whether the DSA applies to them and to what extent. This assessment should be conducted regularly, especially when the company expands, introduces additional services or makes changes to existing services. After identifying the relevant requirements of the DSA, a company should draft a roadmap of priority actions to take. This may include reviewing terms and conditions, establishing a single point of contact for users and authorities, or redesigning the online interface to comply with the DSA’s transparency obligations. The advertisement transparency requirement may, for example, be addressed by adding an icon with the word ‘ad’ next to any form of online advertising on their platform.
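As a purely illustrative sketch of that labelling idea, the Python snippet below shows how a platform might attach a visible ‘Ad’ marker, together with the advertiser’s identity, when rendering a feed. The FeedItem and render_item names are hypothetical; a real platform would implement this in its own front-end stack.

    from dataclasses import dataclass

    @dataclass
    class FeedItem:
        content: str
        is_ad: bool = False
        advertiser: str = ""

    def render_item(item: FeedItem) -> str:
        """Render one feed item as HTML, marking ads with a visible label."""
        if item.is_ad:
            # Visible 'Ad' marker plus the party on whose behalf the ad is shown.
            label = f'<span class="ad-label">Ad: {item.advertiser}</span>'
            return f"<article>{label}<p>{item.content}</p></article>"
        return f"<article><p>{item.content}</p></article>"

    feed = [
        FeedItem("A friend's holiday photos."),
        FeedItem("Try our new running shoes!", is_ad=True, advertiser="Acme Sports"),
    ]
    print("\n".join(render_item(i) for i in feed))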

Jankov: Organisations should develop compliance and risk management methodologies, taking into account any content moderation policies or methods they might already apply. Although under the DSA there is no general obligation for intermediary service providers to monitor the information they transmit or store, businesses should assess the possibility and benefits of voluntarily monitoring their content. For active monitoring, organisations can leverage machine learning tools to analyse content. These algorithms can identify potential issues early on and automatically flag them for analysis. Organisations should also consider leveraging digital risk assessment tools for DSA compliance. Carrying out mock scenario exercises which test and train processes, plans and people helps to build resilience. These exercises should be informed by reputational risk mapping, which identifies the hypothetical scenarios for which there is the most value in testing responses.
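By way of a toy illustration of such voluntary monitoring, the sketch below flags content for human review, with a simple keyword score standing in for the machine learning classifier described above. The term list, weights and threshold are all invented for the example.

    from dataclasses import dataclass

    # Invented weights standing in for a trained classifier's output.
    SUSPECT_TERMS = {"counterfeit": 0.6, "unlicensed firearm": 0.9, "stolen": 0.5}
    REVIEW_THRESHOLD = 0.5  # invented: route anything at or above this to human review

    @dataclass
    class Post:
        post_id: str
        text: str

    def risk_score(post: Post) -> float:
        """Crude stand-in for an ML model: sum weights of matched terms, capped at 1."""
        text = post.text.lower()
        return min(1.0, sum(w for term, w in SUSPECT_TERMS.items() if term in text))

    def triage(posts):
        """Return (post_id, score) pairs whose score crosses the review threshold."""
        scored = [(p.post_id, risk_score(p)) for p in posts]
        return [(pid, s) for pid, s in scored if s >= REVIEW_THRESHOLD]

    posts = [Post("a1", "Selling counterfeit watches cheap"),
             Post("a2", "Lovely sunset today")]
    print(triage(posts))  # [('a1', 0.6)] -> queued for a moderator, not auto-removed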

FW: With the majority of the DSA’s operative provisions coming into force on 17 February 2024, do you believe companies are moving quickly enough to ensure their compliance efforts are on track?

Docquir: Many companies that qualify as intermediary service providers will have prepared themselves sufficiently and taken the necessary measures to comply with the DSA well in advance of its provisions coming into force. However, given the high number of new EU laws, we believe the DSA is not currently a priority for many companies, and insufficient measures have been taken to comply with its requirements. Further, companies should be aware that it is necessary to continuously monitor whether they fall, or in the near future will fall, within the scope of the DSA, so they can take steps to comply in due time.

Jankov: We have seen many Big Tech companies moving swiftly to implement DSA requirements and reorganise their internal capacities to support compliance. However, there are many intermediaries that are not Big Tech players that may not be fully aware of their obligations. To avoid inadvertently violating the law, all providers of digital services should carefully assess whether the DSA applies to them and implement a compliance framework accordingly. Once a robust compliance and preparedness framework is in place, it is also vital to test process effectiveness and team readiness to react to the range of scenarios that could arise as a result of the DSA.

Nunn: Organisations designated as VLOPs, with over 45 million monthly active users in the EU, have already been onboarded to the early requirements of the DSA. When the EC began building its transparency database last year, organisations designated as VLOPs needed to submit data immediately. The next milestone date is 17 February, when all organisations that fall under the DSA will have to submit data, including decisions around content moderation. This is an important milestone because, while the largest organisations have been preparing for the new requirements for months, smaller organisations – especially in industries with little previous regulation and subsequently limited skills and resources – have been slower to react and may not yet realise the impact of compliance on their daily operations. But the clock is ticking, and the EC has been clear that it expects compliance.

 

Ana Jankov is a senior director at FTI Technology based in Brussels. She is a data protection lawyer with 15 years of experience. She has worked as an attorney at top-tier law firms, including as partner in a major full-service law firm. She focuses on the GDPR and advises on its application and the implementation of privacy programmes. Her work includes data privacy due diligence and gap analysis, which includes drafting data processing agreements and advising on cross-border data transfers. She can be contacted on +32 (4) 6096 2318 or by email: ana.jankov@fticonsulting.com.

Daniel Nunn is a seasoned threat and risk management expert with extensive experience addressing diverse cyber security needs, from policy and compliance management to operational delivery and data-led transformation programmes. Formerly a senior vice president at Bank of America, he played a crucial role in creating a global threat defence function, garnering regulatory praise and US Patent Office recognition. Prior to this, he served as principal cybersecurity technical specialist at the Financial Conduct Authority (FCA). He can be contacted on +44 (0)20 3077 0353 or by email: daniel.nunn@fticonsulting.com.

Benjamin Docquir leads the IP, IT & data team at Osborne Clarke in Brussels. His expertise encompasses technology law, data privacy and digital regulation. His practice is a mix of transactional, advisory and contentious work for both international and Belgian clients. He supports clients through the many faces of digital projects. He is a recognised expert in data protection and digital regulation, with hands-on experience in the life sciences and financial services industries. He can be contacted on +32 (2) 515 9336 or by email: benjamin.docquir@osborneclarke.com.

© Financier Worldwide

