Antitrust

What to expect during dawn raids under the DMA, the DSA, and the draft AI Act

In light of recent news of probes under the DMA and the DSA, and the growing interest in enforcement of the draft AI Act, we explore the investigatory powers available to the European Commission and national authorities under these regulations, and more specifically how the enforcing authorities can conduct dawn raids to collect information about practices infringing them. This note examines what triggers such inspections, who can be inspected, and how these inspections differ from what we know from competition law enforcement.

 

Introduction & key observations for those subject to the new European regulations

The digital market is rapidly evolving, and businesses must continually adapt to keep up with the latest regulations and requirements. With the recent introduction of the Digital Markets Act (“DMA”), the Digital Services Act (“DSA”) and the draft Artificial Intelligence Act (“AI Act”) in the European Union (“EU”), companies operating in the digital space are facing new challenges that require careful consideration and compliance.

The DMA, the DSA and the proposal for the AI Act are three significant pieces of EU legislation that confer new and extensive powers on the European Commission (“Commission”) as well as on local competition and regulatory authorities. One of the enforcement mechanisms under these acts is the power to conduct on-site inspections (dawn raids). Many of the investigative powers resemble those of the Commission under competition law. They include the ability to search for incriminating documents, interview staff members, and request the production of documents and digital data. Similarly, the rules established under competition law concerning the confidentiality of business information, the presence of external counsel, legal professional privilege and other fundamental rights of an inspected company remain valid under the new regulations.

On the other hand, the catalogue of inspection powers is more detailed, as it reflects the specific objectives of the new regulations. For instance, enforcing these regulations may require a better understanding of a company’s business model, its procedures and goals, its use of IT systems and algorithms, and how it processes data.

In summary, companies subject to the new regulations should carefully consider adjusting their internal dawn raid policies. For the first time, EU law explicitly enables the Commission to gain access to algorithms, which means that dawn raids conducted under the new regulations will require appropriate technological understanding from the authorities, and from the legal departments that must identify the limits of an inspection mandate.

 

Scope of the Acts: DMA, DSA, and AI Act

The scope of the new regulations sets out the purposes for which the authorities can conduct dawn raids. Below, we briefly describe these pieces of legislation.

The DMA, whose obligations for designated gatekeepers became fully applicable on 6 March 2024, aims to prevent large online platforms that connect consumers with content, goods and services from abusing their market power. It does so through strict regulation of big technology companies, the designated gatekeepers of the digital economy.

The DSA applies in all EU Member States as of 17 February 2024. It regulates online intermediaries and platforms to prevent illegal and harmful activities online and the spread of disinformation. It is designed to ensure user safety, protect fundamental rights, and create a fair and open online platform environment by means of, among other things, rules on content moderation, online targeted advertising, online interface settings and online marketplaces.

On 13 March 2024, the European Parliament approved the proposal for the AI Act, which is designed to:

  • improve the functioning of the EU internal market by prescribing clear requirements and obligations for the reliable and ethical use of AI;
  • balance innovation with the protection of individual rights and interests;
  • ensure safe AI systems on the EU market;
  • enhance the monitoring and enforcement of fundamental rights and safety requirements; and
  • propose enforcement actions after an AI system is placed on the market.

The AI Act essentially regulates the creation and use of AI systems throughout the EU.

 

Application of the Acts

The new regulations apply to a host of different entities.

The DMA applies to core platform services provided or offered by so-called gatekeepers. A gatekeeper is an undertaking that has a significant impact on the internal market and provides a core platform service, and that has been designated as such pursuant to Article 3 of the DMA. The Commission has wasted no time, thus far designating six gatekeepers: Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft. All designated gatekeepers can therefore be subject to inspections under the DMA.

Even though the DSA applies to all online intermediaries and platforms in the EU, inspections are most relevant for very large online platforms (VLOPs) and very large online search engines (VLOSEs). Effective enforcement of the DSA may, however, also require inspections of other economic entities cooperating with these providers (see, to that effect, Art. 67(1) DSA).

The AI Act will apply to providers, users (deployers), importers, distributors, product manufacturers, authorised representatives and affected persons within the EU, as well as to entities in third countries that place AI systems on the EU market, put them into service there, or use general-purpose AI models. See more in this article on ‘Artificial Intelligence and Competition law: shaping the future landscape in the EU’.

 

Key investigative powers

The DMA and the DSA

The Commission is the sole authority competent to enforce the DMA. As part of its duties under the DMA, pursuant to Art. 23 DMA, the Commission may conduct inspections of any undertaking or association of undertakings. Such inspections may involve:

  • entering the premises of the relevant undertaking;
  • examining the books or records related to the business, irrespective of the medium on which they are stored;
  • taking copies of such books or records;
  • demanding access to and explanations about a company’s organisation, functioning, IT systems, algorithms, data handling and business practices; and
  • sealing any business premises and records to the extent and for the duration of the inspection.

Details of the Commission’s powers are set out in Arts. 20-23, 30 and 38 DMA.

Should the Commission suspect a DSA infringement, under Art. 69(1) DSA it can conduct all necessary inspections at the premises of the provider of the very large online platform or very large online search engine concerned, or of another entity as referred to in Art. 67(1) DSA. There are two types of inspections under the DSA: (1) those based on a written authorisation (Art. 69(4) DSA) and (2) those based on a decision (Art. 69(6) DSA). The two types differ as to whether the business is under a legal obligation to submit to the inspection. In addition, the Commission can adopt an implementing regulation under Art. 83 DSA setting out the detailed arrangements for the conduct of such inspections (unlike under the DMA, where Art. 46 does not provide for any such implementing regulation).

Under the DSA, EU Member States are required to appoint a Digital Services Coordinator (DSC) with extensive powers of investigation. Under Art. 51 DSA, in respect of providers of intermediary services falling within their jurisdiction, the national Digital Services Coordinators have the power to carry out, or to request a judicial authority in their Member State to order, inspections of premises in order to examine, seize, take or obtain copies of information relating to a suspected infringement. Like the Commission, the national Digital Services Coordinators can also require any member of staff or representative of those providers to give explanations on any information relating to a suspected infringement.

 

The AI Act

In relation to the governance of the AI Act, a new body with powers within the Commission has been created: the European AI Office (“AI Office”). The AI Office will oversee the enforcement and implementation of the AI Act together with the Member States. It was established by a Commission decision of 24 January 2024.

However, Member States hold a key role in the application and enforcement of the AI Act. Each Member State must designate one or more national competent authorities to supervise its application and implementation, as well as to carry out market surveillance activities. The current draft of the AI Act gives these so-called market surveillance authorities access to business data in order to monitor compliance.

Under Art. 63(2) of the draft AI Act, these authorities will annually report “any information identified in the course of market surveillance activities that may be of potential interest for the application of Union law on competition rules” to both the Commission and the national competition authorities. This obligation to share information applies regardless of whether there are suspicions of anti-competitive conduct. It has important implications for competition law, particularly for companies marketing high-risk AI systems,[1] as those companies have to comply with various obligations and risk being investigated by the national market surveillance authorities.

Investigations can also take place to ensure the safe testing of high-risk AI systems in real-world conditions outside AI regulatory sandboxes. Under Art. 54a(2) of the draft AI Act, providers or prospective providers may test high-risk AI systems in real-world conditions at any time before placing them on the market or putting them into service, either on their own or in partnership with one or more prospective deployers. Under Art. 54a(5a), in conjunction with Art. 63a of the draft AI Act, Member States must confer on their market surveillance authorities the power to require the provision of information, to carry out unannounced remote or on-site inspections, and to perform checks on the development of testing in real-world conditions and on related products.

 

Different regulations, common challenges

It remains to be seen how frequently such dawn raids/inspections will be conducted and how the authorities will approach the notion of whether they have reasonable grounds for suspicion.

Should an inspection occur, the inspectors can request, among other things, access to and explanations about the entity’s organisation, functioning, IT systems, algorithms, data handling and business practices (Art. 23(2)(d) DMA; Art. 69(2)(d) DSA). The Commission can also be accompanied by additional persons (auditors and experts, or members of the national competent authorities) when conducting inspections.

Under Art. 26(2) DMA and Art. 72(2) DSA, the Commission can appoint auditors or experts. They assist the Commission in monitoring the effective implementation of, and compliance with, the relevant provisions and provide the Commission with specific knowledge. When appointing auditors, the Commission must ensure that they are rotated so as to guarantee their independence and impartiality (Rec. 85 DMA; Rec. 143 DSA). However, a company under investigation should be aware that the role of auditors and experts has not yet been precisely defined: neither the DMA nor the DSA sets out criteria for assessing the required minimum expertise or independence.

 

The AI Act – a new frontier in cartel enforcement?

With the AI Act, national competition authorities will gain new ways to access company information. Because market surveillance authorities will have the power to conduct compliance checks under the AI Act and will be obliged to share any relevant information with the (national) competition authorities, the AI Act will likely lead competition authorities to check companies’ compliance with competition rules more frequently, irrespective of any suspicion of a breach. It also means that competition authorities will gain access to sensitive information that would otherwise not be in their possession.

Overall, authorities are likely to launch investigations more often and become more proactive. As Margrethe Vestager, European Competition Commissioner, stressed back in 2017, competition authorities should be alert to cartels that use software to operate more efficiently. If such tools enable companies to enforce or discipline their cartels more effectively, this could be reflected in the fines they may receive. The use of self-learning algorithms, such as those in search engines and self-driving cars, also presents challenges. The complexity of these algorithms makes it difficult for people to understand how an (unintended) outcome was arrived at. However, there is a strong belief in the EU that companies should be held accountable for the consequences of the algorithms they use. This means that businesses must understand how these algorithms work and what impact they may have on competition, and also what risks they run in relation to enforcement of the new EU regulations.

 

[1] The AI Act sets out rules that differ depending on the risk level posed by the AI system: minimal or low risk (few obligations), high risk (e.g. AI systems used in critical infrastructure or education), unacceptable risk (such as certain biometric categorisation systems) and specific transparency risk.

Story originally seen here

Editorial Staff


