ByteDance’s DMA Compliance Workshop – The Power of No: A Walk in the Park
The Digital Markets Act (DMA) became fully applicable on 7 March 2024. By that date, the gatekeepers had issued their compliance reports documenting their technical solutions and implementation of the DMA’s provisions under Article 11 DMA, as well as their reports on consumer profiling techniques as required under Article 15 DMA (see here).
I will be covering the workshops organised by the European Commission (EC) for each of the gatekeepers under The Power of No series, where representatives of the undertakings meet with stakeholders to discuss their compliance strategies and solutions. This blog post covers the fifth workshop organised by the EC, assessing ByteDance’s compliance solutions. Full reviews of the other workshops, on Apple’s, Meta’s, Amazon’s and Alphabet’s participation, can be found here.
One core platform service and the missed deadline for compliance
ByteDance, the parent company of the video-sharing platform TikTok, was quite upfront about the expectations that the European Commission should place on it in terms of effective compliance. In its view, given that the gatekeeper was designated with respect to only one of its services, namely its online social networking service TikTok, most of the DMA’s provisions are not applicable to its existing business model, nor is their application justified by any conduct it has historically engaged in. As ByteDance’s representative put forward in the introductory remarks, TikTok is the only undertaking among those designated by the EC that has not previously been subject to sanctioning proceedings on EU competition law grounds.
Those same arguments build on the narrative that ByteDance presented before the EC to rebut the quantitative presumption of designation (as pointed out here and here), as well as on the grounds of its appeal before the General Court against that same designation decision (see here). According to the gatekeeper, TikTok is an entertainment platform acting as a challenger to the incumbent gatekeepers in the social networking space.
Even though those arguments may gain some traction with the General Court at the appeal stage, the truth is that ByteDance’s compliance obligations are no less demanding as a result. The General Court explicitly rejected ByteDance’s request to suspend the effects of certain provisions of the DMA ahead of the compliance deadline of March 2024. ByteDance was therefore required to fully comply with the provisions applicable to online social networking services by that date.
Notwithstanding this, ByteDance’s presentation of its compliance solutions fell short of the expectations that both the EC and stakeholders held with regard to effective compliance. At the time of writing, ByteDance has not yet produced the publicly available (or, for that matter, confidential) version of its report on the audit of its consumer profiling techniques, enshrined in Article 15 DMA, that would satisfy the formal requirement of meeting the compliance deadline. In fact, it recognised as much in its intervention at the compliance workshop.
Regarding compliance with the substantive obligations under Articles 5 and 6 DMA, ByteDance provided little guidance on the substantive efforts that have (no doubt) gone into designing its technical implementation of the regulatory framework. The compliance workshop revolved around two panels organised around two sets of provisions: on the one side, the obligations aimed at promoting contestability by enabling data portability and data access under Articles 6(9) and 6(10) DMA; on the other, the prohibitions enshrined in the regulation relating to data combination under Article 5(2) DMA.
Data portability and data access: straightforward verification of authorised third parties
ByteDance went through its compliance report – nearly word for word – when setting out its compliance plan for Articles 6(9) and 6(10) DMA. In that regard, ByteDance’s compliance workshop was more a replica of the discussion already contained in its compliance report than a substantive engagement with the stakeholders in the room.
Compliance with Article 6(10) DMA was quite straightforward for the gatekeeper, insofar as it argued in the workshop that it already complied with the provision. To this end, ByteDance pointed to its current analytics functionalities, which already enable business users to access granular data on their interactions and in-app user engagement.
In terms of Article 6(9) DMA, ByteDance defended its commitment to data portability via its existing tools. For instance, in the gatekeeper’s view, the possibility for end users to download their videos and share them to other platforms directly from its own app was in itself a significant step towards greater portability. In a similar vein, TikTok maintained that its already existing Download Your Data (DYD) functionality substantially catered for the needs of end users and business users seeking to port their data.
In response to the DMA’s application, however, it has introduced two sets of changes. First, it has improved data access speeds in its DYD tool, so that selected categories of data can be made available within seconds or minutes and the full data package within an estimated range of minutes to hours. It has also significantly improved the granularity with which end users can select the types of data to be ported through the DYD tool. Second, it has introduced the Data Portability API, built on the foundation of the DYD infrastructure, to facilitate the porting of data actioned by third parties who are authorised directly by end users.
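To make the granularity point more concrete, the selection step of the DYD tool can be pictured along the following lines. This is a purely illustrative Python sketch under my own assumptions: the category names and the request structure are invented and do not reflect the actual tool.

```python
# Illustrative sketch only: category names and the request structure are
# hypothetical assumptions, not taken from TikTok's actual DYD tool.
from dataclasses import dataclass, field

# Hypothetical data categories an end user might tick in the DYD interface.
AVAILABLE_CATEGORIES = {"profile_info", "activity", "comments", "posted_videos"}

@dataclass
class DydExportRequest:
    """An end user's export request with category-level granularity."""
    selected_categories: set = field(default_factory=set)

    def validate(self) -> None:
        unknown = self.selected_categories - AVAILABLE_CATEGORIES
        if unknown:
            raise ValueError(f"Unknown categories requested: {unknown}")

# A narrow request (a few categories) is, per ByteDance's stated timelines,
# expected to be fulfilled faster than a full export of every category.
request = DydExportRequest(selected_categories={"profile_info", "comments"})
request.validate()
```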
The Data Portability API, according to ByteDance, would work as follows: the third party first has to invest in and build its own product and app in order to be able to call the API for the end user’s ported data. In other words, the third party seeking access to data to improve contestability on the market would have to first invest in a solution that can only be completed through the ported data, and then follow the steps to gain authorisation directly from ByteDance. In principle, ByteDance maintained that the authorisation process applied to third parties serves only two particular purposes: to verify that the third party is who it claims to be (to prevent bad actors from gaining access to end user data) and to make sure that the data it is calling the API for corresponds to the use case it will be applied to. To that end, ByteDance requires third parties to complete a dedicated form for the former purpose and to submit mock-ups of the user experience within their app or website so that verification can be completed.
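Based on that description, the information collected at the verification stage could be sketched roughly as follows. The field names are hypothetical and merely illustrate the two stated purposes of the check (identity and use-case matching); they are not ByteDance’s actual form.

```python
# Hypothetical sketch of a third party's verification submission; the fields
# are illustrative assumptions mapping onto the two purposes ByteDance named.
from dataclasses import dataclass
from typing import List

@dataclass
class VerificationSubmission:
    developer_name: str                   # who the third party claims to be
    legal_entity: str                     # supports the identity check against bad actors
    use_case_description: str             # what the ported data will be used for
    requested_data_categories: List[str]  # should correspond to the stated use case
    ux_mockup_urls: List[str]             # mock-ups of the app/website user experience

    def is_reviewable(self) -> bool:
        """Both stated purposes (identity and use-case matching) must be
        documented before verification can be completed."""
        return all([
            self.developer_name,
            self.legal_entity,
            self.use_case_description,
            self.requested_data_categories,
            self.ux_mockup_urls,
        ])
```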
When asked by several workshop participants, ByteDance confirmed that the initial verification process for third parties’ access to the Data Portability API would take approximately 3 to 4 weeks to complete. Once verification is completed, third parties will be able to perform all types of calls to the Data Portability API, whether a one-off portability request or repeated requests over time. In contrast to the limitations imposed by other gatekeepers, ByteDance will enable third parties to perform recurring requests to the API for a period of one year from the end user’s initial authorisation. In response to stakeholders’ questions, ByteDance also confirmed that third parties outside of the EU will be able to retrieve data in this manner, although it is as yet unclear whether data from end users located outside of the EEA will be accessible via the Data Portability API.
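Putting these pieces together, a verified third party’s interaction with the API might look roughly like the following sketch. The endpoint URL, request parameters and token handling are my own assumptions for illustration; they do not reflect the actual Data Portability API.

```python
# Minimal sketch of how a verified third party might call a portability
# endpoint with an end user's authorisation. The URL, parameters and token
# handling are hypothetical; this is not TikTok's actual API surface.
import datetime
import requests

PORTABILITY_ENDPOINT = "https://api.example-gatekeeper.com/v1/portability/export"  # hypothetical

def fetch_ported_data(access_token: str, categories: list) -> dict:
    """Performs a single portability request on behalf of an authorised end user."""
    response = requests.post(
        PORTABILITY_ENDPOINT,
        headers={"Authorization": f"Bearer {access_token}"},
        json={"data_categories": categories},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

def can_make_recurring_request(initial_authorisation: datetime.date) -> bool:
    """ByteDance stated that recurring requests are allowed for one year from the
    end user's initial authorisation; afterwards, re-authorisation would be needed."""
    return datetime.date.today() < initial_authorisation + datetime.timedelta(days=365)

# Example: a periodic sync could keep pulling data while the one-year window is open.
if can_make_recurring_request(datetime.date(2024, 3, 7)):
    data = fetch_ported_data("user-granted-token", ["posted_videos", "activity"])
```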
Pantomime compliance: a blind approach towards Article 5(2) DMA
As I have already argued in previous posts and in one of my most recent papers, squaring the circle of the de facto prohibition enshrined in Article 5(2) is no mean feat for the gatekeepers. It is true that those prohibitions may be overridden where the end user grants effective consent to those same processing operations, but demonstrating that the choice screens are, indeed, effective in the sense of Articles 4(11) and 7 GDPR places a high burden of proof on the undertakings. However, in my view, ByteDance did not even make an effort to substantively comply with the provision. Perhaps this strategy reflects the fact that ByteDance had pinned its expectations on obtaining a suspension of the provision’s effects through its interim measures request before the General Court.
Be that as it may, the gatekeeper came wholly unprepared to demonstrate its effective compliance with Article 5(2) DMA. In a similar vein to Article 6(9) DMA, the gatekeeper considers that it already complied with the provision prior to the regulation’s application, for two fundamental reasons.
First, because it does not have different designated CPSs across which to leverage personal data within its ecosystem. This argument may be persuasive at face value, but it is misleading once one gets to grips with the provision. Although TikTok was the only CPS included in the first round of the EC’s designation decisions in September 2023, Article 5(2) does not apply solely to CPSs, but also to the combination of data across the gatekeeper’s own proprietary services and with data from third parties. On this front, ByteDance was unable to respond to several questions posed by different stakeholders trying to work out its compliance strategy. For instance, when asked whether it considered that TikTok Ads remains separate from its TikTok CPS, ByteDance’s representatives were unable to respond, despite the fact that their compliance report advocates for their integration and that the gatekeeper subsequently notified the EC of TikTok Ads as an additional CPS in early March 2024. Given that TikTok would thus be the all-encompassing CPS comprising all of its services, ByteDance also failed to respond to stakeholders when asked about its compliance solutions for implementing technical controls and safeguards to silo data by default from, at the very least, data coming from third-party services. To this end, ByteDance has only proposed to fine-tune the sign-in flow for end users accessing its video-editing tool CapCut. On this note, ByteDance dodged the question of whether data stored on CapCut from other social networking services would be siloed from TikTok’s personal data.
Second, ByteDance argued that its existing prompts were already quite robust in providing effective choices to consumers. However, ByteDance only put forward a single prompt relating to the end user’s capacity to authorise personalised ads as opposed to generic ads on its social networking service. The immediate impact of the end user not consenting through this particular prompt is that advertising would no longer be based on a personalised experience, but on high-level categories such as the end user’s language and region. The prompt, therefore, has no bearing on compliance with the rest of the prohibitions contained in Article 5(2) on cross-using, processing or combining personal data across CPSs and other services.
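To illustrate the narrow scope of that choice, the logic of the prompt can be reduced to something like the following sketch. This is my own conceptual illustration, not ByteDance’s implementation: it simply shows that the choice toggles the basis for ad targeting and nothing else.

```python
# Conceptual sketch (my own illustration, not ByteDance's implementation) of why
# the single prompt only narrows ad targeting rather than addressing the other
# Article 5(2) prohibitions on combining or cross-using personal data.
from dataclasses import dataclass

@dataclass
class EndUserChoices:
    personalised_ads_consent: bool  # the only choice the prompt actually captures

def select_ad_targeting(choices: EndUserChoices, language: str, region: str) -> dict:
    """Declining the prompt falls back to high-level signals such as language
    and region; it does not govern any other data processing."""
    if choices.personalised_ads_consent:
        return {"targeting": "personalised"}
    return {"targeting": "contextual", "signals": {"language": language, "region": region}}

# Note: nothing in this choice addresses the combination of TikTok data with data
# from other ByteDance services or third-party sources, which is the broader
# scope of Article 5(2) DMA.
print(select_ad_targeting(EndUserChoices(personalised_ads_consent=False), "en", "IE"))
```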
As regards the design and display of the prompt, ByteDance was unable to answer questions from stakeholders asking whether the consumer could simply ignore the prompt and whether personalised advertising would apply by default in the app. Similarly, ByteDance did not indicate that it had previously tested the choices presented to end users so as to design the presentation of consent in an effective manner, in line with the exemption under Article 5(2) DMA.
Key takeaways
ByteDance’s compliance workshop was one of the most blatant demonstrations that gatekeepers may stall compliance with the DMA if they believe that the Courts will be on their side at the end of the discussion. While it is true that the gatekeeper proposed a couple of technical solutions to comply with the obligations of the regulatory framework, the undertaking’s interpretation of the prohibition on data combination was wholly performative, a conclusion that became evident throughout the Q&A with the workshop’s participants.
In the end, the DMA’s effectiveness will also depend on the EC’s capacity to counteract the smokescreen compliance strategies of those gatekeepers that seek to undermine the regulation’s aims of speed and effectiveness.