On December 12, 2025, the Digital Markets Act (“DMA”) High-Level Group (“HLG”)[1] endorsed a joint paper on the regulatory interplay on AI-related issues.[2] This paper assesses how best to implement the different legal frameworks that govern AI systems. It underlines the importance of a coherent and consistent implementation of these frameworks, and of cooperation between competent authorities to achieve it.
I. Contestability and fairness in the AI value chain
In this paper, the HLG first highlighted that the DMA could contribute to ensuring that the AI value chain remains contestable and fair insofar as it could apply to gatekeepers when they deploy AI technologies in their respective services and ecosystems.[3] However, the HLG also noted that there is still a “need to clarify to what extent, AI services are covered by the DMA in its current form, and how the current set of obligations would apply to such services.”
In this context, the HLG identified two main areas where the DMA could contribute to ensuring a fair and contestable AI value chain:
- Access to AI infrastructure and distribution:
The HLG noted that AI services rely heavily on cloud computing infrastructures, which it asserts are concentrated among very few players. In addition, gatekeepers are rapidly integrating AI features, such as AI assistants and chatbots, which increasingly act as gateways for digital services to reach end users.
The HLG pointed out that this increasing level of integration could constitute a significant obstacle for third parties seeking to offer their own AI services. The DMA could be used to ensure that gatekeepers do not self-preference their own services, lock users in by limiting effective interoperability, or harm online content providers in relation to their traffic and visibility.
- Access to Data:
The HLG then highlighted the importance of data in the AI value chain, in particular user interaction data. The HLG explained that gatekeepers often have exclusive partnerships and access to first-party data, which is typically personal, proprietary, and not available to third parties, while also being subject to data protection legislation that governs how it can be processed. This access to data can create an “unmatchable” data-driven advantage over third parties, which is “exacerbated” by “network effects and feedback loops”, according to the HLG.
These two areas are at the heart of the ongoing review of the DMA, for which the EC ran a public consultation earlier this year. The announced goal of this review is to “identify potential areas for improvement” and “consider emerging challenges,” including the deployment of AI-powered services by gatekeepers.[4]
As a preliminary comment on the results of this consultation, Antoine Babinet (Deputy Head of Unit at DG COMP) stated during a hearing before the European Parliament’s Internal Market Committee on December 4, 2025, that “Cloud and AI are the big topics.” He explained that “access to data is key,” and that many respondents to the consultation believe the DMA already contains provisions that can play a role in that regard. He also mentioned that cloud services were a recurrent topic in the responses because “cloud is really a key infrastructure to provide AI services.”
In line with these considerations, in November 2025 the EC opened three market investigations under the DMA into cloud computing services: two will assess whether Amazon Web Services and Microsoft Azure should be designated as CPSs under the DMA. The third is more general, in that it will assess whether the DMA can effectively address practices that could limit the competitiveness and fairness of the EU cloud sector.[5]
Interestingly, the EC did not open any market investigation into the AI sector under the DMA. This may be for several reasons. First, contrary to cloud computing services, AI systems are not a defined category of CPS in the DMA. Relatedly, there are many difficulties in delineating what could even constitute an AI service deserving of its own CPS category – arguably, AI is a technology that can be integrated into a myriad of digital services rather than a service in and of itself. Second, there are also uncertainties around whether and how the DMA can govern AI systems – for instance, whether and when chatbots can be defined as search engines under the DMA[6] and, as such, benefit from certain provisions such as Art. 6(11) DMA.[7]
II. Regulatory interplay and enforcement fragmentation
The second part of the paper assesses the interplay between the different legal frameworks that govern AI-related issues,[8] and the enforcement gaps that exist in that regard. The HLG identified several areas of interplay, such as the regulation of data portability, dark patterns, and transparency (rules on the provision of services; algorithmic explainability; advertising; and editorial independence). In relation to AI integration, the HLG highlighted the interplay between the DMA and competition law, which can each govern different aspects of AI systems. Indeed, the Commission recently showed a strong willingness to apply general competition law rules to the AI sector by opening two formal investigations under Art. 102 TFEU into Meta and Alphabet, respectively, for AI-related practices.[9]
Because the applicable legal frameworks have different scopes and objectives, but may govern the same market participants and similar conduct, inconsistencies arise in their implementation. In addition, different authorities at both the EU and national levels are in charge of implementing these rules (sometimes with several authorities in charge of implementing the same set of rules). This leads to enforcement fragmentation and, with it, risks of inconsistent decisions and rulings. All this creates significant legal uncertainty and unnecessary overhead for companies subject to these legal frameworks, which in turn can stifle innovation and be counterproductive.
Aware of these issues, the HLG listed the following actions as being the most pressing to achieve regulatory consistency and coherent compliance, and address enforcement gaps:
- Align the requirements of different regulations (e.g., to allow companies to provide all information required under various transparency obligations while keeping the volume of information manageable).
- Ensure coordination between authorities to achieve a more coherent and effective enforcement, and avoid contradictions (e.g., between the DMA’s goal to reduce information asymmetry and the GDPR’s principle of data minimization).
- Consider adopting coherent terminology and operational definitions across legal instruments.
The paper notes that further work is needed to develop a common understanding among competent authorities of AI-related issues and ensure that their enforcement practice reflects it. In line with the general EU principle of sincere cooperation, the DMA HLG presents itself as a forum “allowing enhanced coordination and cooperation between the relevant regulatory bodies” that compose it.
This joint paper is one of several reports and regulatory initiatives that assess the overlaps and interplays between different regulations of the digital sector and strive to find solutions to ensure their coherent implementation. For instance:
- The Commission recently issued a report on the interaction of the DSA with other legal instruments (including the DMA, AI Act, UCPD, GDPR, and many others)[10] as well as draft guidelines (adopted jointly with the EDPB) on the interplay between the DMA and GDPR.[11] Draft joint Commission and EDPB guidelines on the interplay between the AI Act and GDPR are expected to follow soon.
- The European Parliament’s Committee on Industry, Research and Energy commissioned a study on the interplay between the AI Act and other EU digital regulations (including the GDPR, Data Act, DMA, and DSA).[12]
- The Commission presented in November 2025 its Digital “Omnibus” package, which aims to simplify the existing rules on AI, cybersecurity, and data, and which is part of a broader legislative project to simplify and consolidate the EU digital rulebook.[13]
All this testifies to the complexity of the legal frameworks that apply to the digital sector in Europe – a complexity that is liable to create obstacles not only for the companies subject to those frameworks but also for those who should be benefiting from them.
III. SME support
Lastly, the HLG highlighted in its joint paper the importance of supporting small and medium-sized enterprises (“SMEs”) to ensure that they can benefit from the DMA, enjoy legal certainty, and not face overly burdensome compliance barriers in relation to AI development.
To this end, the HLG found it critical to inform SMEs of the opportunities created by the DMA (in relation to, e.g., interoperability and data-related obligations). The HLG indicated that the DMA team has already started reaching out to SMEs active in AI and has received “positive feedback” from this outreach.
The same conclusions and actions could apply to companies active in other spheres of the digital sector that could equally benefit from different provisions of the vast EU digital legal framework. Far from only creating legal burdens, EU digital regulations can also create many opportunities for SMEs and other companies, in particular those that are not subject to the heaviest legal obligations under, for instance, the DSA or DMA.
Conclusion
This joint paper from the HLG demonstrates that the Commission and other regulators are aware of the administrative and regulatory burdens that the multiple legal frameworks applicable to the digital sector can create. These burdens are liable to create obstacles for companies to operate properly – both those that are subject to these legal frameworks and those supposed to be benefiting from them – and as such run counter to the very objectives of these regulations.
This joint paper, together with other similar reports and studies recently issued, is a good step toward identifying the issues that providers of online services, AI-related and beyond, face in navigating the EU legal landscape and operating in Europe. But it remains to be seen how and to what extent the legislator and competent authorities will work together to address these issues.
[1] The HLG is composed of representatives from the Body of European Regulators for Electronic Communications (BEREC); the European Data Protection Supervisor and European Data Protection Board (EDPS and EDPB); the European Competition Network (ECN); the Consumer Protection Cooperation Network (CPC Network); and the European Regulators Group for Audiovisual Media Services (ERGA). Its missions include providing advice and expertise to ensure that the DMA and other sectoral regulations applicable to online platforms are implemented in a coherent and complementary manner.
[2] See publication on the Fifth meeting of the Digital Markets Act High-Level Group here.
[3] Gatekeepers are the providers of online services that are subject to the DMA insofar as they have a significant impact on the internal market, enjoy an entrenched and durable position in their operations, and provide core platform services (“CPS”) that have been designated as such by the Commission. CPSs are services that constitute an important gateway for business users to reach end users and that meet the turnover and user thresholds defined in the DMA.
[4] See Commission’s announcement of the public consultation on the review of the DMA, available here.
[5] See Commission’s announcement of its market investigations on cloud computing services under the DMA, available here.
[6] The Commission is currently actively assessing whether to designate ChatGPT as a “very large online search engine” under the DSA after ChatGPT’s user count passed the threshold for such designation. See Mlex article “EU lawmakers quiz commission over ChatGPT designation under platform rulebook”, December 5, 2025, available here.
[7] Art. 6(11) DMA addresses the information asymmetry between gatekeepers and third parties by requiring search engines designated as CPSs to share search data (ranking, query, click, and view data) with competing search engines on fair terms, subject to any personal data being effectively anonymized.
[8] The different legal frameworks assessed by the HLG include the DMA, AI Act, Data Act, General Data Protection Regulation (“GDPR”), Digital Services Act (“DSA”), and Unfair Commercial Practices Directive (“UCPD”), but also competition law and consumer protection law more generally.
[9] On December 4, 2025, the Commission opened a formal investigation into Meta’s policy restricting AI providers’ access to WhatsApp (see Commission’s press release here). Shortly after, on December 9, 2025, the Commission opened a formal investigation into Alphabet for using content of web publishers and content uploaded to YouTube for AI purposes (see Commission’s press release here).
[10] See Commission Report on application of Article 33 of Regulation (EU) 2022/2065 (DSA) and the interaction of that Regulation with other legal acts, November 17, 2025 available here.
[11] See Commission and EDPB draft joint guidelines on the interplay between the DMA and GDPR, on which the Commission opened a consultation in October 2025, available here.
[12] See Study requested by the EP ITRE Committee on the Interplay between the AI Act and the EU digital legislative framework, October 2025, available here.
[13] See Commission’s press release on simpler EU digital rules and new digital wallets to save billions for businesses and boost innovation, November 19, 2025, available here. See also Cleary AI and Technology Insights, “Reset or rollback: Unpacking the EU’s Digital Omnibus Package,” November 21, 2025, available here.
