Brussels — The European Commission is facing intensified calls from lawmakers, digital rights groups, and EU member states to invoke strict digital regulations against TikTok for alleged systemic non-compliance with platform safety and transparency requirements. The renewed pressure centers on concerns including child protection, the spread of disinformation, opaque recommender systems, and failure to meet obligations under the Digital Services Act (DSA), the European Union’s landmark platform law, fully applicable since 2024.

Calls for Enforcement and Investigations Escalate
Over recent weeks, several MEPs and data protection advocates have publicly criticized TikTok’s handling of youth-directed content, weak age verification, and algorithm-driven amplification of harmful material. Representatives from multiple EU capitals have urged the Commission to take decisive action, including initiating formal investigations, imposing fines, and exploring interim measures under the DSA’s enforcement toolkit for Very Large Online Platforms (VLOPs).
A coalition of digital safety NGOs wrote to the Commission:
“TikTok’s ongoing failure to safeguard children and mitigate algorithmic risks warrants full DSA enforcement, including sanctions and corrective orders.”
Legal Basis: The Digital Services Act and Platform Accountability
The Digital Services Act, fully in force across the EU since 2024, imposes tiered obligations on online intermediaries, with the most stringent duties reserved for platforms designated as Very Large Online Platforms (VLOPs). As a service with more than 45 million average monthly active users in the European Union, TikTok falls into this category, triggering heightened responsibilities including:
- Conducting systemic risk assessments relating to content and algorithmic amplification
- Mitigating systemic risks to minors, fundamental rights, and civic discourse
- Providing regulators with access to internal data, logs, and recommender parameters
- Maintaining transparency reports and independent audit results
- Cooperating with national authorities on compliance enforcement
Critics argue that TikTok’s reported practices, especially its algorithmic promotion of addictive short-form video content and insufficient safeguards for young users, fall short of these requirements.
Regulatory Mechanisms at the Commission’s Disposal
Under the DSA, the Commission may pursue a range of enforcement tools tailored to address systemic non-compliance by VLOPs, including:
- Formal investigations accompanied by data access and compulsory reporting
- Corrective orders requiring specific fixes to recommender system design
- Interim measures where imminent harm is identified
- Fines of up to 6% of global annual turnover for confirmed breaches
- Independent audits to verify compliance
- In extreme cases, temporary restrictions on platform features within the EU
Legal analysts emphasize that the Commission has rarely used the full suite of DSA enforcement mechanisms, making any significant action against TikTok a potentially precedent-setting move.
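To illustrate the scale of the turnover-based penalty cap listed above, the following sketch computes the fine ceiling under the DSA's 6% rule. The turnover figure used here is entirely hypothetical, chosen only to show the arithmetic, and is not TikTok's actual revenue.

```python
# Illustrative calculation of the DSA fine ceiling: up to 6% of a
# platform's worldwide annual turnover for confirmed breaches.
# All revenue figures below are hypothetical.

DSA_FINE_CAP_RATE = 0.06  # 6% ceiling under the DSA

def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Return the maximum fine the Commission could impose under the DSA cap."""
    return global_annual_turnover_eur * DSA_FINE_CAP_RATE

# Hypothetical example: a platform with EUR 20 billion in global annual turnover
turnover = 20_000_000_000
print(f"Maximum DSA fine: EUR {max_dsa_fine(turnover):,.0f}")
# prints: Maximum DSA fine: EUR 1,200,000,000
```

Even under conservative revenue assumptions, the 6% ceiling places potential penalties in the billions of euros, which is why observers describe turnover-based fines as the DSA's most consequential enforcement tool.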
Member State Voices Amplify Pressure
Several EU member states, including France, Germany, and Spain, have publicly backed tougher enforcement. Health ministries and parliamentary committees in these capitals have cited concerns about:
- Rising rates of digital addiction among minors
- Spread of extremist or self-harm content
- Insufficient content moderation transparency
- Lack of parental consent mechanisms
Officials argue that TikTok’s global content policies must be adapted to meet EU legal standards, rather than relying on regional labeling or age gating alone.
TikTok’s Response and Legal Position
TikTok has asserted its commitment to child protection and regulatory compliance. In recent statements, the company has noted:
- Implementation of AI-driven age estimation tools
- Default safety settings for under-18 users
- Expanded human moderation teams
- Localized transparency reports for EU audiences
However, critics, including EU regulators, maintain that these measures are insufficiently documented, inadequately enforced, and fail to satisfy DSA risk-mitigation criteria.
A TikTok spokesperson said:
“We engage constructively with EU authorities and remain committed to meeting regulatory requirements while respecting user rights and platform innovation.”
Legal Debate: Algorithmic Transparency and Fundamental Rights
At the core of the DSA enforcement debate is the question of algorithmic accountability. The law treats recommender systems as intrinsic to platform risk architecture, requiring VLOPs to identify, assess, and mitigate foreseeable harms resulting from algorithmic amplification, especially where it intersects with children’s exposure and political content.
Legal scholars note that TikTok’s opaque recommendation logic may violate DSA provisions if regulators determine that:
- The platform failed to conduct adequate systemic risk assessments
- Mitigation measures were not effective and proportionate
- Transparency obligations were unfulfilled
This legal framing positions TikTok’s compliance debate not merely as a matter of youth safety or content moderation, but as a systemic digital governance obligation with enforceable consequences.
Possible Outcomes and Sanctions
Should the Commission proceed with formal enforcement under the DSA, possible outcomes include:
- Fines potentially amounting to hundreds of millions of euros or more, calculated against global annual turnover
- Mandatory redesign of algorithmic features
- Binding corrective orders with compliance deadlines
- Independent audits and follow-up reporting requirements
- Interim restrictions on specific platform capabilities within the EU
In the most severe scenario, regulators could impose feature suspensions until compliance is demonstrated, a step rarely taken but legally available under the DSA.
Broader Implications for Platform Regulation
A strong enforcement action against TikTok would signal a shift toward active oversight and a willingness to hold global platforms accountable for digital harms under EU law. It would also clarify how the DSA’s risk-mitigation and transparency duties apply in practice, particularly for algorithm-driven services with major youth user bases.
Industry observers note that other platforms with similar algorithmic architectures may also face intensified scrutiny if the Commission moves decisively. As pressure mounts on the European Commission to act, the legal dispute over TikTok’s compliance with digital safety norms underscores a larger trend: sovereign regulators are asserting authority over platform governance with unprecedented scope.
Whether the Commission invokes the full weight of the Digital Services Act remains to be seen, but the current momentum suggests that TikTok may soon face its most rigorous legal challenge in Europe to date, one that could reshape how social media platforms operate within the EU.
Timeline: European Enforcement Pressure on TikTok Under Digital Law
April 2024
European Commission finalizes the Digital Services Act (DSA) enforcement framework, outlining responsibilities and penalties for Very Large Online Platforms (VLOPs) including algorithmic risk mitigation and transparency obligations.
June 2024
EU digital safety regulators send early compliance notices to algorithm-driven platforms, including TikTok, emphasizing child protection, harmful content controls, and transparency duties under the DSA.
September 2024
France’s digital safety authority launches a preliminary inquiry into TikTok’s age-verification processes and content moderation gaps affecting minors.
November 2024
German data protection regulators issue notices questioning TikTok’s handling of user metadata and cross-border data transfers for EU users.
January 2025
European Parliament passes resolutions urging stricter enforcement of platform accountability standards, singling out short-form video platforms with high youth engagement, including TikTok.
March 2025
EU consumer and youth advocacy groups deliver formal petitions to the European Commission calling for investigations into TikTok’s alleged systemic compliance failures on DSA standards.
May 2025
The Commission requests detailed systemic-risk assessment documentation from TikTok relating to recommender system harms, algorithmic transparency, and mitigation measures.
August 2025
Preliminary regulatory findings indicate gaps in TikTok’s compliance with content moderation thresholds, particularly for content associated with self-harm, misinformation, and hate speech.
October 2025
Spanish data protection authorities coordinate with EU regulators to raise concerns about TikTok’s privacy controls and age-verification accuracy, increasing inter-agency scrutiny.
December 2025
EU regulators issue formal compliance deadline notices, signaling that systemic risk obligations and transparency failures must be remedied or enforcement actions may follow.
January 2026
Lawmakers and digital-rights coalitions intensify public calls for the European Commission to initiate DSA enforcement proceedings against TikTok for continued non-compliance.
February 2026
Commission officials confirm they are reviewing evidence and legal arguments for formal enforcement actions, including fines, corrective orders, and possible interim restrictions under DSA authorities.
Early 2026 (Ongoing)
Legal preparations by digital-safety coalitions and national regulators continue, in anticipation of formal enforcement measures and potential court challenges once the Commission acts.
