
The European Commission has preliminarily found that TikTok’s design features may violate the Digital Services Act (DSA), citing concerns over
addictive elements such as infinite scroll, autoplay, push notifications, and its highly personalized content recommendations.
TikTok’s design under scrutiny
The Commission’s initial findings suggest TikTok has not sufficiently assessed the potential harm its platform may pose to users’ physical and mental wellbeing, particularly minors and vulnerable adults. Features that constantly reward users with new content may encourage compulsive scrolling, shifting users into “autopilot mode.” Research indicates such patterns can reduce self-control and contribute to addictive behavior.
TikTok reportedly overlooked warning signs, including how long minors use the app at night, the frequency of daily logins, and other indicators of compulsive use.
Measures to reduce risk fall short
According to the Commission, TikTok’s current risk-mitigation measures—such as screen time management tools and parental controls—are insufficient. The tools are easy to bypass, and parental controls demand extra effort and knowledge from parents, limiting their effectiveness.
The Commission emphasized that meaningful changes to TikTok’s core design are necessary. Suggested adjustments include limiting infinite scroll, introducing mandatory screen time breaks (especially at night), and modifying its recommendation algorithms to reduce addictive patterns.
These preliminary findings are part of a broader investigation and do not determine the final outcome.
Next steps in the investigation
TikTok now has the opportunity to respond, examine investigation files, and submit written defenses. The European Board for Digital Services will also be consulted.
If confirmed, non-compliance could result in fines of up to 6% of TikTok’s global annual turnover, depending on the severity and duration of any infringement.
Background
The investigation into TikTok’s compliance with the DSA began on 19 February 2024 and examines multiple concerns, including the so-called “rabbit hole effect” of its recommendation system, age-appropriate content risks for minors, and platform obligations for user privacy, safety, and security.
Previous aspects of the inquiry, such as access to research data and advertising transparency, were addressed through preliminary findings in October 2025 and binding commitments in December 2025, respectively.
Photo by Solen Feyissa, Wikimedia Commons.
