The TikTok app logo can be seen in this photo illustration taken on November 18, 2024 in Warsaw, Poland.
Getty Images
As governments around the world seek to protect children from the harms of social media, the European Union plans to crack down on social media companies and target the “addictive design” features of TikTok and Instagram.
European Commission President Ursula von der Leyen said Tuesday at the European Summit on Artificial Intelligence and Children in Denmark that the region will take action against certain features on social media platforms by the end of the year.
CNBC has reached out to ByteDance and Meta for comment.
“We are taking action against TikTok and its addictive design: endless scrolling, autoplay, push notifications. The same applies to Meta, as we believe Instagram and Facebook are not enforcing their own minimum age of 13,” von der Leyen said.
“We are investigating platforms where children can go down ‘rabbit holes’ of harmful content, including videos promoting eating disorders and self-harm,” she added.
Von der Leyen said the EU’s executive branch has also developed its own age verification app with “the highest privacy standards in the world.”
Member states will soon be able to integrate this into their digital wallets, making it easy to enforce on online platforms. “There are no more excuses. Age verification technology is available,” the EU chief said.
The European Commission could prepare a legal proposal as early as the summer as it awaits the advice and findings of the Special Committee of Experts on Online Child Safety.
U.S. pushback
The EU has cracked down on American Big Tech companies over the past year, enacting legislation aimed at making tech giants more accountable. The hefty fines have drawn criticism from U.S. officials, who have warned that the bloc risks missing out on participation in the AI economy.
U.S. President Donald Trump has pushed back against EU fines on U.S. companies totaling more than $7 billion over the past two years.
Apple, Meta and Google are among the companies that have been fined for violations of the bloc’s antitrust and competition rules, which they are appealing.
In February, Trump signed a memorandum directing his administration to consider imposing tariffs “to combat digital services taxes (DSTs), fines, practices, and policies that foreign governments levy on American companies.”
Earlier this year, the European Commission launched an investigation into Elon Musk’s X (formerly Twitter) over the spread of non-consensual sexually explicit content of women and children generated by the company’s Grok chatbot.

The increased legal scrutiny over child safety on social media platforms comes after Meta and YouTube lost a high-profile court ruling in the United States in March that found design features such as infinite scrolling and autoplay were contributing to addiction and mental health problems among teenagers.
Most recently, the European Commission found that Meta violated EU digital services law by failing to keep under-13s off its platform, and a preliminary investigation found that minors could easily evade checks.
Meanwhile, social media bans for under-16s have gained attention among governments around the world, with Australia becoming the first country to impose a blanket ban in December. Several European countries, including Spain, France and the U.K., have proposed their own legislation to keep children off social media.
