Activists demand TikTok reform as platform fails to protect young users
Activists and rights groups say TikTok is failing teens and demand global reforms — including a ban on targeted ads and algorithmic safeguards. TikTok says it already applies teen protections worldwide.
The TikTok logo is displayed on a mobile phone. / AP
June 5, 2025

As TikTok faces renewed scrutiny over its impact on young users, activists and digital rights groups are demanding sweeping reforms to address what they call systemic failures in how the platform protects teenagers. 

Their demands go far beyond cosmetic fixes, calling for legal accountability, algorithmic transparency, and a rethinking of how digital spaces are designed for young people.

The urgency of these concerns has been amplified by the recent murder of 17-year-old Pakistani TikTok influencer Sana Yousaf, who was shot dead on her birthday in Islamabad. Her killing sparked outrage across social media, highlighting how young creators — especially women — often face not just online harassment but real-world dangers. 

Though the tragedy is not directly tied to TikTok’s platform design, it has added pressure on the company to address broader safety risks for vulnerable users.

“TikTok must shift from being addictive by design to embracing safety by design,” Teresa Barrio Traspaderne, Amnesty International’s campaigner on children and young people’s digital rights, told TRT World.

‘Hyper-personalisation is the problem’

At the core of the criticism is TikTok’s algorithm, which personalises content based on user behaviour — what they watch, how long they watch it, and what they engage with. 

Amnesty International’s research found that even automated accounts simulating 13-year-olds were served potentially harmful content — including videos that romanticise, normalise, or encourage depression, self-harm, and suicide — after just five to six hours on TikTok. 

Manual research indicated that within just three to 20 minutes, over half the recommended videos already focused on mental health struggles.

Amnesty is calling on TikTok to:

- ban targeted advertising for all users under 18, globally
- stop default personalisation of content for teens in its ‘For You’ feed, unless they explicitly opt in
- introduce safeguards to prevent algorithm-driven “rabbit holes” that repeatedly push harmful material
- conduct proper child rights impact assessments, which it has failed to do so far

“TikTok still hasn’t meaningfully addressed the core rabbit hole problem. The steps it has taken are mostly cosmetic,” said Traspaderne. She lamented that the company continues to rely on familiar tools like feed refresh options and screen time dashboards, without tackling the underlying issue of hyper-personalisation that drives engagement at the cost of well-being.

She also highlighted that TikTok still has not conducted a child rights impact assessment — a basic responsibility under international human rights standards — meaning it has yet to properly evaluate or address the risks its platform poses to children’s well-being. 

Technology experts echo these concerns. US-based digital strategist and journalist Bibhu Pattnaik, of the media and data technology platform Benzinga, emphasised the need for systemic, platform-level changes.

“Stronger default privacy settings for teenagers should be implemented by platforms such as TikTok, which should also limit algorithmic exposure to addictive or harmful content and regularly provide mental health prompts,” he told TRT World.

He added that TikTok ought to collaborate with specialists to create features that promote healthy screen habits and prioritise emotional well-being on a global scale.

Digital rights advocates: ‘safety by design, not profit by design’

Digital rights activist Nighat Dad, who leads Pakistan’s Digital Rights Foundation, agrees that TikTok — and similar platforms — need to overhaul how they approach safety, especially for young and marginalised users.

“TikTok must stop treating gender abuse as collateral damage,” she said. “Safety by design, especially for women and young users, should be a core principle — not a reactive fix.”

Dad emphasised the importance of:

- localised content moderation in native languages and cultural contexts
- real-time response systems for hate speech and harassment
- transparent algorithmic practices to understand how harmful content trends gain traction
- investment in regional trust and safety teams that understand on-the-ground realities

“There has to be a shift from reactive moderation to proactive, systemic protection,” she added.

What should governments do? 

While several countries — including Australia, France, and Switzerland — are pushing to ban or restrict social media access for younger teens, activists caution that blanket bans miss the point. 

According to Amnesty, such policies risk cutting young people off from vital digital spaces, without addressing the core harms.

“Banning social media simply shifts the burden onto young people, rather than holding companies accountable,” said Traspaderne, advising governments to introduce and enforce strong regulations that require safety-focused platform design, limit data harvesting from minors, and ban targeted ads to children.

Growing global momentum

Across social media platforms, users and advocates continue to voice concern. Hashtags like #SaferSocialMedia and #ProtectYoungUsers have gained traction, especially in the wake of tragic cases tied to TikTok use. Reddit threads and Twitter posts reveal widespread unease, not only about content exposure but also about how deeply the app shapes teenage identity, behaviour, and mental health.

Ethiopian digital rights advocate Eden Tadesse posted, “TikTok can be toxic and addictive for children and young people… The platform has taken action to respect the rights of young people in Europe, but it is not doing so in the rest of the world,” urging global users to support ongoing reform efforts.

In the aftermath of Sana Yousaf’s killing, Pakistani user Shehroze, who analysed TikTok content, wrote: “There are hundreds of Sana Yousufs out there, who get bullied, stalked, harassed… Authorities and content moderators fail them every day by turning a blind eye to their pleas.” 

Prominent commentator Ayesha Ijaz Khan also weighed in, warning of the risks young children face on platforms like TikTok: “I see young children with smartphones… even before they are allowed to [be on social media], and I can’t help but think of all that could go wrong.”

Similarly, digital rights group Digital Rights Foundation (DRF) condemned the spread of gendered hate speech and misinformation following Sana’s death, calling for platforms to act decisively against online abuse.

For many activists, TikTok is just one example of a broader pattern — where engagement-driven business models conflict directly with the rights and safety of young users.

“Tech companies have had years to fix these problems,” said Dad. “If they won’t act voluntarily, then governments must step in — because children’s lives and futures are at stake.”

Recent studies back up these concerns. Research by the Center for Countering Digital Hate, a British-American not-for-profit organisation, found that TikTok's algorithm pushes harmful content — like videos promoting eating disorders or self-harm — to teens as often as every 39 seconds. Another 2023 study conducted in Greece linked frequent TikTok use to rising anxiety, depression, and poor sleep in adolescents, with girls especially vulnerable. Experts warn the platform is becoming a digital “pressure cooker” for young users already struggling with mental health.

Despite this, TikTok's popularity continues to soar. In 2024, the platform added approximately 100 million new users, bringing its total to around 1.6 billion active users worldwide.

TRT World reached out to TikTok for comment.

In response, TikTok said it offers a “much more restrictive” experience for teens compared to adults, including global safeguards and educational tools. However, the company did not directly address specific concerns raised by rights groups — such as calls for a global ban on targeted advertising to minors or an independent child rights impact assessment. TikTok referred to its youth safety and well-being measures for details on existing policies.

SOURCE:TRT World