A campaign group claims TikTok’s algorithm pushes pornography and sexualised clips to young users. Researchers created fake child accounts, enabled safety settings, and still received explicit search prompts. These led to videos showing simulated masturbation and explicit pornography. TikTok says it took immediate action once informed and insists it remains committed to children’s online safety.
Fake accounts expose harmful material
In July and August, researchers from Global Witness built four TikTok profiles, posing as 13-year-olds by entering false birth dates. The platform did not ask for further identity checks. Investigators activated TikTok’s “restricted mode”, which the company promotes as a filter for mature or sexual themes. Despite that, the accounts received sexualised prompts under “you may like”. These suggestions led to videos of women flashing underwear, exposing breasts and simulating masturbation. At the most extreme, investigators found explicit pornographic footage embedded in seemingly innocent clips.
Global Witness voices concern
Ava Lee from Global Witness described the results as a “huge shock”. She argued TikTok not only fails to block harmful content but also recommends it to minors. Global Witness normally investigates the influence of technology on human rights, democracy and climate change. The organisation first stumbled upon TikTok’s explicit content during unrelated research in April.
TikTok highlights safety measures
Researchers informed TikTok earlier this year. The company said it deleted the flagged content and introduced fixes. But when Global Witness repeated its test in late July, sexual videos appeared again. TikTok insists it offers more than 50 protective tools for teenagers and claims nine out of ten violating clips are removed before anyone watches them. After the latest report, TikTok said it had improved its search functions and removed further harmful content.
New rules demand stricter safeguards
On 25 July, the Children’s Codes under the Online Safety Act came into force. These rules require platforms to carry out robust age checks and prevent children from accessing pornography. Algorithms must also block harmful material linked to suicide, eating disorders and self-harm. Global Witness conducted its second study after the rules took effect. Ava Lee urged regulators to step in, saying children’s online protection must now be enforced.
Users question recommendations
During the investigation, researchers observed user reactions. Some complained about sexualised search suggestions. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”
