TikTok Allegedly Leads Child Accounts to Explicit Material Within a Few Clicks
According to a recent investigation, the widely used social media app directs minors' profiles to pornographic content within a small number of clicks.
Testing Approach
An advocacy group set up fake accounts using the birthdate of a 13-year-old and activated the app's "restricted mode", which is meant to limit exposure to adult-oriented content.
The researchers found that TikTok recommended inappropriate and adult-themed search terms to the seven test accounts, which were set up on unused smartphones with no search history.
Alarming Recommendation Features
Search phrases proposed under the "recommended for you" feature included "extremely revealing clothing" and "explicit content featuring women", before escalating to terms such as "hardcore pawn [sic] clips".
For three of the accounts, the inappropriate search terms appeared immediately.
Quick Path to Pornography
After a "small number of clicks", the study team came across explicit material ranging from exposure to penetrative sex.
Global Witness claimed that the content was designed to evade moderation filters, usually by embedding the explicit footage within an otherwise benign image or video.
For one profile, reaching the material took two clicks after logging in: one on the search function and a second on the suggested search term.
Legal Framework
Global Witness, whose mandate includes investigating technology companies' influence on public safety, reported carrying out several rounds of testing.
One round took place before child safety measures under the British online safety legislation came into force on 25 July; further tests were run after the rules took effect.
Alarming Results
Investigators noted that multiple clips featured someone who appeared to be a minor, and said the material had been reported to the Internet Watch Foundation, which tracks exploitative content.
The campaign group alleged that the social media app was in breach of the online safety law, which requires social media firms to prevent children from viewing harmful content such as pornography.
Regulator's Position
A spokesperson for the UK communications regulator, which is charged with enforcing the legislation, stated: "We acknowledge the effort behind this study and will examine its conclusions."
Official guidance on complying with the act states that digital platforms posing a medium or high risk of presenting inappropriate videos must "configure their algorithms to remove dangerous material from minors' content streams".
The platform's rules ban explicit material.
Platform Response
The social media company said that, after being notified by the organization, it had deleted the violating content and made changes to its search suggestion feature.
"Immediately after notification" of these claims, we took immediate action to investigate them, delete material that contravened our rules, and implement enhancements to our search suggestion feature," commented a spokesperson.