An investigation found that TikTok's recommendation algorithm, which determines the videos viewers see, promotes sexual content, drugs, and alcohol to children as young as 13. As part of the Wall Street Journal investigation, a ‘13-year-old user’ account on the China-based social networking platform searched for ‘onlyfans’ and watched a handful of videos, including two advertising erotica. When the same young user opened the ‘For You’ page, TikTok's equivalent of the Twitter feed, it was presented with a succession of sexually themed videos.
The content displayed on the ‘For You’ page is based on previous searches, as well as the types of content viewed most often or watched the longest. The longer the user lingered on sexual content, the more of it appeared on the ‘For You’ page, despite the age listed in the user's profile. TikTok stated that it does not currently distinguish between videos served to adult and child accounts, but that it is working on a new filtering mechanism for younger accounts.
One of the accounts, registered as belonging to a 13-year-old, was shown 569 videos about drug use, including references to cocaine and meth addiction, as well as promotional videos for online drug sales. The WSJ accounts were also shown over 100 videos promoting pornography sites and sex shops, posted by accounts labelled as adults only.