The exchange of child sexual abuse material on TikTok is under investigation by U.S. federal authorities.
According to the Financial Times, the U.S. Department of Homeland Security (DHS) is investigating how TikTok handles child sexual abuse material on its platform. In addition, the U.S. Department of Justice (DoJ) is examining how criminals exploited one of TikTok's privacy features.
The DHS investigation began after a child safety researcher reported to TikTok that child sexual abuse material was rampant on the platform. Predators gravitate toward TikTok because of its large base of young users.
Although TikTok employs 100,000 content moderators worldwide, the moderation team struggles to keep up with the volume of video content. From 2019 to 2021, the number of investigations into child abuse cases involving TikTok increased sevenfold.
The DoJ, meanwhile, is investigating how bad actors abused TikTok's "Only Me" privacy feature. Law enforcement found that child sexual abuse material (CSAM) was being distributed even through private, personal accounts.
The Financial Times reported that offenders traded CSAM videos through individual accounts by sharing account passwords. They would post the content with the "Only Me" setting enabled, so that only the account holder could view it, and then share or sell the account credentials to others.
A TikTok representative said the company has cooperated with authorities on the matter and has removed accounts and content linked to CSAM.
"TikTok has zero tolerance for child sexual abuse material. When we detect any attempt to post, obtain or distribute CSAM, we remove the content, ban accounts and devices, immediately report to the National Center for Missing & Exploited Children (NCMEC), and engage with law enforcement as necessary," TikTok said in a statement.
Neither the DHS nor the DoJ has concluded its investigation, and any potential penalties have not been made public. The Financial Times said it was unclear how these latest investigations would affect TikTok's operations.