TikTok is adding a number of new tools to its suite of parental controls, enabling caregivers to help protect their kids while they’re tikking and tokking. The new features include the ability to block specific search terms, users and hashtags, as well as to limit who can comment on a teen’s clips. More importantly, adults can limit the discoverability of that content, restricting it to friends rather than the wider world. Rounding out the list is the ability to prevent other people from seeing which videos the younger user has liked.
The social platform launched its parental controls earlier this year under the banner of Family Pairing, which links a teen user’s account to an adult’s. The first set of controls included limits on time spent in the app and the option to disable direct messages through the platform. And if a user is under 16, direct messages and certain other features are deactivated by default, as part of the company’s commitment to guarding against child exploitation.
Part of this is down to TikTok’s grand mea culpa after being fined $5.7 million in 2019 by the FTC for violating the Children’s Online Privacy Protection Rule, or COPPA. At the time, authorities said that the platform had collected data from children under the age of 13 and let them broadcast their location. Earlier this year, the company also drew criticism from a number of advocacy groups, which said it hadn’t done enough to protect underage users.