It's no secret that TikTok is popular among young people. As of 2021, Statista estimates that 25 percent of the app's users fall into the 10-to-19 age bracket. They're the platform's lifeblood, creating some of its most memorable videos. So it makes sense that TikTok is taking additional steps to protect those users. On Wednesday, the company said it had joined the Technology Coalition, an organization that includes the likes of Apple and Google and works to protect children from online sexual exploitation and abuse.
"We hope to deepen our evidence-based approach to intervention and contribute our unique learnings from addressing child safety and exploitation," TikTok says of its decision to join the Technology Coalition. As part of the move, the company will sit on the organization's board and serve on several committees devoted to advancing protections for children. Partnering with outside experts has been TikTok's primary approach to addressing problems like online child abuse. As just one recent example, the company announced earlier this year that it would work with the National Eating Disorders Association to promote body inclusivity.
With today's announcement, TikTok also pointed to some of the safeguards it already has in place for minors. By default, the accounts of users between the ages of 13 and 15 are set to private, and only people aged 16 and older can use the app's livestreaming and direct messaging features. The company introduced many of those measures after the US Federal Trade Commission fined it $5.7 million in 2019, with the agency saying at the time that TikTok had failed to properly handle children's data. "There is no finish line when it comes to protecting the TikTok community," the company said. "We work each day to learn, adapt, and strengthen our policies and practices to keep our community safe, and we look forward to building on all of these efforts through our partnership with the Technology Coalition."