TikTok is under investigation in the UK over how it handles the safety and privacy of young users. UK Information Commissioner Elizabeth Denham told a parliamentary committee on Tuesday that the popular short-form video app may have violated GDPR provisions requiring technology companies to provide special rules and protections for children, The Guardian reported. The UK opened its probe into TikTok back in February, shortly after the FTC fined the app for child privacy violations.
Among the concerns Denham listed was TikTok's open messaging platform, which allows adult users to freely communicate with children. "We are looking at the transparency tools for children," Denham told the committee. "We're looking at the messaging system, which is completely open, we're looking at the kind of videos that are collected and shared by children online. We do have an active investigation into TikTok right now, so watch this space."
TikTok, which is owned by Chinese company ByteDance, is facing criticism worldwide over how it prioritizes the safety of its core audience of mostly teenage users. A report from the British charity Barnardo's revealed that predators were targeting users as young as eight with sexually explicit messages. While the app requires users to be 13 years of age or older in order to post content, many young users can easily bypass that age gate. The app was briefly banned in India and Indonesia over concerns that it failed to protect its young user base from predators and abusers. A BuzzFeed investigation revealed that TikTok's failure to address sexual predators on its platform has led users to take action on their own, creating call-out videos and flocking to other platforms to warn others about abusers.
If the UK finds TikTok at fault, the company could face a fine of up to €20 million (£17.9 million, or about $22.6 million) or 4 percent of its annual global turnover, whichever is greater. Engadget has reached out to TikTok for comment on the investigation and will update this story when we hear back.