Meta has been fined €405 million ($402 million) by the Irish Data Protection Commission for its handling of children’s privacy settings on Instagram, which violated Europe’s General Data Protection Regulation (GDPR). As Politico reports, it’s the second-largest fine to come out of Europe’s GDPR laws, and the third (and largest) fine levied against Meta by the regulator.
A spokesperson for the DPC confirmed the fine and said additional details about the decision would be available next week. The fine stems from the photo-sharing app’s privacy settings on accounts run by children. The DPC had been investigating Instagram over children’s use of business accounts, which made personal data like email addresses and phone numbers publicly visible. The investigation also covered Instagram’s policy of defaulting all new accounts, including those of teens, to be publicly viewable.
“This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private,” a Meta spokesperson told Politico in a statement. “Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them. We engaged fully with the DPC throughout their inquiry, and we’re carefully reviewing their final decision.”
The fine, which Meta could still appeal, comes as Instagram has faced intense scrutiny over its handling of child safety issues. The company halted work on an Instagram Kids app last year following a whistleblower’s claims that Meta ignored its own research indicating the app can have a negative impact on some teens’ mental health. Since then, the app has added more safety features, including making private the default setting on teen accounts.
Update: This story has been updated with additional details from the DPC.