"It's weird to think that less than 100 years ago, less than 10 percent of households in the UK had electricity," Harkins explains. "But once it hit a tipping point, it just took off." The comparison draws a parallel between the spread of electricity and the growth of the internet -- and, by extension, consumer comfort with big data: Most of today's consumers are either uncomfortable sharing their data online or unaware of how that data is being used. He's not talking off the cuff, either: According to a recent survey Intel conducted, 65 percent of device owners aren't sure who has access to their data or how it is used, and a staggering 84 percent simply assume by default that data is collected without their knowledge and sold to third parties.
Although many of those surveyed weren't comfortable sharing their data with large companies or research institutions, Harkins admits that the results are clearly split along generational lines. Older generations were cautious about sharing medical information that would be "used to benefit society," but millennials seemed more willing to dismiss those concerns. "They [couldn't] care less if their medical records were open to the world," he explains -- the youth are more concerned with protecting their text messages and photos than their medical or purchase history. "Still, we have a lot of people -- 45 percent -- who are willing to share data if it will benefit society," he says, "but it has to be anonymized." People are willing to help, so long as they can keep their privacy.
We know, of course, that consumers have grown cautious about privacy and data security -- we now have smartphones designed specifically to protect our info, and the actions of major corporations and government entities now live under the microscope -- but Intel argues that this distrust is hurting the progress of the so-called third industrial revolution. Companies need to find a way to overcome privacy concerns and create an environment where the public can trust that their data is protected and won't be misused. If they can, we can build services that leverage the power of open, anonymized data to personalize education, help diagnose and cure diseases faster or simply find a parking spot. Achieving this means changing our entire perspective on how we deal with privacy. "Privacy needs to be more than a check-the-box function," Harkins says. "It needs to be a true business function."
Privacy isn't something we can tack on, he explains; it's something we need to build from the ground up. He uses the commercial racing industry as an example. "Think of race cars -- they achieve such speed and they take enormous risks, but how can they do that?" They're designed to, he says -- everything from the layout of the racetrack to the way drivers are trained to how pit crews interact with vehicles was designed for speed and safety. As a result, spectators don't worry about their safety at a race because they trust that the system was built to protect them. Companies hoping to deal in big data have to do the same thing: Build trust by building their entire company to support privacy and data security.
How companies are going to accomplish this -- and what kind of racetrack to build, specifically -- is another matter. If the internet were a movie, Harkins says, we'd be in the opening credits. We still don't know how things are going to pan out over the lifespan of the technology. Intel doesn't have a simple answer, but it's trying to get the conversation going. At a small luncheon in San Francisco this week, the company assembled experts on big data to talk about the issue and brainstorm ways to help consumers trust big companies with their data. "People want large-scale analytics," Danny Weitzner, co-founder of TrustLayers and former deputy chief technology officer for internet policy in the Obama administration, told the panel. "But they're afraid and concerned their data is going to be misused. They're worried that they might have higher insurance rates because they didn't get 10,000 steps on their Fitbit, or they'll get an unfair price on a product they want."
So, what will get the public to start trusting companies with their personal data? Weitzner thinks it needs to go beyond the promises of individual companies. "I think the trust will come when people know there is a legal framework that is protecting them," he explains. That won't let companies off the hook, he says, but people need to know that there is a system in place and that the system is going to be mandated and followed. Transparency needs to be a default standard, not a choice offered to consumers. If such a system were in place, he says, last year's NSA shakeup might not have been such a big deal. "It was a real failure of the US government to not have found a way to be more transparent," he says. "We have to have a way to make sure that people have confidence that the rules are being followed -- what upset people is that they were all of a sudden being told that there are no rules, or that the rules were just being ignored!"
Intel's security day didn't end with a solution, but with a declaration of fact. "We have a lot of work to do," Intel's David Hoffman, the panel moderator, concluded. He's right; corporations have a long way to go before the general public will trust them to treat data ethically. Intel plans to hold more events in the future to further the conversation: Finding a balance between protecting user privacy and leveraging the power of big data just might be the internet's next evolution.