Image credit: Getty

Apple's differential privacy algorithm will require you to opt-in

But theoretically, you should reap the benefits regardless.


The brief mention of differential privacy at WWDC earlier this month seemed to be in line with Apple's image. When senior VP of software engineering Craig Federighi talked about the ways in which the company would continue to prioritize user privacy, he indicated that the technique would improve Apple's predictive services while keeping user identities safe. According to a Recode report, the company has now said that its differentially private data collection will be opt-in.

Even though this data-gathering method protects users in theory, participation will ultimately be up to each user when the algorithm arrives with macOS Sierra. The technique, a well-established mathematical approach employed by surveyors and statisticians, is expected to improve Apple's text, emoji and deep-link suggestions. In addition to the clarification about the privacy technique, the company also said that images in a user's cloud storage are off-limits and are not being used to feed and improve image-recognition algorithms. While Apple's precise image-analysis practices and other AI training methods are not known, the company is clearly stepping up its predictive algorithm game.
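Apple hasn't published the exact mechanism it uses, but the classic illustration of the idea behind differential privacy is "randomized response," a survey technique in which each participant adds random noise to their own answer. No individual's reply can be trusted on its own, yet the aggregate statistic can still be recovered. A minimal sketch (the function names and the 75 percent truth-telling probability are illustrative choices, not Apple's):

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p; otherwise report a fair coin flip.

    Any single response is deniable: a "yes" may just be the coin.
    """
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(responses: list[bool], p: float = 0.75) -> float:
    """Recover the population rate from noisy responses.

    Expected observed rate = p * true_rate + (1 - p) * 0.5,
    so true_rate = (observed - (1 - p) * 0.5) / p.
    """
    observed = sum(responses) / len(responses)
    return (observed - (1 - p) * 0.5) / p
```

With enough respondents the estimate converges on the real rate even though every individual answer is noisy, which is the trade-off the article describes: the aggregate service improves while no single user's data is exposed.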

Read more about differential privacy from the co-creator of the technique here.

Mona is an arts and culture journalist with a focus on technology. Before moving to New York City for a masters program at Columbia Journalism School, she was the associate editor of Platform magazine in Delhi, India. She has covered dance music extensively and is a proponent of drug policy reform. On weekends, when she’s not watching post-apocalyptic films, she spends hours contemplating life as a Buddhist.
