The brief mention of differential privacy at WWDC earlier this month seemed to be in line with Apple's image. When senior VP of software engineering Craig Federighi talked about the ways in which the company would continue to prioritize user privacy, he indicated that the privacy technique would improve Apple's predictive services while keeping user identities safe. According to a Recode report, the company has now said that its differentially private algorithm will come with an opt-in feature.
Even though this particular data-gathering method protects users in theory, it will ultimately be up to each user to participate when the algorithm is introduced with macOS Sierra. The technique, a well-established mathematical process long employed by surveyors and statisticians, is expected to improve Apple's text, emoji and deep-link suggestions. In addition to the clarification about the privacy technique, the company also said that images in a user's cloud storage are off-limits and are not being used to train and improve image recognition algorithms. While Apple's precise image-analysis practices and other AI training methods are not known, the company is clearly stepping up its predictive algorithm game.
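Apple has not disclosed the exact mechanism it will use, but the classic surveyors' version of the idea is randomized response: each user answers a sensitive yes/no question truthfully only some of the time, injecting coin-flip noise that hides any individual's answer while still letting the aggregate rate be recovered. The sketch below is purely illustrative of that textbook technique, not Apple's actual algorithm; the 30% rate and population size are made-up values for the simulation.

```python
import random

def randomized_response(truth: bool) -> bool:
    """Report the truth on a heads; on a tails, report a fresh coin flip.

    An individual report reveals little: whatever the true answer,
    the reported answer is "yes" with probability either 3/4 or 1/4.
    """
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(responses: list) -> float:
    """Invert the noise in aggregate.

    P(report yes) = 0.5 * true_rate + 0.25, so solve for true_rate.
    """
    observed = sum(responses) / len(responses)
    return 2 * observed - 0.5

# Simulate 100,000 users, 30% of whom truly have the sensitive attribute.
random.seed(0)
population = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(t) for t in population]
print(round(estimate_true_rate(reports), 2))  # close to 0.3
```

No single report can be traced back to a user's true answer, yet with enough participants the aggregate estimate converges on the real rate, which is exactly the trade-off that makes the technique useful for improving suggestions without profiling individuals.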
Read more about differential privacy from the co-creator of the technique here.