At its last WWDC (its annual developers conference), Apple continued its strategy of making a lot of noise about user privacy. At the same time, the company is using machine learning to improve its services. To differentiate itself as a company that still respects privacy while processing large amounts of user data, Apple claims to use a technique called Differential Privacy. (See https://www.wired.com/2016/06/apples-differential-privacy-collecting-data/)
Differential Privacy, in my understanding, involves the client devices adding “noise” to the user data, which makes it hard to link the real data back to a person. It is supposed to be a more reliable method than anonymizing user data.
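To make the “noise” idea concrete, here is a minimal sketch of randomized response, one of the classic building blocks of local differential privacy. This is only an illustration of the general technique, not Apple’s actual implementation; the function names and the probability parameter are my own choices.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true value with probability p, otherwise a coin flip.

    Each client adds noise locally, so the server never sees the raw
    value, yet aggregate statistics can still be recovered because the
    noise distribution is known.
    """
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports: list[bool], p: float = 0.75) -> float:
    """Invert the known noise: E[reported] = p * true + (1 - p) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

# Simulate 100,000 clients, 30% of whom truly have some attribute.
random.seed(42)
truths = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(estimate_true_rate(reports))  # close to 0.30
```

The point is that no individual report can be trusted (any single answer might be a coin flip), but the population-level statistic survives. Real deployments tune the noise level to trade accuracy against how much any one user’s data is protected.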
The quality of the new services and features will become clearer over time, but the fundamental problem, of course, is familiar: transparency. We only have Apple’s word about the quality of its protection of user privacy. There is no independent method of determining how much privacy is actually being protected, and it is in Apple’s best interest to claim that its implementation protects user privacy to a high degree.
I expect that in the coming years other companies will likewise provide services that claim to protect user privacy, and I welcome that, but to take their claims seriously, their implementations need independent verification. Otherwise we are back to simply trusting a third party, which is exactly what techniques like Differential Privacy are supposed to guard against. That’s why open source services — which are open to scrutiny — are so important.
Right now there is only one open source company that is close to offering a similar service, and that’s Mycroft AI.
Their long-term goal is to create a strong general-purpose AI. Right now, however, their focus seems to be on IoT and natural language. They have released some of the software already and are going to ship some open hardware reference designs this year. Their current projects are here:
It is a very interesting project and a step in the right direction compared with closed source AI assistants. If successful, Mycroft AI should be able to offer what no one else is offering right now: transparency with regard to privacy. With Google, Microsoft, Facebook, and Amazon, you know your data is being mined with little regard for privacy; with Apple, they might be making the correct trade-offs (who knows). Our data, privacy, and freedom are threatened in all these cases. But being open source (LGPL and GPL, I think), Mycroft AI’s use of data is potentially transparent. If they say they are using differential privacy, for example, to protect user privacy, independent researchers can scrutinize that claim. We might then get a better idea of how much data we are giving up in order to get better services.