Will it be possible for users to keep data private but allow a machine learning algorithm to incorporate that data, or will machine learning only be possible with public data?
We haven’t ventured into that area yet, but I believe most of it would happen in the app layer. So, if you trust an app (e.g. open-source apps where the code is verifiable) and grant permissions to it accordingly, it should be possible to do with private data.
It seems that trust in the app will be quite important, as otherwise the app to which I give permissions may be able to copy or otherwise expose my private data. Or is that somehow prevented by the network?
This will likely always be the case, unfortunately. It’s like bitcoin wallets: you do need to trust them, as they can empty your coins.
Homomorphic encryption is coming along quickly these days, so permissions can become much more granular.
So we take all this into “account” by saying apps should never need you to have “an account”.
tl;dr: apps can steal data, but greater granularity in data policies, combined with homomorphic encryption, helps.
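To make the homomorphic-encryption point concrete, here is a toy sketch of the Paillier cryptosystem, which is additively homomorphic: a server can add two encrypted values without ever seeing the plaintexts. This is purely illustrative (tiny primes, no padding), not anything the network itself ships:

```python
import math, random

def keygen(p=10007, q=10009):
    # toy primes for demonstration only; real Paillier needs ~2048-bit primes
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)              # valid because we use g = n + 1 below
    return (n,), (n, lam, mu)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:        # r must be invertible mod n
        r = random.randrange(1, n)
    # ciphertext c = g^m * r^n mod n^2, with g = n + 1
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    n, lam, mu = priv
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n    # the standard L(x) = (x - 1) / n step
    return (L * mu) % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 42), encrypt(pub, 58)
c_sum = (c1 * c2) % (pub[0] ** 2)     # multiplying ciphertexts adds plaintexts
print(decrypt(priv, c_sum))           # prints 100
```

The key property is the last two lines: whoever holds only `c1` and `c2` can compute an encryption of `42 + 58` without the private key, which is the sense in which an algorithm could "incorporate" data it cannot read.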
It is prevented by your permissions.
If you grant access to your private data, you need to trust the app you’re granting permissions to.
It’s your responsibility to make sure you only grant access to safe apps.
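As a mental model of what "prevented by your permissions" means, here is a small hypothetical sketch (the names `DataVault`, `grant`, and `read` are invented for illustration and are not the real Safe Network API): data stays with the user, and an app can only read keys it has explicitly been granted.

```python
from dataclasses import dataclass, field

@dataclass
class DataVault:
    # hypothetical model: user-held records plus per-app grants
    records: dict = field(default_factory=dict)
    grants: dict = field(default_factory=dict)   # app_id -> set of record keys

    def grant(self, app_id: str, key: str) -> None:
        """User explicitly allows one app to read one record."""
        self.grants.setdefault(app_id, set()).add(key)

    def read(self, app_id: str, key: str):
        """Apps can only read records they were granted."""
        if key not in self.grants.get(app_id, set()):
            raise PermissionError(f"{app_id} has no grant for {key!r}")
        return self.records[key]

vault = DataVault(records={"photos": b"...", "contacts": b"..."})
vault.grant("trusted_app", "photos")
vault.read("trusted_app", "photos")      # allowed
# vault.read("trusted_app", "contacts") would raise PermissionError
```

The caveat from the thread still applies: once an app is granted a record, it can copy what it reads, so the grant itself is where the trust decision happens.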
Enigma’s Secret Network does what you are looking for.
But it seems it will still be important for users to be vigilant in managing their data policies. Safe affords users greater control (and granularity) in that process, which is a big advantage.
The machine shall learn…