So the other data isn’t encrypted, right? If the identities are not encrypted, couldn’t that lead to de-anonymisation? Could the client request the names of all the people the data belongs to? Or could the client ask “do you have the medical data of patient X?” and get a binary answer? And then, “does patient X have disease Y”?
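To make the concern concrete, here is a toy sketch (all names and functions are hypothetical, and no zk-SNARK machinery is involved) of how a client restricted to binary yes/no queries can still learn sensitive facts when identities are queryable:

```python
# Hypothetical server-side records: assume the payloads are stored
# encrypted, but the server can still evaluate predicates over the
# plaintext it holds and return honest binary answers.
RECORDS = {
    "alice": {"diabetes"},
    "bob": {"asthma"},
}

def has_patient(name):
    # "Do you have medical data of patient X?" -> yes/no
    return name in RECORDS

def has_disease(name, disease):
    # "Does patient X have disease Y?" -> yes/no
    return name in RECORDS and disease in RECORDS[name]

# A client who may only ask these two binary questions still learns a
# diagnosis, without ever decrypting anything:
if has_patient("alice") and has_disease("alice", "diabetes"):
    print("alice has diabetes")
```

The point of the sketch is that encrypting the payload does not help if the query interface lets a client link an identity to an attribute one bit at a time.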
Right now I work for a company that has a significant database of privacy-sensitive unstructured data. Because identities (names) can occur anywhere in the data, it is non-trivial to anonymise it properly. For that reason, company policy forbids the use of cognitive systems to analyse this unstructured data. I am curious whether zk-SNARK algorithms could be used to design a system in which privacy is guaranteed while still allowing such analysis of this data.
I am far from knowledgeable about zk-SNARKs. Such algorithms don’t seem like a bad fit for your company’s problem in theory, but I don’t know how feasible it is to run them on unstructured data.