Fair digital interaction
Privacy is becoming more of a concern these days as people realize how their data is handled - profiles built in the background, data shared between organizations, and so on. I think there is an alternative to that model.
Let me set up a simple example to use throughout this post. You have an ID in your pocket and would like to buy some hypothetical alcohol. The store has to check whether you are above a certain age - 18 or 21, depending on where you live. That information has to be confirmed by a government authority, hence the ID.
Selective sharing
Consider this. You have information private to you - your ID - and someone who needs to process a small part of it - your age. The obvious way is to reveal your whole ID to the employee working at the store. This leaks information that is not relevant to the interaction. In digital systems, once that information has leaked you are done - the risk of uncontrolled replication is high. It will likely be stored in some database, sent off for analytics, or outright attached to your profile.
Now imagine if you could prove you're old enough to buy imaginary alcohol without showing your full birth date, address, and photo to every merchant. How would you do that? In a digital system that could be modeled as a structured query. The merchant sends a query to your "ID" in the form "Are you above a certain age?", and your device could look at your data and answer yes or no. Just kidding - of course there has to be some sort of verification.
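To make the idea concrete, here is a minimal sketch of that naive query-and-answer step in Python - all names and types are illustrative, and there is deliberately no verification yet:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AgeQuery:
    """A structured query the merchant sends to the holder's device."""
    minimum_age: int  # e.g. 18 or 21

@dataclass
class IdRecord:
    """Private data that lives on the user's device and never leaves it."""
    birth_date: date

def answer(query: AgeQuery, record: IdRecord, today: date) -> bool:
    """Answer the binary question locally, revealing only yes or no."""
    years = today.year - record.birth_date.year
    # Subtract a year if the birthday hasn't occurred yet this year.
    if (today.month, today.day) < (record.birth_date.month, record.birth_date.day):
        years -= 1
    return years >= query.minimum_age

print(answer(AgeQuery(minimum_age=21), IdRecord(birth_date=date(2000, 6, 15)), date.today()))
```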
Signature as a form of verification
Instead of simply saying "trust me, I'm older than 21!" - the way your typical company says "trust us, we don't sell your data" - you could prepare a document stating "to the best of my knowledge, I am older than 21 as of this date" and sign it, then go to the authorities and let them sign it too after checking your ID.
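In code, that doubly-signed statement might look like the following - a sketch using Ed25519 from the cryptography package, where both freshly generated keys are stand-ins (in reality the authority's key would be well known, and yours would live on your device or ID card):

```python
from datetime import date
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-in keys for the sake of the sketch.
user_key = Ed25519PrivateKey.generate()
authority_key = Ed25519PrivateKey.generate()

statement = f"To the best of my knowledge, I am older than 21 as of {date.today()}".encode()

user_signature = user_key.sign(statement)            # your signature
authority_signature = authority_key.sign(statement)  # the official "stamp"
```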
Nice, this looks somewhat solved - you have an official document for buying wine, stamp and all. Use it for every interaction with alcoholic beverage stores. Of course they might think you are crazy, but you have a solid privacy stance - you are selectively sharing only what's necessary.
In fact, digital systems can simplify this. Authorities sign your digital ID after every update to the information, and the result of each query issued to your device is signed by your own private key. Every response can then be verified as an authentic answer to a simple binary question.
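Continuing the sketch above, here is roughly what the merchant could check: the authority's signature over your public key (standing in for your signed digital ID) and your signature over the query response. Again Ed25519 via the cryptography package, with illustrative names:

```python
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

user_key = Ed25519PrivateKey.generate()       # lives on your device
authority_key = Ed25519PrivateKey.generate()  # the authority's well-known key

# The authority endorses your public key (a stand-in for signing the full ID).
user_public = user_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
id_endorsement = authority_key.sign(user_public)

# Your device answers the merchant's binary query and signs the response.
response = json.dumps({"query": "age >= 21", "answer": True}).encode()
response_signature = user_key.sign(response)

# The merchant, who knows the authority's public key, verifies both links.
try:
    authority_key.public_key().verify(id_endorsement, user_public)
    user_key.public_key().verify(response_signature, response)
    print("Authentic response from an authority-endorsed ID.")
except InvalidSignature:
    print("Verification failed.")
```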
Zero knowledge at the core
You might lose your ID too. Hopefully not, but things like that happen. Unlike copying a physical ID, replicating information in digital systems is easy, and that can come to the rescue here - although we need to protect the information somehow before replicating it.
This is a solved problem in the privacy space. Apply end-to-end encryption so the data synchronization service never learns what the information is. Easy! Now you have multiple copies of your ID - one at home, one carried daily, and one at work or something.
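As a sketch of that step, symmetric authenticated encryption (here Fernet from the cryptography package) is enough - the only requirement is that the key stays on your devices and the provider only ever sees ciphertext:

```python
from cryptography.fernet import Fernet

# The key is generated on the user's device and never leaves their devices.
key = Fernet.generate_key()
box = Fernet(key)

id_record = b'{"birth_date": "2000-06-15", "issued_by": "..."}'
ciphertext = box.encrypt(id_record)

# Only the ciphertext is handed to the sync service for replication;
# without the key, it is just random bytes to the provider.
# On another device holding the same key, decryption recovers the record.
assert Fernet(key).decrypt(ciphertext) == id_record
```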
Here, the service provider acts as a guardian of the user's data: they make sure it is replicated for redundancy and synchronized reliably. They can also protect the end user from malicious queries, attempts to gather information, and the like, as well as act as an identity provider.
The beauty of this model is that the service provider serves the user instead of hunting for precious data. They become a coordination, protection, and identity management layer rather than a data processing service.
Positive trust
I don't believe in zero trust. Life typically requires trust in something or someone; otherwise you might break down mentally. Moreover, it's practically impossible to implement technically - someone has to trust someone. In this case, you trust your service provider to store your (to them) random bytes and government authorities to confirm your age, while the alcohol store trusts the stamp on your one-time document stating that you are over 21.
Digital signatures are incredibly hard to forge, which makes this trust model reasonably sound. Why shouldn't trust flow in the other direction too, giving peace of mind to regular folks? Instead of companies saying "trust us with your data", users should say "trust me on the correctness of my data" - with clear boundaries and verification, of course. This model gives users genuine agency over their information: they decide what to share and when to share it. The tech is there; it just isn't widely applied in this context. That I don't understand. Perhaps companies have found the data harvesting model too profitable to abandon.