You are right. But, as I have said before, if I delete something (Facebook, a website…) I'm often not looking for an absolute solution, but for a meaningful decrease in accessibility. That possibility would be greatly lessened if the history were open to everyone. If I delete my Facebook, it is no longer accessible to my friends, and that is enough for the goals I have in deleting it.
But I have to admit that I don't understand the technical side of all this nearly well enough. Maybe there could be a Facebookish app on SAFE that doesn't allow others to read "deleted" profiles. And while reading them would still be possible with another app, the lessening of accessibility could be reasonable enough for most non-critical cases? (Though I would still like to have the possibility to delete public data.)
yeah, both should be allowed: ADs (lower creation/update costs due to better cacheability, and maybe as an incentive) for public websites with a "time-travel" & diff feature, and MDs for more personal stuff, like social profiles, comments…
ADs could also count into some sort of “trust score” of a website.
You could then opt into using ADs for such content? But it's a complex topic…
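To make the "time-travel" & diff idea above concrete, here is a minimal sketch, assuming a hypothetical `AppendableData` modelled as a simple version list (the real network type would look different): every update appends a new version, so any past state of a site stays reachable and can be diffed.

```python
import difflib

class AppendableData:
    """Hypothetical stand-in for an append-only data type."""

    def __init__(self):
        self.versions = []  # append-only: versions are never removed

    def update(self, content: str):
        self.versions.append(content)

    def at(self, version: int) -> str:
        # "Time-travel": any past version remains readable
        return self.versions[version]

    def diff(self, a: int, b: int):
        # Diff feature: compare any two versions of the site
        return list(difflib.unified_diff(
            self.at(a).splitlines(), self.at(b).splitlines(), lineterm=""))

site = AppendableData()
site.update("Welcome!\nOld news.")
site.update("Welcome!\nFresh news.")
changes = site.diff(0, 1)  # shows the removed and added lines
```

A site backed this way can never silently rewrite its history, which is also where the "trust score" idea comes from: the full edit trail is auditable by anyone.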
So, if we are happy with that level, i.e. his social context, then app-level "deletion" should be enough, and that is fully possible within SAFE. So, for all social platforms that reference your data, you tell them not to reference it anymore.
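A minimal sketch of what app-level "deletion" could look like, assuming a hypothetical in-memory `ImmutableStore` as a stand-in for the network: the blob itself is never removed, the app simply drops its reference to it.

```python
import hashlib

class ImmutableStore:
    """Stand-in for content-addressed immutable storage."""

    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        addr = hashlib.sha256(data).hexdigest()
        self._blobs[addr] = data  # can never be overwritten or removed
        return addr

    def get(self, addr: str) -> bytes:
        return self._blobs[addr]

class SocialApp:
    """A Facebookish app that only shows data it still references."""

    def __init__(self, store):
        self.store = store
        self.profile_refs = {}  # user -> blob address

    def publish(self, user: str, data: bytes) -> str:
        addr = self.store.put(data)
        self.profile_refs[user] = addr
        return addr

    def delete(self, user: str):
        # "Deletion": the app forgets the reference; the blob stays put
        self.profile_refs.pop(user, None)

    def view(self, user: str):
        addr = self.profile_refs.get(user)
        return self.store.get(addr) if addr else None

store = ImmutableStore()
app = SocialApp(store)
addr = app.publish("toivo", b"profile v1")
app.delete("toivo")  # the app forgets; the network does not
```

After `delete`, `view` returns nothing, but anyone still holding the old address can fetch the blob: exactly the "meaningful decrease of accessibility" rather than absolute removal.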
Ah, as you also say further down:
This is the part that I think is very interesting, and that my brain is still chewing on. I am still not grokking it. I get the feeling that it should be possible, but I also get the feeling that, when extrapolating, I will need to create new accounts, since I run out of ways to practically reference things with the old ones, and need to put yet another layer above. And by doing that, I also get the feeling that I will need to duplicate data a lot to keep it feasible to access (i.e. keep taking snapshots of the streams, keep storing current state over and over, at new places in the streams that are closer for me to reach, as the streams grow).
Might be so. But I don't trust my imagination to be so good that I would (currently) dare say "probably", and then actually impose that limit.
I might be able to say "probably" (and feel that it is a good enough approximation) as a result of thorough dragon poking, and having no more energy left for poking dragons.
When it comes to waste of energy and storage, we are getting close to the discussion that went on a while ago about ever-increasing storage needs. With increasing storage capacity, it is actually a misconception that we would be increasing energy waste by keeping old data (since the greater capacity implies that it is not as costly, i.e. wasteful, to store that amount of data anymore). As capacity grows, what was once wasteful becomes simply a very small expenditure. In which case it is perfectly justified, since the other complexity - to avoid the extra storage - might be consuming more energy in the long run. Complexity is expensive.
Now, there is no end to the caveats, and storage capacity might not keep growing the way it has "forever" (i.e. not for as long as we would need it to for the above to hold).
So this referencing is also going to be appended public information? There would be public data appends as follows:
“Toivo hopes Facebookish to reference his data”
“Toivo hopes Facebookish to not reference his data”
So the genuine app would respect the latest state, but another app (Phasebook) could use the previous states?
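The question above can be sketched with a hypothetical `AppendOnlyLog` type (names are illustrative, not the real API): every state change is appended, a well-behaved app honours only the latest entry, while a rogue app (Phasebook) is free to read every previous entry.

```python
class AppendOnlyLog:
    """Hypothetical append-only log of public reference states."""

    def __init__(self):
        self._entries = []

    def append(self, entry):
        self._entries.append(entry)  # entries can never be edited or removed

    def latest(self):
        return self._entries[-1] if self._entries else None

    def history(self):
        return list(self._entries)  # the full history stays readable forever

log = AppendOnlyLog()
log.append({"user": "Toivo", "reference_allowed": True})
log.append({"user": "Toivo", "reference_allowed": False})

# Facebookish (genuine) respects only the latest state:
genuine_view = log.latest()["reference_allowed"]

# Phasebook (rogue) mines the whole log:
rogue_view = [e["reference_allowed"] for e in log.history()]
```

The genuine app sees "do not reference", while nothing in the data itself stops a rogue app from reading the earlier states.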
Edit (or append):
And yet another app (Completebook) would put together all the data I have ever published in the SAFE equivalents of Facebook, Grindr, Twitter, Foursquare, Google, Maps… At the moment somebody probably does that somewhere (if they have obtained the data), but on SAFE anybody could do it anywhere, because all this is open to everyone?
Maybe that is OK, as long as you have your own private Completebook profile showing what can be known about you based on your history. We should have a better idea of what kind of analysis can be run on our public data, and be able to run it ourselves.
Except, you can't do that with 100% immutability. Anything you uploaded and encrypted with a key can be decrypted and read with that key. Forever. There is no "changing the keys".
So, that is one of the main problems I stated: you can never change the key of stuff you have encrypted.
It is a pretty common practice to change keys once in a while. That simply cannot be done here.
The account you have created, with everything important you have, you can never change the password on. So, to decrease the risk, you'll have to (again… i.e. like now) have 1000 passwords, so that not everything is compromised when one password is.
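A toy illustration of the point (NOT real crypto, just a stand-in XOR stream cipher) of why key rotation cannot protect data already on an immutable network: rotating the key only affects future uploads, while every old ciphertext remains decryptable with the old key, forever.

```python
import hashlib
from itertools import cycle

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy stand-in cipher: XOR with a keystream derived from the key
    stream = cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR stream cipher is its own inverse

network = []  # immutable store: once appended, never removed

old_key = b"compromised-key"
network.append(toy_encrypt(old_key, b"secret diary"))

# "Key rotation": only data uploaded from now on benefits
new_key = b"fresh-key"
network.append(toy_encrypt(new_key, b"new secret"))

# Whoever holds old_key can still read the first upload, forever:
leaked = toy_decrypt(old_key, network[0])
```

The old ciphertext cannot be re-encrypted in place, because nothing on the network can be overwritten; all you can do is stop using the compromised key for new data.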
Well, if you have private data that needs to be private between 3 people, and then only 2, you cannot remove it. And that need - to block some access that once was possible - I think it's fair to say that people will have reasons to want that in the future as well.
This is correct and accurate both in this proposal and today. If anyone ever had access to data, then they always have access - but not to changes made after the key changes. An important point: if you see data, you cannot unsee it.
The same goes for this. When you tell somebody something, it stays told to them; if you exclude them from further convos, they cannot hear the new stuff. We cannot make them forget the things we did tell them, if that makes sense.
So, the thing is, when we have lots of data, giving someone a key doesn't mean they have read it all. And even if they have read it all, it doesn't mean they remember it all.
So, this introduces a limitation that doesn't exist today. How is an organisation supposed to give employees access to data that they are only supposed to have access to while employed?
Sure, in many cases they could have copied it. But it is still quite different to have possibly been able to copy it during a few months of employment, versus having guaranteed access to it forever.
That is just one such situation; there are probably more examples.
I have thought deeply about this and still cannot see how it is different, though. It is insecure to imagine they did not copy it all. In any case, I think it makes sense - take the employee who is blocked from access so he cannot expose the data, or who wants to use it in court to defend himself, and so on.
An example: I did an FOI request to our Scottish Enterprise (read: thieving bar stewards, corrupt and complicit in fraud at huge scale). I know folk that work there. Anyway, I was told the managers went around telling folk to delete anything with my name on it. The stuff I got back had email responses, but not the original questions (so rumbled).
I only say this to show both sides: one where we think stuff is not copied, and another where we know stuff cannot be withheld. The benefit of the latter is perhaps worth the possible cost of the former.
It is interesting. This freedom brings responsibility, and maybe it has to be forced?
Well, that example you had was one side, and probably not uncommon, but there can be plenty of cases where the perpetrator is the one who got access in the first place. Like when you got hacked: you want to change all keys as quick as hell. Just because you get hacked doesn't mean the hacker was 100% successful or competent, and you might still have time to save some things by changing passwords etc. Also, how can we think that we can imagine all the possible relationships and intricate ways we need to share - but only with a few, not the whole world - and how those groups would change and need to manage access? How can we judge that those situations must always be non-legit? I cannot dream of having such a good imagination about human interactions.
I think it is a very blunt way to steer things: limiting the ability to delete your own private data as a way to reach more fairness and honesty. Well, I don't know about that connection, or how well it will play out.
But, it can be argued of course that we want to try it out and see if it works that way. It could.
Replacing the existing internet with the same features - well, many of the same features will be there; it's just a matter of knowing which to throw out and which to keep. Not everything is so easily discernible. So, not everything is going to be scrapped, and we don't say it is boring and wrong in those cases where something is kept.
I agree, but I think we need to push our boundaries and, as you say, try different ways. We are not tied here at all, but it is an interesting path for sure. It is deep and does need huge debates. This forum is great for this.
(This thing is about immutable storage. It's immutable up until the point where the SSTables are merged, at which point they are scrapped. But no longer using the references to them would equate to "scrapping" them in an immutable SAFENetwork.)
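A loose sketch of that SSTable analogy, under the simplifying assumption that tables are plain dicts: compaction merges the immutable tables into a new one, and the old tables are simply no longer referenced. On an immutable network, dropping the reference is the equivalent of "scrapping" them.

```python
def compact(tables):
    """Merge immutable tables into one; later tables win, as in LSM compaction."""
    merged = {}
    for table in tables:
        merged.update(table)
    return merged

# Two immutable "tables"; the second holds a newer value for key "b"
old_tables = [{"a": 1, "b": 2}, {"b": 3}]

current = compact(old_tables)
# old_tables still exist untouched (immutable); we just stop referencing them
```

The old tables are never mutated by the merge, which is what makes the pattern compatible with an immutable store.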
It's just that "private" is not always exactly only me; it could be me and someone else, and then I can't delete it just by no longer having the map. So the distinction is muddy.
I currently see two ways:
One way is that private means only one person. The only way to delete the data, and be sure it is deleted, is to never share it and to forget the map.
And the other way is that it is always private to you if you uploaded it as private, until you explicitly set it as public - at which point it can no longer be set as private again. If you shared it while it was private, it might be copied, but you can still delete it; the one who saw it must actively take action if they consider themselves an owner, i.e. copy it. If they did not consider themselves an owner, they will not copy it, and in reality the data is still private, even though it was at some point shared.
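The second model can be sketched like this, under the assumed semantics above (the class and its rules are illustrative, not a real network type): data stays private and deletable until the owner publishes it, publishing is a one-way door, and a viewer keeps access only by actively copying.

```python
class PrivateDatum:
    """Hypothetical datum under the 'private until published' model."""

    def __init__(self, content: bytes):
        self._content = content
        self.public = False
        self.deleted = False

    def publish(self):
        self.public = True  # one-way: can never be made private again

    def delete(self):
        if self.public:
            raise PermissionError("public data is immutable")
        self._content = None
        self.deleted = True

    def read(self):
        return self._content

    def copy(self) -> bytes:
        # A viewer who considers themselves an owner must copy actively
        return self._content

note = PrivateDatum(b"between us")
viewer_copy = note.copy()  # the other person actively takes ownership
note.delete()              # the owner can still delete the private original
```

Note how this captures the muddy middle ground: the original is gone, yet the shared copy survives exactly because someone chose to keep it.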