Appendable Data discussion

We can attempt to make this as easy as possible. We’re looking (as ever) to improve the data APIs and wondering what sort of helper function might make sense on SAFE. A function such as createMessage, or whatever we call it, taking a WebId could potentially automate this, and so hopefully help on this front. (Although would ‘all’ apps ever do this? Or use RDF etc…? Probably not, but I think attempting to make it harder NOT to do it should help at least.)
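To make that concrete, here’s a rough sketch of what such a helper might look like. None of this is an existing SAFE API; the WebIdProfile and Message shapes are assumptions, purely to illustrate stamping new content with the author’s WebId by default:

```typescript
// Hypothetical sketch only: neither createMessage nor WebIdProfile is a
// defined SAFE API; this just shows a helper that attaches the author's
// WebId to a message so apps get attribution without extra effort.

interface WebIdProfile {
  uri: string;        // e.g. "safe://me.alice#me" (illustrative only)
  publicKey?: string; // optional key an app might use for signing
}

interface Message {
  author: string;  // WebId URI of the creator
  created: string; // ISO timestamp
  body: string;
}

// A helper like this could live in the data APIs, so any app creating
// user content records the originating WebId by default.
function createMessage(profile: WebIdProfile, body: string): Message {
  return {
    author: profile.uri,
    created: new Date().toISOString(),
    body,
  };
}

// Usage:
// const msg = createMessage({ uri: "safe://me.alice#me" }, "Hello SAFE");
```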

I’m not sure I follow here, @neo. Do websites need to be signed? Only if you want to prove consistent authorship, perhaps. But even if not, we should (as I understand the intentions thus far), with AD (ALD… whatever we’re calling it), be able to view a complete history of changes to the PNS Resolvable Map (if we go that route), so we can see if and what has changed there regarding domain resolution (and onwards to page changes etc.).

So I’m not sure that signing is needed, unless we want to prove that content was added by key X.

(Although if I’m missing something, there’s still scope for requiring various data in the PNS system, so things could be added if required)
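To illustrate the kind of history check I mean, here’s a minimal sketch. The MapVersion shape and field names are assumptions rather than the real Resolvable Map interface; the point is that auditing domain resolution falls out of the append-only version history on its own, with no signatures involved:

```typescript
// Hypothetical sketch: this is not the real Resolvable Map API, just an
// illustration of auditing an append-only history of a PNS entry.

interface MapVersion {
  version: number;
  target: string;     // where the public name resolved to at that version
  appendedBy: string; // ID/key that appended this version
}

// Walk every recorded version of a resolvable map and report when the
// resolution target changed.
function auditResolutionHistory(history: MapVersion[]): void {
  let previous: MapVersion | undefined;
  for (const entry of history) {
    if (previous && previous.target !== entry.target) {
      console.log(
        `v${entry.version}: target changed ` +
          `${previous.target} -> ${entry.target} (by ${entry.appendedBy})`
      );
    }
    previous = entry;
  }
}
```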


When you make data public, it is no longer yours to restrict. It is a public good.

For the scenarios you talk about, such as sharing data with friends or family, I suspect you would not make it public at all. Instead, you would create a new persona for said group and distribute both the public and private keys associated with it. Then everyone in the group could read and write data securely within the group.

Alternatively, you could just share the private key, so the content can be decrypted and viewed, but keep the public key secret. This way, trusted people can view your content, but only you can create it.

From a UI perspective, it would be just like a regular invite to a group and would be trivial to manage.

Of course, people could copy your content, give away the key(s), etc, but this is the same as with other social media apps too. Moreover, you could change your keys and redistribute to the group should new restrictions be applied (removing someone, for example).
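As a concrete illustration of that key-distribution model, here’s a minimal sketch using plain NaCl primitives (tweetnacl). SAFE’s own group/persona APIs aren’t defined here, so this only shows the principle: whoever holds the group key can read and write, and re-keying is how you change who’s in the group:

```typescript
// Sketch of the "group persona" idea using tweetnacl. This is not a SAFE
// API; it only demonstrates the key-distribution model described above.

import nacl from "tweetnacl";
import { decodeUTF8, encodeUTF8 } from "tweetnacl-util";

// 1. Create the group "persona": a symmetric key handed to each member
//    out of band (e.g. via an invite).
const groupKey = nacl.randomBytes(nacl.secretbox.keyLength);

// 2. Any member encrypts content for the group before storing it.
function encryptForGroup(plaintext: string, key: Uint8Array) {
  const nonce = nacl.randomBytes(nacl.secretbox.nonceLength);
  const box = nacl.secretbox(decodeUTF8(plaintext), nonce, key);
  return { nonce, box };
}

// 3. Only holders of the key can decrypt; everyone else sees ciphertext.
function decryptFromGroup(
  box: Uint8Array,
  nonce: Uint8Array,
  key: Uint8Array
): string | null {
  const opened = nacl.secretbox.open(box, nonce, key);
  return opened ? encodeUTF8(opened) : null;
}

// "Removing" a member means generating a new groupKey and redistributing
// it to everyone still in the group, as described above.
```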

What we need to realise is that public on SAFENetwork is public in an all-encompassing way. If you make something public, you lose ownership and control of it. Therefore, other ways to share with groups (like the above example) would be better where this is a concern.


Again, you are applying an ideological argument to a practical problem. Do you really think people will accept a network where they can mistakenly upload something into a publicly viewable space that can never be deleted? I’m asking these questions because I think people often get so caught up in their ideology and technical details that they forget to think about the everyday Joe, the people who would really take this from a cool idea to a worldwide ubiquitous network.

Think of a person in an oppressed country that needs to stay anonymous for their safety. If they accidentally publish something that identifies them, with no recourse to remove it, they could put themselves in danger. We are talking about billions of uneducated and not very techy people all over the world. THEY are the important ones, not us savvy technical users.


How do you stop someone from putting the same thing on BitTorrent or IPFS?


I’m not sure I understand your point. Those are file-sharing services, meant for that purpose. It would be relatively difficult to accidentally upload something to either one of them, particularly BitTorrent. I assume people who start using them understand what they are using them for. If SAFE Network is aiming for the usability of BitTorrent, I’m going to go ahead and sell all my coins now.

SAFE Network will host apps, presumably replacements for things like Facebook, that everybody will use to interact with each other. If someone clicks a wrong button and accidentally uploads a file to an app they didn’t mean to, they have lost control of their data, which I feel is the opposite of what the network should be aiming for. If someone happens to capture that data and republish it before they delete it, then bad luck, but at least with a deletion mechanism there is a chance someone could rectify a potentially life-altering, or at least horribly embarrassing, mistake. I’m just afraid implementing this will stop the network from ever gaining traction with the average user.


They are relatively censorship-free and can store data whether people like it or not. A bully may upload a video, for example.

If we are just talking about apps, they would be unlikely to let people publish public data by accident. They would direct users to private (encrypted) groups, etc.

Actions have consequences. You can’t undo the past when others have witnessed it. I may regret saying or doing something and would like to rewrite history, but in this reality we cannot. If I am caught doing something illegal, erasing it isn’t an option. Public immutable data will be similar; it takes persistence of data a step forward and people will have to respect that. If they do not want to endanger themselves in this way, they should use apps that don’t share their stuff as public data.

I think you are misunderstanding me. I’m not trying to push for censorship or for stopping someone from uploading something, but for a user’s control of their own data. This is essentially what SOLID stands for as well, which is being heavily integrated, so the sudden ideological shift has me confused, to say the least.

Apps can certainly easily allow you to publish public data by accident. Uploading a profile picture to a Facebook or Instagram replacement as an example. Select the wrong picture when you aren’t paying attention? I guess your grandma will see that picture of your hairy balls you sent to your doctor for a consult. You seem to wholly underestimate just how nontechnical people can be.


Or they can group together and fork SAFE to make a version that suits their needs.

Facebook and Instagram on SAFENetwork can be apps which don’t write public data, as I described above.

If I accidentally send a picture message of my balls somewhere unintended, I am also out of luck.

Indeed they can. God speed to them.

Don’t underestimate the power of ‘illusion’. People want to be in control, or at least to have the feeling of being in control. Have you never pressed that elevator door-close button?
https://edition.cnn.com/style/article/placebo-buttons-design/index.html


Good point.

Also, publicity is not just that “everyone knows”; it is also about the authority of the platform providing that publicity. It is different to have a rumour that I have hairy balls than to have the photo on my Facebook, where I validate with my own authority exactly how hairy they are.


Security through obscurity is not security at all.

And similarly, making the history of ownership depend on the APP playing nice is not universal perpetual data, as required by the fundamentals.

No, they don’t need to be signed, and that is the whole point.

Without web pages having a history of ownership, I can create a very damning (lies, of course) set of pages and then change their ownership to you.

Now, in those pages I write as if you wrote them, using your name etc. Then I change the ownership of the ADs to you. Without history, in 5 years people will assume, understandably, that you wrote them. BUT if the history of AD ownership is kept, then the truth of the matter comes out. And that is what the fundamental is aiming at.

It does not matter whether the entity behind the original owner ID is known or not. It just allows the truth to be known that it was not your normal ID that did it, and so it is unlikely to have been you, since it is hopefully out of character for you to publish such things.
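To show why a retained ownership history settles this, here’s a rough sketch. The OwnerEpoch shape is an assumption, not the actual AD structure; the point is the check itself: you can always see which owner key held the AD when a given version was appended:

```typescript
// Hypothetical sketch: field names are assumptions, not the real AD layout.
// It shows how an ownership history lets anyone determine which key owned
// the data at the time each entry was appended.

interface OwnerEpoch {
  ownerKey: string;    // key that owned the AD during this epoch
  fromVersion: number; // first data version appended under this owner
}

// Returns the owner key in control when `version` was appended.
// Assumes `history` is sorted by fromVersion, ascending.
function ownerAtVersion(
  history: OwnerEpoch[],
  version: number
): string | undefined {
  let owner: string | undefined;
  for (const epoch of history) {
    if (epoch.fromVersion <= version) {
      owner = epoch.ownerKey;
    }
  }
  return owner;
}

// If the damning pages were appended at versions owned by key A, and the AD
// was only transferred to your key B afterwards, the history exonerates you.
```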


I’m not entirely sold on this argument. I would like to avoid lying to users, even if it’s a lie* that’s “comfortable” for them and what they want to hear. Ethical issues aside, misleading them could lead them to misuse the network and make the same blunders they made on the internet. It’s like the message you get when you open Tor: the software isn’t enough, your habits and mentality have to change as well if you want the full benefits that the software provides.

*Maybe “lie” is too strong a word, but in any case, I think we want to avoid users losing trust in the network. E.g.: “You told me I could delete my data, so why is it that insurance company X still has my medical records, and my friends still have my nude selfies?”


Yeah, sounds more like a RISK Network (Regret Infinitely Single Klumsiness) or TRAP (These Repercussions… Ad Perpetuum).


It’s not really a lie, though. It’s a guarantee that you have the ability to delete the data you uploaded. Perhaps someone else already grabbed it and will hold onto it in perpetuity. That’s a bummer, but it’s pretty likely they will not re-upload it into a public space, as I doubt most archivists will want to pay the SAFE Network fees for uploads. There’s still a chance no one grabbed it, and you can just delete it, no harm done. So it is a feeling of comfort, that “out of sight, out of mind” sort of feeling. It is unlikely the user will ever know if someone grabbed their data before deletion, so most can go about their day with a little less worry. At worst, they were at least able to pull it off a public location directly tied to their account, so it may be harder to associate the item with that person.


The data hoarders can just store data locally for pennies and continue to data mine it etc., though obviously to a limited extent compared to the internet.

This argument is not very convincing. I’m far more worried by the accidental-uploads concern you raised above.


I disagree: archive.org “has an annual budget of $10 million, derived from a variety of sources: revenue from its Web crawling services, various partnerships, grants, donations, and the Kahle-Austin Foundation.”

Should history be kept by those who pay to keep it, or should it be kept by default with everyone fully aware of that fact (and behaving in ways that account for that)?

I understand mistakes get made. Maybe that’s a good case for an in-between hot-storage layer, with SAFE as your cold-storage layer (just spitballing here). I don’t think there’s any good case for adding a delete function to the only network that offers truly permanent storage. The only network. Permanent.

Bitcoin can be used to buy drugs so we’d better change it… it’s the same bad argument.

Same thing with ownership vs authorship and switching ownership. If everyone knows that it’s trivial to change ownership without their permission then it doesn’t matter. Nobody assumes the owner is the author. That’s just a fact of how it works and anyone making that argument would be immediately discredited.


But it still breaks the perpetual data model as presented in the fundamentals and in the talk David recently had with @fergish.

Also, researchers years down the track may not be able to know what is attributable to which person, because the mob on one side of the story (the victors) can write articles as if they were written by the losers, and actually rewrite history through the sheer volume of articles they can produce. This is what we have today with books about wars in the past: we only end up with the victors’ side of the story being written, and supposedly the losers saying similar things, reinforcing the falsehoods.

If the original author of these articles can be established (part of perpetual data), then the truth is easy enough to establish and the trustworthiness of the articles determined.

And this does not prevent publishing anonymously, since the author can use a throwaway ID.
