Storj Beta Update

Each shard is encrypted with a unique key. So if you wanted to make a decentralized cat-picture website, you would publicly post the locations and decryption keys for each of the shards. Of course, the application would handle all of this behind the scenes, and you would just have a website where you could scroll through cat pictures.
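
A rough sketch of that flow (illustrative only: the Fernet cipher, the shard size, and the manifest layout are my assumptions, not Storj’s actual format):

```python
# Illustrative sketch, not Storj's real implementation: each shard gets its
# own key, and "publishing" the file just means sharing the manifest of
# shard IDs plus their decryption keys.
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

def shard_and_encrypt(data: bytes, shard_size: int = 1024):
    """Split data into shards and encrypt each shard with a fresh key."""
    manifest = []          # what the publisher of a public file would post
    encrypted_shards = {}  # what the farmers actually store
    for i in range(0, len(data), shard_size):
        key = Fernet.generate_key()                        # unique key per shard
        ciphertext = Fernet(key).encrypt(data[i:i + shard_size])
        shard_id = hashlib.sha256(ciphertext).hexdigest()  # content address
        manifest.append({"shard_id": shard_id, "key": key})
        encrypted_shards[shard_id] = ciphertext
    return manifest, encrypted_shards

def reassemble(manifest, encrypted_shards) -> bytes:
    """Anyone holding the manifest can fetch and decrypt every shard."""
    return b"".join(
        Fernet(entry["key"]).decrypt(encrypted_shards[entry["shard_id"]])
        for entry in manifest
    )

manifest, shards = shard_and_encrypt(b"many cat pictures " * 100)
assert reassemble(manifest, shards) == b"many cat pictures " * 100
```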

Now, if the farmer hosting that data finds out they are hosting public data, I don’t really see that as much of an issue. I’m assuming you are pointing toward an edge case in the fictional country of Dogtopia, where cat pictures are illegal. In more civilized countries the farmer would be covered under safe-harbor-like provisions, and storing only an encrypted piece of an “illegal” file raises the question of whether you are storing the file at all.

This is where greylists come into play. Curators create a greylist with the IDs of all the publicly available cat pictures. A farmer who lives in Dogtopia can opt in to the greylist, and it will automatically remove those shards. Obviously, anyone outside of Dogtopia would not use that greylist.
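
A minimal sketch of that opt-in mechanic (the names here are made up for illustration and are not the Storj API):

```python
# Hypothetical sketch of an opt-in greylist: a curator publishes shard IDs,
# and a farmer who subscribes drops the matching shards locally. Nothing is
# removed network-wide; other farmers simply keep hosting those shards.
class Farmer:
    def __init__(self, stored_shards: dict):
        self.stored_shards = stored_shards  # shard_id -> encrypted shard
        self.greylists = []                 # each greylist is a set of shard IDs

    def subscribe(self, greylist: set):
        """Opt in to a curated greylist and purge the matching shards."""
        self.greylists.append(greylist)
        for shard_id in greylist & self.stored_shards.keys():
            del self.stored_shards[shard_id]  # dropped from this farmer only

    def accepts(self, shard_id: str) -> bool:
        """Refuse future storage contracts for greylisted shards."""
        return not any(shard_id in gl for gl in self.greylists)

farmer = Farmer({"cat1": b"...", "dog1": b"..."})
farmer.subscribe({"cat1"})                     # the Dogtopia greylist
assert list(farmer.stored_shards) == ["dog1"]  # only local removal
```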

tl;dr: Private data stays private. For public data, users get to choose according to their own morals and laws.

Additional points:

  • To prevent abuse, greylists are open and forkable.
  • Just because content is on a greylist doesn’t mean it gets removed from the network (that would be censorship). The contract would simply be renegotiated with another farmer.
1 Like

Indeed it was. Many like-minded people trying to decentralize the internet.

3 Likes

The example trivializes the issue.

The USA is not a civilized jurisdiction by many measures, nor are the UK and plenty of other places.

All governments will use other greylists, made as granular and arbitrary as they see fit, and those governments will hire the best consultants* to make sure that the greylists are obeyed, whether they are about tax evasion, or firearm ownership, or politically proscribed speech, or lack of a licence for a myriad of things, and on and on. A file-sharing system that includes such greylisting will not advance freedom one iota in the long run (only until the greylist managers catch up).

Indeed, the centralized API suggests that it will be not unlike Google handing over whatever data the police demand. The bureaucrats will demand the implementation of whatever some judge or official deems appropriate.

Absolute deniability is needed, or it’s just another trough for the state swine and private busybodies to slurp at.

* Lucrative work for someone… who better than the Storj people themselves?

1 Like

You give our oppressors too much credit. What you described works really well on traditional centralized systems. But in a decentralized system with thousands of unique hosts, it’s near impossible to censor.

Greylists exist to give power and protect the user. If EVERYBODY knows that a file is an ISIS video, then users should be able to remove it from their drives. It can’t be used by governments to censor data because of one simple fact: you have to identify the content to remove it. I can upload that content, and the system can recover deleted shards faster than they can be removed via greylists.
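
A back-of-the-envelope simulation of that claim (all rates below are made-up parameters, not measured numbers; they only illustrate the race between opt-in removal and repair):

```python
# Toy model: a shard starts with some replicas; each tick a fraction of hosts
# opts in to the greylist and drops it, while surviving holders re-replicate
# to hosts that have not opted in. With enough redundancy, repair wins.
import random

def simulate(hosts=1000, replicas=20, opt_in_rate=0.01,
             repairs_per_tick=5, ticks=100):
    holders = set(random.sample(range(hosts), replicas))
    opted_in = set()
    for _ in range(ticks):
        # Some hosts opt in to the greylist and delete their copy.
        for h in range(hosts):
            if h not in opted_in and random.random() < opt_in_rate:
                opted_in.add(h)
                holders.discard(h)
        # Repair: copy the shard to hosts that still accept it.
        candidates = [h for h in range(hosts)
                      if h not in opted_in and h not in holders]
        if holders:  # at least one surviving copy is needed to repair
            for h in random.sample(candidates,
                                   min(repairs_per_tick, len(candidates))):
                holders.add(h)
    return len(holders)  # > 0 means the content survived the greylist

print(simulate())  # under these parameters the shard survives comfortably
```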

Not if enough users are forced to apply the same greylists.

You say that it is protecting the user, which is a tacit admission that he needs such protection because there is a chance of discovery by a third party such as the government. If the penalties for harboring forbidden content are great enough (for example, being jailed over child-porn images that happen to sit in the storage area because the user failed to greylist them out), then of course everyone will have to have the greylists, most conveniently downloaded automatically from a cyber-nanny site licensed by the state. And why would it stop with child porn or “ISIS”?

1 Like

Unless open-source projects start accepting pull requests from government agents, automatic greylists are not happening. Because it’s open source and not centrally run, your scenario can’t happen.

Being open source has nothing to do with it. You said yourself the greylist is to protect the user. From what? The threat of draconian penalties. There is nothing to stop the widening of scope for such penalties, and automatic greylists are an obvious service that will allow the average user to comply.

1 Like

These types of provisions really are aimed at the service-provider level, not the customer level on which we mere mortals live. So an ISP or hosting company may have the protection of safe-harbour-type laws, but not those of us at home.

2 Likes

I think we have had this discussion many times on the forum. Some people have a different stance. As a matter of fact, you took @super3’s post out of context, because he did not say it was just about protecting the user, but to “give power AND protect the user”. Apparently to you, protecting has to do with penalties by a third party; in fact it is about the user himself. If I knew I stored content that I despise on my resources, I would definitely not want to give access to them. It is my individual decision to whom I give my resources.

Of course, if it’s private data I wouldn’t know anyway; that is part of the deal. However, it makes a (big) difference whether data is private or public. As soon as data becomes public it gains a social and political dimension: “storing” then leans towards “supporting”. It is not only fair that in this case users have a say in whether they want to support content, it is also a question of network stability: if on the SAFE network people can find out whether their vault stores a chunk of some particular public data, as was recently debated in this thread, they can also adopt greylists that will automatically restart a vault if chunks of listed public content are stored. This means that the network will experience repeated resets if people adopt greylists. It will probably cost the greylister a lot of Safecoin, but they’d probably not bother (I wouldn’t, for sure), because not everyone is in it for the money.
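
A minimal sketch of that vault-restart idea (everything here is hypothetical: SAFE vaults expose no such API, and the names are invented purely to illustrate the mechanic):

```python
# Hypothetical vault that restarts whenever a greylisted public chunk lands
# in its storage. Restarting only moves the chunk elsewhere; it forces the
# network to re-replicate, which is the repeated-reset effect described above.
GREYLIST = {"id-of-some-listed-public-chunk"}  # curated chunk IDs

class Vault:
    def __init__(self):
        self.chunks = {}     # chunk_id -> chunk data
        self.restarts = 0

    def store(self, chunk_id: str, data: bytes):
        if chunk_id in GREYLIST:
            self.restart()   # refuse by resetting; the chunk lands elsewhere
        else:
            self.chunks[chunk_id] = data

    def restart(self):
        self.chunks.clear()  # drop all local state, rejoin with a clean slate
        self.restarts += 1

v = Vault()
v.store("id-of-some-listed-public-chunk", b"...")
assert v.restarts == 1 and not v.chunks
```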

1 Like

So we’re back to:

https://forum.autonomi.community/t/what-if-i-dont-want-to-store-child-porn/4306/2?u=smacz

That’s what I thought, as long as it is possible for vaults to identify chunks of public data. Some see that as a problem, others don’t.

2 Likes

It is not really censoring; it’s regulating the use of your own resources. Rejected chunks are propagated to other vaults; you can call it a vote on content. Data is only “censored” throughout the network if all vaults reject it. And even then, the file could perfectly well exist on the network as private data.
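
A sketch of that propagation, or “vote on content”, idea (illustrative names, not a real SAFE mechanism):

```python
# A chunk rejected by one vault is offered to the others; it is effectively
# "censored" network-wide only if every single vault turns it down.
import random

class Vault:
    def __init__(self, greylist: set):
        self.greylist = greylist  # this vault's personal content policy
        self.chunks = {}

    def accepts(self, chunk_id: str) -> bool:
        return chunk_id not in self.greylist

def place_chunk(chunk_id: str, data: bytes, vaults: list) -> bool:
    """Offer the chunk around; True as soon as any vault hosts it."""
    for vault in random.sample(vaults, len(vaults)):
        if vault.accepts(chunk_id):
            vault.chunks[chunk_id] = data
            return True
    return False  # unanimous rejection: the only way public data disappears

vaults = [Vault(greylist={"isis-video"}), Vault(greylist=set())]
assert place_chunk("isis-video", b"...", vaults)  # the tolerant vault hosts it
```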

Instead of using “censorship” as a derogatory label, we should consider why people are opposed to it. In the end, censorship is nothing else but determining domiciliary rights with regard to content. Traditionally, however, these “domiciliary rights” were exercised by central entities at the behest of others (a certain [assumed] political body). I consider that the problem, not your right to have a say about what public content your personal resources are used for.

1 Like

I really don’t see your problem. There IS a counter-force: if your vault rejects certain content, other vaults will host it. This is NOT about majority rule, because even a small minority of vaults would ensure that your public(!) data remains on the net. So, no poof. I think it’s fair for your public data to come off the net only if no one wants to host it.

Btw, if you really think that “people are programmed to reject unpopular beliefs”, you should apply that assumption to yourself as well…

Anyway, this is probably getting off-topic, so I’ll leave it here…

1 Like

My point was simply that if you can know which “public”* chunks are being stored in your vault, then the path is open to making it mandatory that you exclude whatever is stipulated by the state. And the easier that becomes, the more arbitrary such mandatory exclusions become.

* …which can be anything accessible to more than one person, where one of the people is an agent or snitch.

EDIT: In practice it’s fine if Storj has that soft standard, but SAFE should aim for the more stringent standard of complete deniability. We are going to have market specialization anyway, so might as well acknowledge it.

5 Likes

How is that a soft standard? This is a mere question of political ideology and strategy. Of course, you can make up hypothetical scenarios in which people are forced to subscribe to a certain black/greylist, but that doesn’t take into account what such enforcement would mean for political discourse. I could also argue that states will ban distributed storage in general and force ISPs to block any sort of SAFE-related traffic. It really doesn’t help, imho. Authoritarian states will always find a way to lock people (or rather: the mainstream) in so they can’t access certain content and/or resources, but such an intervention would need to have legal character and would therefore (in contrast to the encroachments of certain security agencies) be visible to every citizen.

Of course, if you have a conception of the human being in which people are programmed and act without free will (i.e. like @Pierce), I can see how it makes sense to be sceptical, but to me that is some kind of double standard (“everyone is manipulated but me”).

Honestly, I wonder what that path to making filters mandatory would look like in execution. I personally see the merit of allowing users to take a stance on public content, not only because I think it’s fair to have a say on what public content is stored on your own computer, but because it allows us to deal with the argument that SAFE is going to be a hard disk filled with disgusting stuff. This is where it becomes a strategic question. But sure, different people, different opinions. And of course, since the final product isn’t there yet, we have a lot of time to discuss :wink:

I already stated it succinctly in my previous comment. The rest of your comment goes off on a tangent about politics that is irrelevant to the mechanism (if not to the motivation for being concerned).

If you have such a choice, then you are one step, down a slippery slope, from being forced to make that choice a particular way.

1 Like