Dealing with horrific content or something

Yes, I think that is what we all need (including me): much more info.

It cannot be one body but a consensus of many bodies and absolutely not governments deciding.

This is not about identifying any person. It’s only about particular files (unless they are encrypted in some way that is not self-encryption, etc.).

I am sure we all do; it’s a prime reason for the project’s existence.

Node operator, i.e. the human running the node. To ban a chunk, all nodes need to agree to do so.

I don’t think those folks will ever be satisfied and they will 100% push for greater control. But we don’t want them to be deciding this stuff in any way.

I agree, it could.

We won’t in any way.

100%

We don’t intend to take a lead from any gov body in terms of what the network does. This is a much wider body of bodies that we trust to have identified abhorrent (extreme) data, and that is agreed by many separate bodies across the world.

In no way would any of us trust a single gov to do this, or a union of govs (5 eyes etc.) as that would defeat the whole network.

True, I hope we never get there though as single clients would be weaker if they show they can be controlled. It’s all a balance.

3 Likes

That actually does sound OK from the censorship point of view, but it creates hell for data integrity checks and other network health checks and malice detection. Not a criticism, just thinking out loud.

1 Like

Any level of censorship is not acceptable for this project. It puts the whole integrity of the network in jeopardy. The only acceptable way to reduce what people can see is with filtered browsers / apps built on top that only show content that is appropriate for the user.

If you introduce a mechanism for removing certain kinds of data, confidence is lost that the data on the system is SAFE. I thought nodes by design wouldn’t know what kind of data they were storing? Even if that data is public?

If the proposed kind of censorship happens, I expect the network will fail, or it will be forked to provide zero censorship.

I think we can all now see how important it was, the way Bitcoin was invented: introduced to the world quietly, without a company or human face representing the project. With the best will in the world, a centralised company trying to create a decentralised system will come up against legal challenges that jeopardize the decentralisation of the project.

I hope we find another way to create a censorship-free, private, autonomous, SAFE network.

If companies, governments, nodes or any humans get to say what gets removed from the network, they will find a way to manipulate it to serve their own needs, and we will end up with a similar system to what we have now.

14 Likes

It’s a very reasonable view. This is what we need to keep looking at.

Consider, though, governments banning Safe nodes and imposing heavy penalties for running them. It’s all a mess; we just need to provide a Safe network for everyone that protects humans from all types of controls.

We do need to live in and understand the world as it is, and as it is changing, as well.

If nodes could censor data (they can) and did so, then other nodes may vote them off. So it’s not a simple unilateral decision for a node to do this in any form.

5 Likes

If it is necessary, then MaidSafe will stand behind an official, crippled SafeCensoredNetwork, while the community will use a forked SafeNetwork with all the other updates MaidSafe makes. There should even be a reward mechanism to support MaidSafe and all the devs in their work, the same way as was designed.
Then there would never be content anyone could sue you over.

2 Likes

Everything is on the table here.

But just like when China banned Bitcoin miners, those farmers will relocate to another country if the reward for running a node is good enough.

Also, how would a government be able to locate a node? Won’t this be hidden from them and everyone?

1 Like

To be truly decentralised it needs to have the robust nature that bitcoin provides.

It will come under many attacks, but it has to be able to survive them.

3 Likes

Interesting discussion, but just to throw an additional consideration out there:

Would node operators be willing to look at horrific content all the time? Consensus requires them to, but I’m not sure they would do it. Could you end up either with not many people willing to run nodes over the long term, or with 99% of node operators just automatically accepting recommendations put to them because they don’t want to look?

1 Like

Also the node operators don’t have anonymity the way normal users do, right?

If moderation is part of a node operator’s job, could they be held legally responsible for not censoring content their government deems illegal?

1 Like

This is quite vague. You use the word “won’t” … but I suspect in reality this means “can’t” … If the qualifications for being a node are onerous (in both extra work for the machine and extra work for the node operator to comply) then the network will become highly centralized … and the rich and powerful get richer and more powerful.

I would argue, however, that by definition such are ‘governing’ bodies … and while they may not be current ruling governments, they would wield considerable political power (and the ability to shape history and the perception of reality is the most powerful of political tools) and hence could become the de facto government. It’s not what you create, it’s what it may become.

edit: Isn’t all of this going to require a rather large amount of time, effort & capital? I can’t imagine such a system of consensus bodies being in existence pre-launch (beta). Wouldn’t this require additional VC money and time to get all the ducks in a row before launch?

2 Likes

I don’t think nodes are cloaked (by an extra hop) anymore (performance issues). But a node could use its own proxies to obfuscate its location.

It was stated earlier that they would use pattern matching with hashes of known ‘bad’ content. So node operators wouldn’t be looking at it, but the machine would be taking on the extra burden of scanning everything.
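For what it’s worth, here is a rough sketch of what that matching step might look like, just to show the kind of extra work the machine would be doing. It is purely illustrative: it assumes the list is plain SHA-256 digests of whole files, whereas real agency lists may use perceptual hashes, and the names here are made up.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad files (hex strings).
# Real agency lists may use perceptual hashes rather than exact hashes; this
# only illustrates the matching step, not any actual list format.
BLOCKLIST = {
    "0" * 64,  # placeholder entry
}

def file_sha256(path):
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def is_flagged(path):
    """True if the file's digest appears on the blocklist."""
    return file_sha256(path) in BLOCKLIST
```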

1 Like

Ah, I missed that, sorry. So it’s not really consensus from the nodes then; they are just implementing a decision which has already been made. It’s whoever is defining the known bad content who is the moderator. Is that the international bodies that were mentioned? It all depends on whether they can be trusted then?

2 Likes

Once this mechanism is in place, the node operator’s government can pass laws to force further compliance, or penalties for non-compliance, thus removing the option to not accept the list and use it in full. Yep, that is exactly what the Australian Government will legislate. And if anyone remembers the crap Senator Conway tried to bring in, we’d have dog grooming businesses’ content blocked, and other stupid things, along with the laundry list of government censorship; and if we don’t comply, then up to 10 years’ jail is the current thinking on computer content crime.

Our government, as well as others, ignores realistic things, with one Prime Minister (who ran an ISP in the past) saying that the law overrides any encryption/network/program operation and the laws of mathematics. (Yes, I know it was that stupid, but they believe it.)

Correct, and as per above, any system implemented at the node level will only enable legislation forcing node operators in Australia to implement their list of CSA material and their political censorship (government leaks: jail time here if you have them available, i.e. on a node).

Also, any list has to be at the file level, since working at the chunk level means someone has to have the original material, self-encrypt it, and make a new list.

Once the mechanism is there, our government will force it, with jail time if it is discovered you didn’t comply. It may not happen straight away, but it will once the legislation is passed.

Our government will just legislate that node operators in Australia have to comply, no matter what any consensus says.

At this time no supplier of disks is required to censor data being stored on them. Safe is a MAID and an extension of disk storage. Operating system suppliers are not required to censor data, and any services running using the protocols are less than an operating system. I do feel Safe is in between disks and operating systems.

So the solution really needs to be at the application layer, not in the core/protocol layers the way I see it, which also does not destroy the original goal of forever data.

It could even be as simple as the standard upload tool doing the scan, since that is basically the application layer that does the self-encryption and upload. Then it’s not a node operator problem, and it becomes the user’s problem if they use an alternative upload system in their client.

Remember, the lists agencies have are hashes of the content (files), not of chunks, so chunk-level censorship is just the wrong layer to be doing it at. Files are at the client/application layer.
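To make the layer argument concrete, here is a toy sketch (hypothetical; the real self_encryption crate works differently) of why a file-level hash list can only be checked at the client, before self-encryption: the chunk addresses the nodes actually store bear no relation to the whole-file hash on any agency list.

```python
import hashlib

CHUNK_SIZE = 1 << 20  # 1 MiB, arbitrary for this sketch

# Hypothetical file-level blocklist (hex SHA-256 digests), as discussed above.
FILE_HASH_BLOCKLIST = set()

def sha256_hex(data):
    return hashlib.sha256(data).hexdigest()

def toy_self_encrypt(data):
    """Toy stand-in for self-encryption: split into chunks and XOR each chunk
    with a key derived from its own plaintext hash. The real algorithm is
    different, but the consequence is the same: the addresses (hashes) of the
    stored chunks are unrelated to the hash of the original file."""
    chunks = []
    for i in range(0, len(data), CHUNK_SIZE):
        plain = data[i:i + CHUNK_SIZE]
        key = hashlib.sha256(plain).digest()
        cipher = bytes(b ^ key[j % len(key)] for j, b in enumerate(plain))
        chunks.append((sha256_hex(cipher), cipher))  # (chunk address, chunk body)
    return chunks

def client_upload(path):
    """Application-layer upload: scan the whole file *before* self-encryption,
    because afterwards only chunk hashes exist and the file hash is gone."""
    with open(path, "rb") as f:
        data = f.read()
    if sha256_hex(data) in FILE_HASH_BLOCKLIST:
        raise ValueError("file matches a blocklisted hash; refusing to upload")
    return toy_self_encrypt(data)
```

A node holding one of those chunks can never recompute the original file hash from it, which is exactly why the check only makes sense at the client.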

At the moment the client and Adults know each other’s IP address.

5 Likes

An alternative proposal.

Supposing that Maidsafe and/or the new Foundation in Switzerland is concerned about being sued down the track for the network’s activities … I’m guessing that is the reason for these proposed consensus bodies …

Then the solution may be that Maidsafe and the Foundation simply do not launch the network themselves.

Instead, bring the code up to a solid pre-launch level, then ‘someone’ can write up a paper on how to launch the network and publish it independently or if possible anonymously.

We the people of the future Safe Network, then launch it ourselves.

Maidsafe then just goes on to work on apps for the network. And maintenance for the network becomes organic and perhaps managed by a DAO.

10 Likes

Agree. Trying to change the world for the better by asking the overlords of the current system for approval is ridiculous. They are the ones who have created the mess we are in and need bypassing. I understand that brings risk to the developers and everyone will have to make their own risk assessment.

That’s because you have never searched for it. Only the people who want to view such things find it, the same way they do on the current internet. If people want to break the law, they will find a way. This system benefits far more people in a good way (even if they don’t know it) than it harms.

I’m devastated we are even contemplating this to appease the

6 Likes

I’ll happily be that someone :+1:t2:

6 Likes

The endpoints are the weakest link, and that’s OK. It’s the responsibility of the human using their client/endpoint to deal with it. Just like the onramps and offramps to fiat can be used to address the AML/KYC concern, the endpoints are the only places where the rubber stamp of censorship or a mandate can technologically/physically meet the road. Data chunks are just random noise, except when decoded at an endpoint (i.e. an information onramp/offramp). Abhorrent or any other content can only exist at a client endpoint when it is decoded and intelligible. If this is not the case, then there is a serious bug report to file on GitHub.

Good technology has no need for a wider body of bodies either. The whole premise is ridiculous. Sincerely.

7 Likes

Some here are inspired by “nature” and like animal allegories. Let me put it this way:

The only way you can make a hungry crocodile not try to bite you is to make it clearly see that it cannot reach you.

If Maidsafe has lost its nerve, I completely understand. This was always a dangerous project. I wouldn’t go up against Los Zetas or any such organization. I fear and dislike them but, in a way, I do respect them. At least they’re honest about what they do, just like crocodiles.

This, by the way, is a very good series.

1 Like