"What if I don't want to store child porn?"


#1

As I’ve been talking with others about the SAFE network, one concern keeps coming up that I can’t adequately counter or answer satisfactorily:

    "So, if files get split up and stored out on shared disk space that users are offering, what if one of those blocks [that gets stored to my drive] is a piece of kiddie porn? I'm not ok with that."

I’m not ok with it either, but I do understand that Secure Access For Everyone means for EVERYONE, even those with ill intent. I also understand that, given the breadth of content stored on the network, the odds of that circumstance occurring are extremely low.

So how have others handled this discussion point? My own answer is that the small chance is outweighed by the tremendous potential for the whole of humanity (even if that means some temporary disruption to concepts of power, governance, and commerce, which I think will be a pretty good thing).


#2

I’ve seen this question before on Reddit:

"Will Maidsafe just become a haven for all things nasty & nefarious that currently have a hard time hiding on the current internet?"

David’s answer:

We get this a lot. The really bad folks, as opposed to the mistakenly inquisitive, are already on the internet using a myriad of very-hard-to-set-up security and crypto tools. What MaidSafe does is put everyone on the planet on the same playing field in terms of security.
People who are evil will still need to be tracked down by other means, as they are now.
No need to ask forgiveness for the question; it’s exhausting for us because we answer it an awful lot, but it’s undeniably an important one, so no problem. Watch, though: we’ll now get completely off-track accusations on this thread calling us all sorts of evil people because we allow A or B or C. It’s tiring, and those people are obsessed and way off on one far side of the debate. It’s like politics or religion, I think :slight_smile:


#3

I do understand this has been discussed a lot on the forums already (regarding how to handle the presence of seedy content on the network). But the case in front of me is: how do you impress the overall benefits of the network on someone who is fixated on one small, disliked aspect of it?

I’m not asking how to resolve “how do we keep such content off the network?” (impractical), or “won’t this network be a haven for ilk?” (unreasonable), but rather: “how do we frame this situation (content I disagree with might be stored on my hard drive, not a server) for people considering it within an existing-world model?” (a personal concern about participating).


#4

I like to compare it to other infrastructure that can’t discriminate on who’s using it, like the roads or the water supply. Your taxes pay for these, and yet they are used by people who kidnap, steal, kill, harass, etc.

Now, if the road you pay for were used exclusively for these activities, you wouldn’t want to pay for it; but since these activities are completely dwarfed by all other legitimate use, you reason that it’s worth it.

It’s the same on SAFE. If the network is mostly filled with content I can’t agree with, I’ll stop supporting it. But if, as most everyone here suspects, the network brings much more good than bad, then it’s worth it.


#5

The same goes for Tor, Freenet, and probably others in the future. A lot of groups around the world are working on decentralized solutions for the internet using strong encryption. My hope is that SAFEnet will be so easy to use that we’ll see websites/apps like some sort of YouTube, Vimeo, Reddit, etc. Especially once people start using it like a Dropbox, chances are the amount of sick content will be extremely low on a percentage basis, because 99.9999999999% of people are using it for normal, good purposes.

Beyond storing, there’s another point to think about: your node will also route data, not only store it. I think we will see apps that block sick content, and that big video apps will have moderators who delete flagged content. If we have a SAFEtube and all the bad stuff gets deleted (and posters blocked), there’s no fun for them in trying to post bad stuff again. Here’s a read about an app that creates a block-list:


#6

SAFEnet is going to become a record of all that is human, both good and evil. For a long time humanity has worked, in vain, to eradicate evil by destroying the evidence of evil’s existence. This will no longer be possible in the future. Instead we will need to address the source of the evil and stop wasting our time trying to hide it from ourselves. This is going to be hard. It’s a really tough thing to come to terms with, and a lot of people aren’t going to like it, but that is the reality we now face, and I for one think it’s a good thing.


#7

Ask them this: Should a car salesman not sell cars because they might be used by the new owner to run someone over, or as a getaway vehicle after a robbery? Or for a car bombing? Is the car salesman morally responsible for any such actions?

The answer is no, because 1) cars have legitimate use cases, 2) the car salesman doesn’t have control over the usage of a sold car, and 3) the car salesman couldn’t have known the new owner would use the car in that way.

Replace “car” with “secure data” and you’ll see the same holds true for SAFE farming. You’re neither morally nor legally responsible for the contents of encrypted chunks of data in your vault. Remember that all current cloud storage providers are in the same situation: anyone can upload all kinds of sick shit to their servers in encrypted form.


#8

@DavidMtl exactly, that would be my answer also. The driver behind me might beat up his wife; the driver in front of me might be drunk and not carrying a driver’s license. Do I oppose the road they are using? Or the people building and maintaining that road? Of course not.


#9

Technology is neutral; people are good or evil (in very simplistic terms), and the points raised here are things we will need to overcome. This is why we need a good crisis-PR strategy in place. I was thinking about these things a few months back and wrote this blog post on the topic.


#10

Another comparable argument is going on around encryption. David Cameron says the government should be able to read everything; some will agree, and those who don’t are likely to get SAFE. Those who prefer to trust authority enough to allow it complete control are less likely to see the benefits of SAFE.


#11

Here is how this will work (hopefully open source, so we can verify we aren’t ratting out people we don’t want to):

The fuzz will do their due diligence to capture some scumbag and coerce them into giving up their SAFE login (willingly, via malware, a keylogger, whatever). Then they will have the hash info for the chunks of the child porn.

There will be subscription lists for all the bad things that exist: child porn, copyright infringement, religious propaganda, anti-religious sentiment, x-rated video, anti-government content.

Either the capability will be built into the MAID client, or patches to the source will be available: altering the “close group” code so it can recognize who is requesting and/or supplying these chunks and report their IP to whoever made the subscription, or simply blacklisting the content and choosing not to store those chunks.
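The blacklist variant could be sketched as a simple hash-set check. This is a hypothetical illustration, not an actual SAFE client API; the blacklist contents and function names are made up:

```python
import hashlib

# Hypothetical subscription list: SHA-256 digests of known-bad chunks,
# published by whoever runs the subscription.
BLACKLIST = {
    hashlib.sha256(b"known-bad-chunk").hexdigest(),
}

def should_store(chunk: bytes, blacklist: set) -> bool:
    """Return False if the chunk's content hash appears on the blacklist."""
    return hashlib.sha256(chunk).hexdigest() not in blacklist

print(should_store(b"known-bad-chunk", BLACKLIST))  # False: refuse the chunk
print(should_store(b"ordinary data", BLACKLIST))    # True: store it
```

Note this only catches chunks whose hashes are already on a list; anything not yet flagged passes through untouched, which is exactly the “until it turns up on a watch list” caveat.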

The good: we can choose not to serve child porn that is known to exist. Until it turns up on a watch list it will still happen underneath us, but hey, we are actually trying.

The bad is pretty big for the whole network: anyone not adhering to the government-run subscription lists (including people who simply don’t know they exist) might be the target of raids or extra surveillance because they are serving known child porn.

Another bad is the moral ambiguity: sure, I would rat out the IPs of people requesting child porn, but I wouldn’t care about copyright infringement. However, just knowing there are watchdog nodes out there reporting people devalues the privacy of the network.


#12

Doing this would effectively make it a different network, so it wouldn’t take off.


#13

It seems you don’t understand how the network works. There are no whole files, only tiny bits of encrypted data split up across the entire globe. Only the person with the key can decrypt the pieces and put them together again; in effect, only the key holder owns the data. Everyone else is just holding meaningless garbage in exchange for a fee.
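For readers unfamiliar with the scheme, here is a toy illustration of the self-encryption idea: each chunk is encrypted with a key derived from a neighbouring chunk’s content, and the resulting list of keys (the “data map”) stays with the owner. The XOR/SHA-256 construction below is only a sketch under those assumptions, not MaidSafe’s actual self_encryption algorithm:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key || counter (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def self_encrypt(data: bytes, chunk_size: int = 8):
    """Split data into chunks; XOR-encrypt each with a key derived from
    the *next* chunk's plaintext. The key list is the 'data map'."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    datamap, encrypted = [], []
    for i, chunk in enumerate(chunks):
        key = hashlib.sha256(chunks[(i + 1) % len(chunks)]).digest()
        datamap.append(key)
        encrypted.append(bytes(a ^ b for a, b in zip(chunk, keystream(key, len(chunk)))))
    return encrypted, datamap

def decrypt(encrypted, datamap):
    """Rebuild the file; impossible without the data map's keys."""
    return b"".join(
        bytes(a ^ b for a, b in zip(c, keystream(k, len(c))))
        for c, k in zip(encrypted, datamap)
    )
```

A vault holds individual entries of `encrypted`, never the `datamap`, which is why its contents are meaningless to the host.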


#14

I think he knows that, but if the child porn is public (or private, while an account holding the data map has been compromised), it is known which chunks are part of the child porn file. Combined with his scenario of compromising the IP obfuscation in the code, it would be possible to log the IP addresses of those hosting those chunks. I don’t think this will happen, but that’s his proposed scenario.


#15

With homomorphic encryption and other mechanisms, it might someday be possible to store data and have an AI exclude or filter it. AI is probably smart enough now to identify what child porn is, once it’s trained.

This problem has been discussed on the forum before. Most people don’t want to store things even worse than child porn, like terrorist plans or anything that could cause loss of life. It’s just that the technology we work with is currently a bit limited in its ability to provide both privacy and security.

Enigma, on the other hand, is promising. I don’t know if I believe the claims yet, but if it does what it says, then it’s the holy grail of security and privacy.

Maybe @dirvine can take some time to study Enigma.

http://enigma.media.mit.edu/

The whitepaper claims we can do data analysis on encrypted data. Would that mean an AI could filter out child porn without knowing who is storing it and without decrypting it?


#16

It doesn’t make it a different network; it is only passively looking at who requests chunks. It’s nothing like a blockchain fork or something that changes how data is stored.

I was pretty sure I understood how the network works. If there is deduplication, then anyone who knows what a file looks like unencrypted can tell which hashed chunks belong to it.
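The deduplication point can be demonstrated in miniature: with convergent (content-derived) keys, identical plaintext always yields the identical stored chunk name, so anyone holding the plaintext can recompute which network chunks belong to it. The hash-based “encryption” below is a stand-in for illustration, not the network’s real scheme:

```python
import hashlib

def chunk_name(plaintext_chunk: bytes) -> str:
    """Convergent naming (illustrative): the key is derived from the
    chunk's own content, so the same plaintext always maps to the same
    stored name. This enables deduplication, and also lets anyone with
    the plaintext recompute the names."""
    key = hashlib.sha256(plaintext_chunk).digest()
    ciphertext = hashlib.sha256(key + plaintext_chunk).digest()  # stand-in for real encryption
    return hashlib.sha256(ciphertext).hexdigest()

# Two independent users uploading the same public file derive identical names:
alice = chunk_name(b"the same public file chunk")
bob = chunk_name(b"the same public file chunk")
print(alice == bob)  # True
```

That recomputability is precisely what a watchdog node would exploit to recognize known content passing through it.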


#17

I think people need to come to terms with what evil is. It isn’t the network, nor a bit of data, nor even a file; even child-porn pics are not themselves evil. What is evil is the abuse of power: when someone does something evil like this to a child, that is the evil, and the pictures are just evidence of it. We do not make the world a better place by trying to eradicate the evidence; we make it better by stopping the abuse from happening in the first place.

People need to come to terms with this, because the SAFE network isn’t going to discriminate based on data. Sure, we can choose different search and categorization methods so we “don’t see it” (I’ve written about that on this forum before), but whether you choose to see the evidence of evil or not, it will be there. So stop trying to find ways to delete it; you’re putting good energy to waste. Instead, seek ways to educate people to understand what evil is and how to stop it: empathy for others, and generally how to identify sociopathy and get treatment for sociopaths. Humanity will evolve, and the sooner we can focus our efforts toward that end, the less pain there will be for everyone.


#18

It makes me think: the simple fact that someone might be recording the GET requests you make, even though it’s statistically improbable, might make people very self-conscious about accessing publicly available “wrong” content. Of course they can use a VPN to hide their IPs, but that’s a big extra step, and they can do that now anyway.

I can’t see how the network could do anything about it, nor am I convinced it should try to.


#19

The network exists in such a manner that no usable data exists anywhere except on the client machine, once the files are retrieved and decrypted. You are not storing “child porn”, because the 1s and 0s stored on your hard drive are not child porn. They are just random ones and zeros that have zero meaning without two other files that you are not storing and have no access to.

We need to look at the MAID system as infrastructure more than storage. Files never exist except on clients’ machines; MAID just holds the data required to re-create them.


#20

Sorry, I misunderstood you at first. I thought you had in mind adapting the communication protocol to include the IP addresses of end points, or something like that. But you mean client managers monitoring and matching the GET requests of the client, right?

Have extra nodes (hops) sit between a client and the client managers, onion-routing style. The personas of these nodes would just pass along encrypted messages from the client that can only be decrypted by the client managers (encrypted using the client managers’ public keys). This would be a layer dedicated solely to IP obfuscation, and its use could be optional for the client (since it adds latency).
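The layering idea above might look like this toy sketch, where the client wraps its request once per hop so each relay removes only its own layer and learns nothing else. Symmetric XOR layers with made-up hop keys stand in for the real public-key encryption:

```python
import hashlib

def stream(key: bytes, n: int) -> bytes:
    """Toy keystream: hash key || counter until n bytes are produced."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Apply (or remove) one encryption layer."""
    return bytes(a ^ b for a, b in zip(data, stream(key, len(data))))

# Hypothetical keys for two relay hops plus the client manager.
hop_keys = [b"hop-1-key", b"hop-2-key", b"client-manager-key"]
message = b"GET chunk"

# Client wraps the message once per hop, innermost layer first.
wrapped = message
for key in reversed(hop_keys):
    wrapped = xor_layer(wrapped, key)

# Each node in turn strips its own layer; only the last sees the request.
for key in hop_keys:
    wrapped = xor_layer(wrapped, key)

print(wrapped == message)  # True
```

In a real design each relay would hold only its own key pair, so no single hop could both see the client’s IP and read the request.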

I think it should do something about this, because the chilling effect you describe may also affect those accessing politically sensitive data. If China can manage to employ two million people actively censoring the internet, they may also manage to infiltrate the network, monitoring particular GETs at a rate just scary enough to make you paranoid.