This is a useful and thoughtful article on how to do content moderation in a decentralised environment. (It is by the man who designed the CRDT Tree algorithm implemented by @danda, which will be the basis of a Safe filesystem).
It’s a five-minute read, covering the problem, different approaches and some of their main properties, and concluding with ideas and suggestions for what good moderation should aim for and how it might be achieved. Useful stuff.
He is also interested in any ideas or examples that people know about so he can learn more.
In decentralised social media, I believe that ultimately it should be the users themselves who decide what is acceptable or not. This will have to be through some human process of debate and deliberation, although technical tools and some degree of automation may be able to support the process and make it more efficient. Rather than simplistic censorship resistance, or giving administrators dictatorial powers, we should work towards ethical principles, democratic control, and accountability.
I agree with this, and I think it reinforces what we’ve been talking about here. People can post anything, but the platform/community needs to also provide tools for self-filtering. These can take the form of ratings based on various criteria, and also of groups/circles/web-of-trust.
But the platform/tech itself must remain neutral.
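To make the self-filtering idea concrete, here is a minimal sketch (all names and the rating scheme are hypothetical, not from the article): the platform stores every post, and each user applies their own filter locally, combining per-criterion ratings with a web-of-trust circle.

```python
# Hypothetical sketch of client-side "self-filtering": the platform is
# neutral and stores everything; each user decides locally what to see.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    ratings: dict = field(default_factory=dict)  # criterion -> score in [0, 1]

@dataclass
class UserFilter:
    trusted: set = field(default_factory=set)      # web-of-trust circle
    thresholds: dict = field(default_factory=dict) # criterion -> minimum score

    def accepts(self, post: Post) -> bool:
        # Posts from the trusted circle bypass rating thresholds entirely.
        if post.author in self.trusted:
            return True
        # Otherwise every configured criterion must meet the user's threshold.
        return all(post.ratings.get(c, 0.0) >= t
                   for c, t in self.thresholds.items())

posts = [
    Post("alice", "hello", {"civility": 0.9}),
    Post("mallory", "spam", {"civility": 0.2}),
]
my_filter = UserFilter(trusted={"alice"}, thresholds={"civility": 0.5})
visible = [p.text for p in posts if my_filter.accepts(p)]
# visible == ["hello"]
```

The point of the design is that nothing is removed from the network: the same post set yields different views for users with different circles and thresholds, keeping moderation in the individual's hands.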
clarification: I was mainly agreeing with the first sentence of the quote. I do not agree with Dr. Kleppmann’s notions of democratic control over moderation. I have had my posts removed on various forums by “the majority”, and I think that is a backwards step that only enrages and divides people… i.e., it is a passive-aggressive form of violence that encourages more hate in the world rather than peace, love, and harmony. Personalizable filters do not do that. They are a different model entirely, one that treats the individual as responsible and sovereign.