A little progress was being made on this before the closure of the preceding thread. I strongly believe in SAFE’s current position on the matter. There are ways to curate content for new users without implementing network-wide censorship. I agree public perception could be negatively skewed if one were to jump on SAFE and be bombarded with illegal pornography. The previous idea was to present new users with a recommended list of apps that are known to filter their content, plus a pop-up message that informs the user when they are using an app that is not known to censor its content/results. This makes it clear to users that while illegal material can potentially be stored on the network, one must actively seek it out to find it. That provision could be useful in future arguments about SAFE’s legitimacy. I call on all with ideas relating to this subject to, with laser focus, create and refine an idea for content curation that still maintains the freedom of SAFE’s protocol. The use of this tool/solution should, IMO, at all times remain optional to the user. With that, I pass it to you…
The official app could have something like that.
And there was a (unnecessarily long) discussion about that (see below).
Edit: I don’t want to be too picky, but I would have named the topic “Ideas for curation of legal content”. There is no real requirement for the curation of “illegal” porn (or any other porn, or other “illegal” content).
I assume you are trying to establish how to curate “legal” (according to laws valid in Scotland) content rather than help pedophiles to effectively curate their multimedia libraries and expand their outreach activities?
I wrote “unnecessarily long” because, just like the other one, people like to keep discussing non-issues (which I mentioned in comments under both of those topics).
- as if the Foundation could somehow ever put in the app anything other than links to “legal” (in Scotland) content.
- as if the Foundation could somehow censor any content (theoretically they could, but that would cause instant collapse of the network)
So we already know that the official apps, base code, endorsements, recommendations, blah, blah, blah will never contain anything “illegal”, and that it’s very unlikely that any content can be censored without causing instant collapse of the network.
But for what it’s worth, here’s that other topic.
I agree; we are not going to find ways to curate/catalogue illegal material. At least not those of us who are not in favor of it.
And if the actual intent is to point people to valid content and leave the bad stuff hard to find, then this topic is of no use.
Easy: have apps that do search-engine-style curation for good stuff, and open a thread with that topic title. Anyhow, good luck curating for only good stuff; that’s a big job. Or else automate it and watch people complain because it censors sex education and serious medical conditions.
Otherwise this topic will become a centre for those who feel that some bad content will destroy SAFE. Well, news for them: SAFE is a REAL NETWORK, not some social network trying to outdo scambook.
Another attempt to clarify the situation:
- It seems to me the optimal solution for the Foundation is not to create the official directory (cost, risks, etc.)
- That leaves “the community” as the only realistic choice for planned curation (for other possible choices of directory maintainers, see my comment here).
How to do it?
- Community members could start a project
- The Foundation could potentially include the project in its communications (Web, s/w, extensions, etc.)
Who’s going to pay for that?
- The Foundation could (I think there’s a fraction of proceeds that could be channeled to such effort). There may be legal risks in case “inappropriate shit” makes it to the directory.
- The community project could finance itself from earned revenue (generated by visitors, if the idea that “public” content will be rewarded) and donations. This community project could reward the submitters of good and correct entries based on its own rules. The same approach could be used to create a blacklist (or several) for self-censoring.
- If the Foundation sponsors or rewards this particular community directory, there will be questions why this, and not some other community directory. I would be in favor of the Foundation listing and paying for none, to make things simple.
- Risks for the maintainers. I register a cute site called cartoons.safenet (let’s ignore the exact format of SAFE domain names for now) and, after it makes it to the directory, I change the content. It can be safely stated that the complexity of the job probably means this project cannot be profitable in the short to medium term, and that the stated risks mean only anonymous curators should try to contribute in the short term.
My own conclusions (feel free to share yours):
- in 2016 “main” directories (in the sense used by @LuckyBit in the topic I linked above) will be started and maintained by anonymous users (individuals, organizations, businesses).
- in 2016 there will be no official or official community directories (maybe there will be “highlighted projects” or some short, watered down lists of most interesting “legal” sites, but that’s a far cry from “main directory”).
- The only realistic choice for non-anonymous community members is a donations- or traffic revenue-sponsored content (okay, let’s not lie - also ad-sponsored, @Warren) blacklist directory, where there is no major legal risk for the contributors if one misses 2-3 % of “illegal” sites and people call the black list ineffective.
@Warren, this is your chance to get creative: how can these teams of selfless volunteers (“community organizers”) avoid falling into the trap of creating a sponsorship-based “community” directory (and ultimately a sponsorship-based SAFE Network directory startup, Safoo!, once they figure out they’re sitting on a goldmine)? Is there a way to put in a practically unlimited amount of work on the creation of a free directory?
I would say instead just have some sort of agent which the user programs with their own preferences. It should be user preferences which are the guide and not “legal” and “illegal” because there are no global laws and people in different countries might have different circumstances. People in the middle of a civil war might have a legitimate reason to be involved with illegal activities and might want to set their preferences accordingly.
What we don’t want is for people to not have the ability to set their preferences at all, or to have a SAFE Network where external preferences from people who aren’t users somehow determines the curation within SAFE Network. As long as SAFE Network respects users individual preferences, as well as users aggregate preferences, then it can work.
Humans can join networks where humans curate collaboratively. The point is you need to aggregate by preferences so that humans who have certain preferences aren’t grouped up with their polar opposites.
Collaborative filtering works really well, as does reviews, ratings, scoring, etc. If you look at shopping lists then people who don’t want high sodium could be in the low sodium group and the algorithm could pair them together based on that preference.
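The pairing idea above (grouping users by shared preferences, e.g. the low-sodium shoppers) can be sketched with a tiny user-based collaborative filter. Everything here is a hypothetical illustration — the user names, items, and ratings are made up, and a real system would need to be far more robust against dishonest voting:

```python
# A minimal sketch of preference-based collaborative filtering.
# All names (users, items, ratings) are hypothetical illustrations.
from math import sqrt

ratings = {
    "alice": {"low_sodium_soup": 5, "salted_chips": 1},
    "bob":   {"low_sodium_soup": 4, "salted_chips": 2},
    "carol": {"low_sodium_soup": 1, "salted_chips": 5},
}

def similarity(a, b):
    """Cosine similarity over the items both users rated."""
    common = set(ratings[a]) & set(ratings[b])
    if not common:
        return 0.0
    dot = sum(ratings[a][i] * ratings[b][i] for i in common)
    na = sqrt(sum(ratings[a][i] ** 2 for i in common))
    nb = sqrt(sum(ratings[b][i] ** 2 for i in common))
    return dot / (na * nb)

def nearest_neighbour(user):
    """Pair the user with the most similarly-minded other user."""
    others = [u for u in ratings if u != user]
    return max(others, key=lambda u: similarity(user, u))

print(nearest_neighbour("alice"))  # bob shares the low-sodium preference
```

The point is that the grouping falls out of the users’ own stated preferences, not from any external notion of “legal” or “illegal” content.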
I started a topic on this subject some months ago. I think some devs will come up with an app like the one mentioned in the topic. Next to that, I think a lot of sites/apps on SAFE will use some sort of moderation. Building a decentralized YouTube? You might want to give people the opportunity to flag videos. When really sick/illegal stuff gets deleted over and over, there’s no incentive for sickos to try to upload it again. I think that way one can build a version of YouTube that’s as good as the real thing. Or even better.
- I presume you mean delisted, not deleted?
- There won’t be “really” sick or illegal stuff, as @LuckyBit said. Tastes and laws differ. There will only be censored and non-censored directories.
- The sickos can upload and register (cartoon1.safenet, cartoon2.safenet, etc.) automatically. It can be scripted just fine. And if the uploading of “public” content is free (I hope it won’t be, but last time I checked that was still the plan), there’s nothing to stop them from uploading and registering their sites under new random names.
High-level ideas are well understood. Most people get it.
What I would really like to see is someone actually explain how it could really work and address the real questions (such as, how can a non-anonymous person run a site or directory which (if not properly maintained) contains content illegal in his country or place of residence.)
Even in your own topic (specifically, here, a comment which you Like’d, by the way) there are ideas (not even proposals) that contain a number of weaknesses, which makes them impractical and unworkable. For those too lazy to follow this link, a commenter said that these listing apps could have their own reputation/censorship mechanism (so, it’s pretty sure they won’t happen in 2016).
Also in that same comment, there’s this: “I think maintaining such lists as a community is possible, like Wikipedia is maintained.” (your response: “Great idea! Really like that one.”)
I mentioned that under the legal risk for moderators (my second comment, above). In particular:
- How can a “community list” be maintained? How do you let people “post” stuff to your MaidSafe site?
- If you pull content from other people’s sites and share with others, then you’re responsible for the content, which means legal and financial risk for you and the co-owners.
- Bottom line: it’s a full time job, a big investment in time and potentially you can end up seeding links to most illegal videos out there to the entire SAFE user population.
In 2016 the most practical approach will be anonymous, revenue-seeking white-list sites. And if public content is free, the subsidies (paid by the network) will flow into the pocket of advertisers (surprise!).
Got it. Thanks for the catch!
There’s a video app on the SAFE Launcher. I would like to “upload” a video to that app to share with others, one that’s already in my directory (data-atlas to chunks etc.). The app requests a screenshot, which I’ll upload (share the data-atlas with the app). Now I select the file to share with the app. The app is running locally on my computer, so the app opens the file and flips the first bit, so we have a whole new file, with a new data-atlas and new chunks etc. The ownership of this (new) file isn’t in my hands alone now; structured data makes it possible for different parties to own the same piece of data. The app will add a link to the file and screenshot on its GUI (say HTML/JAVA etc.). Someone else opens the app and sees my video at the top. There’s a Flag button which actually sends a PM to the other owner of the file. When he decides to, he can remove the link to the file, or even the file completely, because he has ownership. This could be done in a way that’s not very different from what we have here on the forum.
Same for a safe:videowebsite. That site could allow you to upload anything. Where to upload to? What about a button in HTML/CSS/JAVA that sends the info you provide (title: myvideo.avi, address: hash of the file) as a PM to the owner of the site? He’s able to review your upload and can decide whether to add a link on his page or not. It could even be made in a way where different categories have different moderators etc.
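The submit/review/flag flow described in the two posts above could be modelled roughly like this. The SAFE “PM” and data-atlas mechanics are abstracted away entirely; every name here (Submission, Directory, and so on) is a hypothetical illustration, not an actual SAFE API:

```python
# A minimal sketch of the submit/review/flag workflow described above.
# The SAFE "PM" and data-atlas details are abstracted away; every name
# here is a hypothetical illustration.
from dataclasses import dataclass, field

@dataclass
class Submission:
    title: str
    file_hash: str  # stand-in for the data-atlas address

@dataclass
class Directory:
    inbox: list = field(default_factory=list)   # "PMs" awaiting review
    listed: dict = field(default_factory=dict)  # approved title -> hash

    def submit(self, sub: Submission):
        self.inbox.append(sub)                  # the upload button sends a PM

    def review(self, approve):
        """The moderator drains the inbox, listing only approved items."""
        while self.inbox:
            sub = self.inbox.pop(0)
            if approve(sub):
                self.listed[sub.title] = sub.file_hash

    def flag(self, title):
        """A flag lets the owner delist the link."""
        self.listed.pop(title, None)

d = Directory()
d.submit(Submission("myvideo.avi", "a3f1..."))
d.review(lambda s: not s.title.startswith("spam"))
print(d.listed)  # {'myvideo.avi': 'a3f1...'}
d.flag("myvideo.avi")
print(d.listed)  # {}
```

Note that flagging here only delists the link; whether the underlying file can also be removed depends on the shared-ownership question raised above.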
That’s good, details on how it might work.
- Ownership: is the file under shared ownership? If yes, that makes the moderator who accepts it responsible for possible legal issues (including copyright violations). If he’s anonymous, it doesn’t matter (and fits my expectations of private directories, etc.)
- Cost: you make a copy of the file. Who pays for it?
- Revenue: if your app generates revenue, who earns it?
- Scalability: you take the whitelisting approach. If there’s just one directory (“the” community directory), the moderators (the guys who approve videos) can’t possibly watch them all, so they have to poke around and randomly fast-forward to see whether a video contains something that will cause them trouble (either legally, or in terms of bad feedback; the latter is going to be a constant problem and a source of major workload, similar to people constantly bitching about the content of Wiki articles, but much worse). Even a medium-size site would have to have dozens if not hundreds of moderators, which requires a sophisticated reputation system and complex workflows (such as dealing with complaints from SJWs and the like).
As a simple scenario, let’s assume a single volunteer admin can review about 100 videos a day (2 hours of volunteer work). Let’s also assume there are 1,000 video uploaders with 10 vids a day each (which is very modest). That would require 100 volunteers, and probably another 10 to deal with complaints of all sorts.
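The back-of-envelope numbers in that scenario check out; spelled out (with the same assumed figures, which are of course guesses):

```python
# Checking the back-of-envelope moderation numbers from the scenario above.
# All figures are the assumptions stated in the post, not measurements.
uploaders = 1_000
uploads_each_per_day = 10
reviews_per_mod_per_day = 100      # one volunteer, ~2 hours of work

uploads_per_day = uploaders * uploads_each_per_day        # 10,000 videos/day
mods_needed = uploads_per_day // reviews_per_mod_per_day
print(mods_needed)  # 100 volunteer moderators, before complaint handling
```

And that 100 is just for first-pass review; the extra ~10 for complaints, plus churn among volunteers, only makes the staffing problem worse.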
Do you think such a non-profit moderation system can deal with above issues and appear in 2016?
I don’t really understand why you’re talking about 2016? Wanna fix all problems at once when SAFE goes live? And what about the way this forum is moderated? Mods can go to sleep until there’s a flag, that’s the way a videosite could work as well. YouTube uses this approach.
My guess is that it wouldn’t cost that much to upload a video.
I want to check this one out; it wasn’t really clear to me either. I guess it still has to be decided exactly.
Maybe it shouldn’t be free. There should be a listing fee. Another idea is to make an attention economy like the one @Warren and I discussed last year.
That would work by letting people auction their attention, putting the users in control instead of advertisers. I don’t think we want or need anonymous advertisers in control. If it costs money to have your public content reach people, then you could use something like how Synereo works, so that content has to be either voted up or paid for to reach people.
If a listing isn’t free, maybe you can have quality control and make it spam-resistant. People who have to pay to have their content added to a directory won’t add anything they think will get delisted or otherwise voted off the island.
Another person also has a good idea: letting each individual have their own private list, and then letting people vote on public lists. This would be an approach similar to bookmarks, where everyone has private bookmarks, and if they make them public then anyone can subscribe to them.
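The private-list/public-list idea could look something like this. All the structures and safe:// addresses here are hypothetical illustrations:

```python
# A minimal sketch of the private-list / public-list idea: everyone keeps
# private bookmarks; publishing one lets others subscribe and vote on it.
# All names and addresses are hypothetical illustrations.
from collections import defaultdict

private_lists = {"dana": {"safe://cooking", "safe://music"}}
public_lists = {}                # owner -> published copy of their bookmarks
votes = defaultdict(int)         # owner -> net up-votes on their public list

def publish(owner):
    """Make a snapshot of a private list publicly subscribable."""
    public_lists[owner] = set(private_lists[owner])

def subscribe(owner):
    """A subscriber just reads the published copy."""
    return public_lists[owner]

def upvote(owner):
    votes[owner] += 1

publish("dana")
upvote("dana")
print(sorted(subscribe("dana")))  # ['safe://cooking', 'safe://music']
print(votes["dana"])              # 1
```

The appeal is that curation stays bottom-up: the lists people actually subscribe to and up-vote rise, without anyone running a central directory.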
I am trying to establish whether you agree with me that initially no community driven whitelisting app will be available and that this opens space for private anonymous directory providers.
Thanks, that page has a wealth of good info. It appears that (according to design/plan at the time, May 2015), you could submit resource (link, metadata, etc.) information to directory service in form of a message, so that part would work.
One interesting answer from that link is that the publisher would make revenue only if the content is accessed from his app. I wonder if that is how it’s going to be in v1.0. If that were the case, there could be the following scenarios:
- Public (free) content: publishers publish resource information in directories
- Private (paid) content: publishers promote their application rather than content (because to publish direct links to private content is like helping people to leech your content). Another possibility is to post content with ad overlays that invite people to download your XYZ app for a full version of your content.
But if it’s possible to post links to private content that can be shown (played back, displayed) by a public site, what prevents users from using the same app to expose all your content on the same directory site?
Too centralized. Everything has to be decentralized.
We should avoid ads. Let people mine something, or go with the attention economy. An advertiser-based economy is ultimately coercive and moves the power to control attention away from users and into centralized third parties.
Ads should be the last resort and if they do exist it should still be the users pocketing the money in exchange for viewing the ads otherwise you empower the SAFE Network version of Facebook.
We can’t decide whether directory apps put ads in their apps or not. We can use apps that have no ads (or only non-intrusive ones, etc.), or maybe try to block ads. But it’s almost certain that some form of ads will be present (if nothing else, preferential placement; don’t forget, unlike in the real world, on the SAFE Network there’s no obligation to indicate that some product or service placement is paid, so it will be difficult to tell whether you’re seeing an ad-free page).
Ads will be the first resort of many apps (the proverbial low hanging fruit), but some revenue-sharing models might be possible. If site stats can be audited, then users could get their part for clicking on stuff, etc.
Sorry if this is redundant to parts of the thread above. One could switch off layers of search accuracy, letting in the porn and even the ultimately disincentivized ads. It could be as easy to do as switching off traction control. I guess for me the entire thing is search that leads us back to ourselves and to localization, but it’s all search, not really storage, so the traction-control analogy is quite accurate.
Accurate search seems to be the key. It would likely be based on some sort of statistically-very-hard-to-defeat up-voting, which acts as both a spam filter and a search-accuracy enhancer. Ads are top-down, a coercion of attention, and really a supply-side attempt at demand creation. There is no supply side here, and no need to think in terms of content curation, just a need to empower the end user. Theft or coercion of attention should result in filtration and a charge, as it really damages the network’s only real constituency: the end user. We are all end users. Beyond that, it’s word of mouth, giving away product, or speaking to people who’ve flagged that they’re willing to view product info for pay by the second, but that would be opt-in. There is no way to cut in front of the end user’s line for their own attention and no way to bribe it. The result of accurate search, and of ceasing to take attention hostage over access games, is much better trading relationships and better-quality products that aren’t compromised by ad budgets and manipulations. No more marketectures, no more puffing or hyping. Advertising has nothing to do with honest, useful search; they are opposites.
@janitor and @luckybit, I simply think that despite the current power of Google, accurate search and lists can be done, and in a privacy/anonymity context with honest up-voting (which is absolutely vital), honest search wins out. I don’t think there is “content”, just communication, and we need the medium to be as noise-free as possible. In an open-access environment such lists are ad-free, because the lists themselves are up-voted, and people understand more and more that ad-based stuff is not unbiased, accurate, necessary, or even an ethical use of time, because it tends to be coercive. There is no need for coercive attacks on attention and time, no need for the interruption cost, no need for wealthy panhandling. Ads mean no level playing field, and they often mean dirty incumbents dominate; they are a competition reducer, a reducer of quality and market utility, a centralizer, and a de-localizer. Let us not forget that Google has transitioned into sponsored search, where the highest bidder optimizes the search; that is the predictable result of an ad-sponsorship-based endless slippery slope into ever-increasing uselessness. After a while Google won’t be “search”, it will be “lost”.
The problem of getting products noticed, or pushed, is not actually a problem; it’s a non-issue relative to the prohibitive cost of that type of activity. Quality will rise to the top, and that is good enough. We don’t need the impulse-buy part of the economy, as that is a road to impoverishment and powerlessness: it trains household-budget-killing poor purchase decisions, begs for debt and an economic downward spiral, and is also a wage killer because it has people begging for work against their health.
Yeah, seems clear to me. I think it’s all possible on SAFE: a forum like this one, a personal blog that only you and your friends can see, and even complete dark websites only invited people have access to. At the same time, one could build a safesite with no moderation at all, just like some sort of pastebin.
I think we’ll have it all, like I said above. You want a completely decentralized forum where everybody can post anything, nothing is moderated at all? I think someone will provide a place like that on SAFE. A safe:videodump where everybody can put all the videos they like without any moderation? I guess it will be there, I rather prefer a YouTube-like website where a group of volunteers with a high level of trust are allowed to remove flagged content, if needed.
I’ll just comment on this part. It is very hard to tell when something is advertising (as I mentioned above, on SAFE Network it doesn’t have to be declared, and search results can be “tuned” to promote advertisers).
Honest voting can be hoped for and encouraged, but another issue is how to detect and eliminate dishonest voting (paid up-voting, for example) on the SAFE Network. Without such a mechanism it’s going to be very difficult to get correct search results.
Another issue with the idea is who will pay for the effort required to staff the search engine.
I would suggest a separate topic (or topics) on issues how to detect advertising and so on.
Again, some people have good intentions - maybe a majority - but how does one curate content when (if) there’s no revenue and when there’s a dishonest group of users?