SAFESearch - Search Engine

Those are both one time costs.

Right now there is no cost associated with spamming Amazon reviews, for example. So this would be some kind of per-review cost. This could decrease over time for normal users. It’s kind of a ‘proof of work’ type thing, to prevent automated bot submissions etc.

I am not sure there is a spam problem to solve here. Search engine bots crawl over everything they can find. I suspect this is because the search engine would rather have as much data as possible; it is in their interest.

Maybe you could pay for an increased crawl rate, but that would start to shape the quality of data towards deeper pockets. This may put people off using the search due to bias.

Wouldn’t a traditional crawl bot be decent enough on safe net anyway? Maybe there is a better way with the new technology available, but the current approaches would seem fairly transferable.

Edit: saying that, reviews of links/sites by users could be good. These could have a cost and would improve ranking potentially. Too much to spam, too little to worry about for genuine reviews.

1 Like

Hmm. Yeah, a crawler is definitely useful. But refining indexing and so forth is a big task. One that requires money and/or a lot of time and infrastructure (and in the end, I guess: ads / Google / black-box-company type shenanigans).

To clarify my thinking regarding ‘spam’ here:

Right now there is a problem with marketing content vs. content creation. Or: SEO sites with not-so-useful copy. Or stolen copy. Scraped. Etc.

This ‘spam’ ideally would be discouraged in a system that uses user reviews of domains… (stack overflow style perhaps).

It’s a problem as noted over in the PtP threads re: piracy etc. How do we find the original creator? Who does that, and why would they bother vetting these sites?

Some system of user approval, backed by a crawler as you note, might be able to achieve this… And this could ideally be reflected in search rankings.


An upfront cost for submission/rating discourages automated gaming of the system (it would have to, or it’s useless).

This cost could be rerouted to the community: a split between devs and users. If you’re amazing at rating/categorizing etc., you’d get a bigger share. If that’s reinforced by other users, then you get a bigger share (a reputation system). Ideally over and above any submission cost.

The search would be weighted accordingly. Ad free. Cost free for visitors (outside of GET rewards, also routed to the community pool).
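To make the idea above concrete, here is a minimal sketch of how a submission-fee pool could be split between devs and raters, weighted by reputation. All the names, the 20% dev cut, and the reputation numbers are hypothetical; real economics would depend on Safecoin mechanics.

```python
def split_pool(pool, raters):
    """Split a submission-fee pool between devs and raters,
    weighting each rater's share by their reputation score."""
    dev_cut = pool * 0.2          # assumed 20% to app devs
    rater_pool = pool - dev_cut
    total_rep = sum(r["reputation"] for r in raters)
    shares = {
        r["name"]: rater_pool * r["reputation"] / total_rep
        for r in raters
    }
    return dev_cut, shares

dev_cut, shares = split_pool(
    100.0,
    [
        {"name": "alice", "reputation": 8},  # ratings often reinforced by others
        {"name": "bob", "reputation": 2},    # newer rater
    ],
)
# alice earns 4x bob's share because her ratings carry
# 4x the reputation weight.
```

The key design choice is that the pool is zero-sum per submission, so rewarding good raters more automatically rewards spammy or lazy raters less.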

This would be manual labour. But that’s kind of the joy of it. Too much of the internet today is low-quality / SEO / clickbait crap (IMO), made to get views in order to get ads. And the ‘quality’ of that is determined by Google.

This sort of setup could provide relevant, interesting content. And there’d be no central, unaccountable, profiteering corporation controlling what gets to the top (with everything open sourced).


I’d love to be involved in that sort of SAFE search engine setup. I’m really for something ad-free and community driven.

2 Likes

I like that idea! Not really possible without micropayments though: with no fee it would get spammed, and with high fees it wouldn’t get used.

Having people give context and reviews of sites would be very useful for searchers!

1 Like

Has there been any talk of buying a search engine for the SAFE space? I bet it could be a very interesting thing, becoming the first Google of safe!

I really like our GUGL :smiley: :smiley:

1 Like

Not on buying one… but we’re talking about maybe building one.

I’d be down for giving it a go. I’m a frontend dev, and I could foresee making a simple setup for the current safe sites, for example. Or even most early ‘safe’ sites.

But larger scaling and data storage types on safe… that’d take some research. Not insurmountable though.

2 Likes

There is nothing stopping google installing the safe net proxy in front of its crawlers. Google could enable indexing on safe net within days if they wanted to.

Moreover, they could integrate what has been discussed here quickly too. They wouldn’t even need to limit it to safe net URLs - they could just use Safecoin for micropayments and identity confirmation (if required/offered).

3 Likes

Thanks for the reply, yes I meant building, stupid corrector :)!

Years ago I bookmarked this search engine because studying it at that time got me interested, but I lost sight of it afterwards …

It’s a DHT-based P2P search engine developed by a team ten years back and still running.
I’ve tried it and it’s actually fun to use, and it could be an example of what safe search could be …

Okay I know, it’s (partly) closed source, but worth a try just for interest…

Faroo Search Engine

Faroo Blog

Faroo FAQ

Faroo Alexa Rank

6 Likes

It is lightning fast.

2 Likes

For those interested in search on SAFEnetwork I’m waking up this topic and want to point out that Paul Frazee is a source of interesting ideas right now (he’s the guy responsible for Beaker Browser that @joshuef forked to make SAFE Browser).

Here’s a short Twitter thread Paul wrote which may be relevant to search; I’m not sure, because I don’t fully understand it. Also check out this blog, as he’s sharing ideas there too :slight_smile:

https://mobile.twitter.com/pfrazee/status/879430133815402497

6 Likes

Interesting indeed.

I’m still vaguely trying to update my POC to MD (time, where is the time!?). If I ever get this out, there’s definitely huge potential for optimising the data structures for search/semantics.

Good jumping off point for me to read more! Thanks @happybeing.

5 Likes

@PaulFrazee can be tagged as well ;-).

5 Likes

Okay, I’m just having a thought here, so bear with me if this is kind of random. But what if we kept any A.I. really simple and let people do the sorting? Look, we have tons of sites that sort links, pictures and videos into groups: Pinterest, YouTube, Pearltrees, Delicious, etc. People love collecting and organizing stuff. People also love sharing stuff. But human beings being what they are, they don’t usually collect clickbait and spam. I mean, I’m sure someone does, but it’s not the norm.

So how about this:

  1. You have different collections of stuff: links, pictures, videos, content in general.
  2. You then create a client-side app that allows people to weight the reputation of each collection. One could also write notes in the app to remind oneself what the group was like and why one weighted it as one did. One can, if one chooses, share one’s notes publicly, but by default they are private.
  3. The app then compares all the weights given by all users for each collection and gives the average. So if you had a group that was given a 9, a 5, a 2, a 10 and a 1, that’s 5 users, so that collection’s average weight would be (9 + 5 + 2 + 10 + 1)/5 = 5.4. If you had another collection rated 9, 4, 2, 5, 2, then the result would be (9 + 4 + 2 + 5 + 2)/5 = 4.4, and depending on whether your reputation system added or subtracted value as you accumulated points, one collection would outrank the other on the public search list.
  4. You could even do something similar with bookmarks for individual links. If a user bookmarks something, it can also be thrown into a personal weighting system. In addition, links that were added to more (and more popular) collections would also be given more value. There would be no need to share who saved which bookmark, just that it had been bookmarked and was worth x weighted value.
  5. We already use hashtagging for keyword searches on social media, so why not incorporate that into the search engine? Yes, we can also use classic search engines for searching the body text and so forth, but why not treat web search more like social media search and, at the same time, make social media searching more accurate?
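The averaging and ranking in steps 3–4 above can be sketched in a few lines. This is a minimal illustration using the two worked examples from step 3; the collection names are made up, and a real system would need per-user identity and the reputation weighting discussed earlier.

```python
def average_weight(ratings):
    """Mean of the weights users assigned to one collection."""
    return sum(ratings) / len(ratings)

# Hypothetical collections with the ratings from the examples above.
collections = {
    "retro-games": [9, 5, 2, 10, 1],  # averages 5.4
    "diy-repairs": [9, 4, 2, 5, 2],   # averages 4.4
}

# Rank collections for the public search list, highest average first.
ranked = sorted(
    collections,
    key=lambda name: average_weight(collections[name]),
    reverse=True,
)
# ranked is ["retro-games", "diy-repairs"]
```

A plain average is the simplest choice; weighting each vote by the voter’s reputation, as suggested above, would just mean replacing `average_weight` with a weighted mean.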
2 Likes

Would a recommended-websites section of the search engine be a good idea? Like how YouTube suggests videos to people?