SAFESearch - Search Engine

Yes, but Linux over Windows in this case. But I do see. Try to replace unacceptable human government with human-made code, choose the self-modifying variety, and end up with an increasingly unintelligible universe with an increasing number of physical laws that are increasingly unintelligible. Chocolate ice cream becomes impossible but vanilla still works.

Great, exciting stuff. Yet no answer to my simple question:

is anybody actively developing a search engine for SAFE?

Read this thread:

Hope that helps!

A direct answer to your question is: possibly. No official announcements so far, IIRC (if I recall correctly). :slightly_smiling:

I’ll go ahead and say a search engine exists as a brain-child of David’s and will be implemented if necessary:


This idea that one of David's ideas is a fallback strikes me as odd. Maybe it's just not as strong an interest for him?

His initial response actually explains that pretty well:


A comment from David regarding development of search in SAFE:

Search, though, from a decentralised perspective can make use of Lucene-type indexing, with the indexes using Immutable Data. This has to be arranged in a manner where it can append information as more links and data are found. As data can be referenced via a datamap hash, the end data will always be found (like a Wayback Machine built into the network).

So the research part, if you like, will be fitting this into a type of binary tree / directed graph where leaves can be updated to reflect new information that fits that part of the object (graph).

So very possible, and a few whiteboard sessions could produce a crude first attempt at this. If done correctly, and without doubt in the open, then I think we can have a decent, albeit crude, search.

More interesting, though, will be the use of deep learning algorithms to provide more than search, so more like Wolfram-type question engines with larger data sets. I spoke briefly with Wolfram, and I am sure there is a very long conversation to be had there, with the new data structures available in SAFE as well as their capabilities in domain-specific languages in this field.
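The append-only index David describes can be sketched in miniature. Everything below is a hypothetical illustration, not the actual SAFE API: an in-memory content-addressed store stands in for Immutable Data, and each term keeps a single mutable leaf pointer into a chain of immutable posting chunks, so new documents are appended without rewriting old data:

```python
import hashlib
import json


class ImmutableStore:
    """Toy content-addressed store: data is keyed by its own hash,
    mimicking immutable, datamap-hash-addressed chunks (hypothetical)."""
    def __init__(self):
        self._chunks = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self._chunks[key] = data
        return key

    def get(self, key: str) -> bytes:
        return self._chunks[key]


class AppendOnlyIndex:
    """Lucene-style inverted index whose posting lists live in immutable
    chunks. Each term has one mutable leaf pointer; appending writes a new
    chunk that links back to the previous one, forming a simple DAG."""
    def __init__(self, store: ImmutableStore):
        self.store = store
        self.leaves = {}  # term -> hash of the latest posting chunk

    def append(self, term: str, doc_ref: str):
        chunk = {"doc": doc_ref, "prev": self.leaves.get(term)}
        self.leaves[term] = self.store.put(json.dumps(chunk).encode())

    def postings(self, term: str):
        # Walk the chain from the leaf backwards: newest postings first.
        key = self.leaves.get(term)
        while key is not None:
            chunk = json.loads(self.store.get(key))
            yield chunk["doc"]
            key = chunk["prev"]


store = ImmutableStore()
index = AppendOnlyIndex(store)
index.append("safe", "safe://site-a")
index.append("safe", "safe://site-b")
print(list(index.postings("safe")))  # newest first
```

Because every chunk is addressed by its hash and never rewritten, old snapshots of the index remain reachable, which is roughly the "Wayback Machine built in" property mentioned above.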


I wonder, given 'human' indexing and the problem of Proof of Unique Human, whether spam could be alleviated by the first (or maybe every) rating incurring a fee in Safecoin? That would hopefully prevent mass spam scripts if the cost was high enough.

Given that, the problem would be finding the sweet spot with the search engine; though I imagine if it is human-powered and spam-free, then it would be worth the initial outlay for users who value private search etc.

This cost could be redistributed to the search site's indexer team… Mo rep, mo money. Take some cut for the dev (an open, transparent amount). Boom? Everyone wins.

Ha. I sense I’m being too idyllic here.

Incurring a fee? Not sure I understand. Can you explain?

Pay to submit a link to the engine. If your reviews are good and valid, you'd earn that back over time, and perhaps even make money off it. But it would stop people from creating accounts just to 'review' their own spam sites.

Ahh, OK, I see what you're saying, but I doubt it would work. People have to buy domains and hosting currently, and that doesn't stop them from spamming. Also, on SAFE they will pay to PUT content anyway.

Those are both one-time costs.

Right now there is no cost associated with spamming Amazon reviews, for example. So this would be some kind of per-review cost, which could decrease over time for normal users. It's kind of a 'proof of work' type thing, to prevent automated bot submissions etc.
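A per-review fee that "decreases over time for normal users" could be as simple as a fee schedule keyed to a reviewer's track record. All numbers and names below are made up for illustration; nothing here is specified anywhere for SAFE:

```python
def review_fee(base_fee: float, accepted_reviews: int, decay: float = 0.5) -> float:
    """Per-review fee in (hypothetical) Safecoin units.

    The fee halves (by default) for every 5 reviews the community has
    accepted from this account, so an established reviewer pays close to
    nothing while a fresh account (or a bot farm) pays the full
    deterrent price for every submission.
    """
    return base_fee * decay ** (accepted_reviews // 5)


print(review_fee(1.0, 0))    # new account pays the full fee
print(review_fee(1.0, 10))   # trusted reviewer pays a quarter
```

The deterrent works like proof of work in the economic sense: the cost is negligible per genuine review but multiplies across the thousands of submissions a spam script would need.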

I am not sure there is a spam problem to solve here. Search engine bots crawl over everything they can find. I suspect this is because the search engine would rather have as much data as possible; it is in their interest.

Maybe you could pay for an increased crawl rate, but that would start to shape the quality of data towards deeper pockets. This may put people off using the search due to bias.

Wouldn’t a traditional crawl bot be decent enough on safe net anyway? Maybe there is a better way with the new technology available, but the current approaches would seem fairly transferable.

Edit: saying that, reviews of links/sites by users could be good. These could have a cost and would improve ranking potentially. Too much to spam, too little to worry about for genuine reviews.


Hmm. Yeah, a crawler is definitely useful. But refining indexing and so forth is a big task, one that requires money and/or a lot of time and infrastructure (and in the end, I guess: ads / Google / black-box-company-type shenanigans).

To clarify my thinking regarding ‘spam’ here:

Right now there is a problem with marketing content versus content creation. Or: SEO sites with not-so-useful copy. Or stolen copy. Scraped. Etc.

This 'spam' would ideally be discouraged in a system that uses user reviews of domains… (Stack Overflow style, perhaps).

It's a problem, as noted over in the PtP threads re: piracy etc. How do we find the original creator? Who does that, and why would they bother vetting these sites?

Some system of user approval, backed by a crawler as you note, might be able to achieve this… And this could ideally be reflected in search rankings.

An upfront cost for submission/rating discourages automated gaming of the system (it would have to, or it's useless).

This cost could be rerouted to the community: a split between devs and users. If you're amazing at rating, categorising etc., you'd get a bigger share. If that's reinforced by other users, you'd get a bigger share (a reputation system), ideally over and above any submission cost.
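The split described here, a transparent dev cut plus a pro-rata share by reputation, is easy to sketch. All names, percentages, and the reputation scale below are invented for illustration only:

```python
def distribute_pool(pool: float, reputations: dict, dev_cut: float = 0.1) -> dict:
    """Split a pool of collected submission fees.

    A fixed, openly published fraction goes to the devs; the remainder
    is shared among reviewers in proportion to their reputation score.
    Everything here (keys, percentages) is hypothetical.
    """
    payouts = {"dev": pool * dev_cut}
    remainder = pool - payouts["dev"]
    total_rep = sum(reputations.values())
    for user, rep in reputations.items():
        payouts[user] = remainder * rep / total_rep
    return payouts


print(distribute_pool(100.0, {"alice": 3.0, "bob": 1.0}))
```

With a 10% dev cut, a reviewer with three times the reputation earns three times the payout from the remaining 90%, which is the "mo rep, mo money" incentive in its simplest form.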

The search would be weighted accordingly. Ad free. Cost free for visitors (outside of GET rewards, also routed to the community pool).

This would be manual labour, but that's kind of the joy of it. Too much of the internet today is low-quality SEO/clickbait crap (IMO), made to get views in order to get ads. And the 'quality' of that is determined by Google.

This sort of setup could provide relevant, interesting content, with no central, unaccountable, profiteering corporation controlling what gets to the top (and with everything open-sourced).

I'd love to be involved in that sort of SAFE search engine setup. I'm really for something ad-free and community-driven.


I like that idea! It's not really possible without micropayments though, as it would either get spammed (with no fees) or go unused (with high fees).

Having people give context and reviews of sites would be very useful for searchers!


Has there been any talk of buying a search engine for the SAFEspace? I bet it could be a very interesting thing, becoming the first Google of SAFE!

I really like our GUGL :smiley: :smiley:


Not on buying one… but we’re talking about maybe building one.

I'd be down for giving it a go. I'm a frontend dev, and I could foresee making a simple setup for the current SAFE sites, for example, or even most of the early 'safe' sites.

But larger scaling and data storage types on SAFE… that would take some research. Not insurmountable though.


There is nothing stopping google installing the safe net proxy in front of its crawlers. Google could enable indexing on safe net within days if they wanted to.

Moreover, they could integrate what has been discussed here quickly too. They wouldn't even need to limit it to SAFE net URLs; they could just use Safecoin for micropayments and identity confirmation (if required/offered).


Thanks for the reply. Yes, I meant building; stupid autocorrect! :)