SAFE URL: "safe://" cross browser support revisited!

The problem is clashing with existing TLDs (if “.safe” isn’t allocated, it seems likely to go, as it’s such a nice property in this climate; but didn’t someone say it was allocated?), or choosing something that subsequently gets allocated. Even “.safenet” risks that, I think.

Choosing something silly like “.onion” is some protection, but can we think up anything that we’d like to use? And it still risks attack by this method.

IMO this is another argument for using a real website address, because it would be much cheaper to own than a TLD - which also means we could easily use many rather than just one. So, maybe a :+1: for “mirrors”.


If we use SAFE: then we can use any TLD without a technical clash.


I would also add that SAFEnet is IMO a new internet and thus we need not be concerned with compatibility with the old … further and more importantly, those with existing brands “domain.tld” would be able to use their same brand with SAFE: — that’s a huge plus.


Yes, making a whole new internet protocol “safe:” instead of “http:” is very important, and that’s the project I thought I was backing.


I think the point is to provide a way for existing browsers to access the SAFE protocol.

This is not defining the SAFE protocol but providing a bridge for existing browsers to access the protocol. A way to tell the browser plugin that this old style URL is to be translated into the safe protocol.

If a browser refuses to accept a new protocol method, then an alternative is needed. So we need a way to bridge the browser to the SAFE protocol.


This is true.

When the Network is not relying on FF, IE, Chrome, etc., there will be absolutely no reason to use safe: or any TLDs.

This is just because Chrome wanted to piss us off, and they’re doing quite a good job of it.


Read the fourth post in this thread. This problem has been addressed. “SAFE:” works.


Not according to the devs.

Which I argued as well (in the thread that this was split from). I would request that they revisit the matter in light of further Google search results.

Or else explain their reasoning if there’s a gaping hole that we both overlooked @TylerAbeoJordan


Maybe @dirvine would weigh in? Or whoever is ‘in charge’ of this. It would be nice to have a full discussion and resolution of this problem/issue, i.e.
[web-server + custom protocol handler (SAFE:)] --> see post four of this thread.
[plugin + fixed tld (.safenet)]
[custom browser]


Given the 19th of Jan dev update, it seems @Krishna_Kumar is integrating the proxy server idea, which, I believe, will intercept all requests using a particular TLD (e.g. .safenet).

I suppose then that if we want to use a “SAFE:” protocol handler, we may have to build this separately with a separate web-server system … or our own customized browser.
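To make the interception idea above concrete, here is a minimal sketch of how a local proxy could separate SAFE traffic from the rest by looking at the requested hostname. Everything here is an assumption for illustration (the `.safenet` TLD comes from this discussion; the handler and `classify` helper are hypothetical, not MaidSafe’s implementation):

```python
# Sketch: a local HTTP proxy that picks out ".safenet" requests.
# The SAFE_TLD value and routing behaviour are assumptions for
# illustration, not the actual MaidSafe proxy.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlsplit

SAFE_TLD = ".safenet"  # hypothetical fixed TLD from the discussion


def classify(url: str) -> bool:
    """Return True if the URL's host falls under the SAFE TLD."""
    host = urlsplit(url).hostname or ""
    return host.endswith(SAFE_TLD)


class SafeProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A browser configured to use a proxy sends the full URL.
        if classify(self.path):
            # A real proxy would forward this to the SAFE launcher here.
            body = b"SAFE request intercepted"
            self.send_response(200)
        else:
            # Non-SAFE traffic could be tunnelled onward or refused.
            body = b"Not a SAFE URL"
            self.send_response(502)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    print(classify("http://blog.mypage.safenet/index.html"))  # True
    print(classify("http://example.com/"))                    # False
```

The nice property of this approach is that it needs no browser modification at all, only a proxy setting, which is why it works across browsers.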


If we are focusing on a new browser, then we need to build an HTML/CSS parser, and then a JavaScript engine or a new scripting engine. I found a blog that walks through building a new toy browser. It is a toy browser, but it could lead to something better.

Or… why not just use Servo? It has Android support. Servo supports https:// and file://; no doubt safe:// and ipfs:// could work as well.

Or we could take the Redox approach: everything is a URL.


+1 for servo. That would be freakin’ sweet.

Still not-so-privately pissed about the integration. Bold move Maidsafe (the company)…bold move.


Sorry for the delay in replying; I wasn’t able to follow the forum yesterday.

For sure. I will sum up my findings here.

If the local config file is edited as described in the post, this makes Chrome look for an external application to handle the URL, which means that the Launcher should register itself as the handler for the SAFE: scheme. This stops Chrome from throwing an error and makes it pass the URL to the OS to handle.

So, for discussion, let’s assume that the Launcher registers for the SAFE scheme with the OS.
Now, when the user keys in a SAFE: URL, it will be passed to the registered application. The Launcher gets the request and fetches the data for the URL from the SAFE Network, but how will the response be passed back to the browser?

Other browsers/applications could also start sending requests through the registered handler, making it even harder to determine the origin of a request and to reply to it. (Let’s ignore this for now.)

Assume that we can pass the data back through an extension/addon which is already installed. We can use Chrome’s native messaging API to exchange data between the Launcher and the addon. At this point we are not sure which tab the response message should be sent to. Even if we find a means to send the data to the correct tab, we will be forced to clean the DOM and render the received response (consider it plain HTML for simplicity). The HTML can have linked resources, such as CSS and JavaScript; even these can be fetched and injected into the DOM. But where there are inline references to images, we must be able to identify them and serve them ourselves. We won’t be able to support relative path referencing.
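For readers unfamiliar with native messaging: the framing Chrome documents is a 32-bit message length in native byte order followed by that many bytes of JSON, exchanged over the host process’s stdin/stdout. A small sketch of what the Launcher-side host would speak (the message fields `"url"` and `"html"` are made up for illustration; only the framing itself is Chrome’s documented format):

```python
# Sketch of the Chrome native-messaging wire format a launcher-side
# host would use with the addon. Framing (32-bit native-order length
# prefix + JSON) is Chrome's documented protocol; the message fields
# are hypothetical.
import io
import json
import struct


def read_message(stream) -> dict:
    """Read one length-prefixed JSON message (as sent by the addon)."""
    raw_len = stream.read(4)
    if len(raw_len) < 4:
        raise EOFError("addon closed the pipe")
    (msg_len,) = struct.unpack("=I", raw_len)  # native byte order
    return json.loads(stream.read(msg_len).decode("utf-8"))


def write_message(stream, message: dict) -> None:
    """Frame and send one JSON message back to the addon."""
    encoded = json.dumps(message).encode("utf-8")
    stream.write(struct.pack("=I", len(encoded)))
    stream.write(encoded)
    stream.flush()


# Round-trip demo through an in-memory pipe instead of stdin/stdout:
buf = io.BytesIO()
write_message(buf, {"url": "safe://example", "html": "<h1>hello</h1>"})
buf.seek(0)
print(read_message(buf))  # {'url': 'safe://example', 'html': '<h1>hello</h1>'}
```

Note that this only solves transport between Launcher and addon; it does nothing for the tab-targeting and resource-injection problems described above.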

Moreover, since the data (the main HTML) is injected via scripts from the addon, features like local storage won’t work across applications. This happens because the browser doesn’t consider this a page load, and the hostname would be blank or that of the previous site from which the SAFE URL was invoked.

Summing up a few of the ambiguities:

  • How will the launcher identify the source of the request (this is just an invocation request)?
  • How will the launcher serve the data back to the actual origin?
  • How will the resources be handled?
  • Can we provide the basic features that normal web does?

A similar problem exists with Safari.

I’m not sure what plugin means in this context. If it is an NPAPI plugin, Chrome doesn’t support those anymore.

Any suggestions for alternate TLDs? I know you are not for the TLD approach :wink:. But how about a fixed hostname, something like http://safenet/mypage.html or http://safenet/blog.mypage.html? This definitely hampers the readability of the URLs, IMO. @happybeing also suggested having a real domain, to help in case users don’t have the SAFE Network setup yet.
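The fixed-hostname idea amounts to a rewriting rule: anything addressed to the reserved host gets translated into a SAFE request. A hedged sketch, where the mapping from path to SAFE URL is my own assumption (the post above doesn’t specify one):

```python
# Sketch of the fixed-hostname idea: rewrite "http://safenet/<name>"
# into a "safe:" URL. The reserved hostname and the path-to-URL
# mapping are assumptions for illustration only.
from urllib.parse import urlsplit

FIXED_HOST = "safenet"  # hypothetical reserved hostname


def to_safe_url(url: str):
    """Return the rewritten safe: URL, or None if it isn't a SAFE request."""
    parts = urlsplit(url)
    if parts.hostname != FIXED_HOST:
        return None  # ordinary web traffic; leave it alone
    return "safe://" + parts.path.lstrip("/")


print(to_safe_url("http://safenet/blog.mypage.html"))  # safe://blog.mypage.html
print(to_safe_url("http://example.com/page.html"))     # None
```

The readability cost mentioned above is visible here: the site name ends up buried in the path instead of in the hostname where users expect it.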

I have not tried magnet links, but I assume they work on the same basis. And I can see that it is not consistent on Chrome, probably for various reasons (lack of permission to edit the Chrome Local State file, for one).

It is good that I have finally summed it up, because it will be easier for you guys to help us with your experienced insights.



I understand why the TLD approach is being chosen, but I’m not happy about it at all. I’m mostly afraid we’ll forever be stuck with it as TLD links are used and stored everywhere in SAFE apps and people’s minds.

Why not just bundle a rebranded Firefox (including required plugins) with the installer? If people are adventurous enough to switch from their current solutions to SAFE, using a different browser to visit SAFE web content isn’t that big an ask. Especially if it’s included in the installation.

A single dedicated browser has many security benefits as well. Using the same browser for both the regular web and the SAFE web practically guarantees cross-contamination and security issues. Simple example: JavaScript code in a SAFE website that creates a direct connection to a regular server, de-anonymising the visitor. This can be solved easily in a dedicated browser (disable any communication other than with the SAFE launcher service). In a regular browser, especially restricted ones, this may not be solvable at all.
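The restriction described above is essentially a connection allowlist. A toy sketch of the policy check a dedicated browser could apply before opening any socket (the launcher endpoint here is a made-up assumption):

```python
# Toy sketch of the "dedicated browser" restriction: permit outgoing
# connections only to the local SAFE launcher service, drop all else.
# The endpoint set (host/port) is a hypothetical assumption.
LAUNCHER_ENDPOINTS = {("127.0.0.1", 8100), ("localhost", 8100)}


def allow_connection(host: str, port: int) -> bool:
    """A dedicated browser would refuse anything not aimed at the launcher."""
    return (host, port) in LAUNCHER_ENDPOINTS


print(allow_connection("127.0.0.1", 8100))           # True
print(allow_connection("tracker.example.com", 443))  # False: the de-anonymising call-out is blocked
```

In a general-purpose browser there is no hook to enforce such a rule for every request an extension or page script can make, which is the crux of the concern.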

I fear the virtually guaranteed security holes with non-specialised browsers are going to hamper SAFE’s adoption a lot more than the non-optional use of a dedicated browser.


Brilliant. Simply brilliant. I much appreciate the insight and thank you for the time that you took to respond to us.


My view of the matter is that this will only be used for backwards compatibility with existing clearnet browsers.

It is unfortunate, but necessary for mass adoption - I do not share your particular fear there.

I am much more interested in sandboxed native applications. Or at least a sandbox that can accept plugins for all types of display purposes.

Do one thing and do it well
– Unix Philosophy


That seems to be the general opinion here, but I haven’t seen convincing arguments for that.


I’m 1,000,000,000% for this.

We shouldn’t falter in the face of Google chrome etc. We should stand firm and proud in our stance as a new, secure, independent internet protocol.


This isn’t really about not being proud of our protocol; I don’t think anyone could accuse anyone in the community of that :). The way forward that @Krishna_Kumar proposes is really about having a solution that works quickly, is maintainable, and works across multiple browsers. In my view we have to make switching to SAFE as easy and painless as possible, and IMO a dedicated browser is not the best way to achieve this.


@Krishna_Kumar thank you very much indeed for this. I know you are very busy so I really appreciate it and think it is very valuable for the project that we have such an approachable and responsive development team. It warms my heart. :slightly_smiling: