Multiple ways to build the SAFE Network browser extension!

Could disambiguation work without a centralised directory? Forgive my inability to start thinking distributed! One day, one day… :slight_smile:

EDIT: come to think of it, how are MPIDs turned into a data address in the first place? Coo [excitement at learning… ]

Yep it’s unique, so this would be how it functions right now with our current requirements.

Why? All this is doing is adding another key-value pair to associate the numerical entry with a chosen string. If people followed the same convention and gave the same string for their MPID, they’d have it sorted right from the start. The MPID isn’t just used for websites, by the way; it’s your public ID, essentially the only part of your credentials you disclose to anyone on the network, so things like messaging and contacts-list private shares all require an MPID.
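
To make that key-value idea concrete, here is a minimal sketch (the store, function names and first-come-first-served rule are my own illustration, not the actual SAFE API):

```python
import os

# Illustrative only: a chosen public string associated with a 64-byte MPID.
public_names = {}                    # key-value store: chosen string -> MPID

def claim_name(name: str, mpid: bytes) -> bool:
    """Associate a human-readable string with an MPID, first come first served."""
    if name in public_names:
        return False                 # string already claimed by another MPID
    public_names[name] = mpid
    return True

my_mpid = os.urandom(64)             # stand-in for a real 64-byte MPID
claim_name("david", my_mpid)         # True: "david" now maps to my MPID
claim_name("david", os.urandom(64))  # False: the string is already taken
```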

Personally I’m not a fan of any directory structure at the system level for this mapping. That’s why I don’t favour a DNS at the system level either. If you make it an app-level feature, then you could have an app that creates its own DNS format based on its requirements and say “app-name::something” should forward to some underlying “safe::public-name/something”. That app can then switch the forward to another underlying address as it sees fit (whether someone bought it for safecoin or by another method), while the system keeps itself detached from these intricacies at its lower level.

This way you let DNS be defined by apps however they want, and their users can decide whether it’s doing the right thing and whether they want to use that method.
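
As a rough sketch of that app-level approach (the “app-name::” and “safe::” prefixes come from the post above; the table and function names are my own illustration):

```python
# An app maintains its own forwarding table on top of the system's
# public-name addresses; the system itself never sees this mapping.
forwards = {
    "app-name::blog": "safe::public-name/blog",
    "app-name::shop": "safe::public-name/shop",
}

def resolve(app_url: str) -> str:
    """Translate an app-defined name into the underlying safe:: address."""
    return forwards.get(app_url, app_url)        # pass through if unmapped

def repoint(app_url: str, new_target: str) -> None:
    """The app can switch the forward as it sees fit (e.g. after a sale)."""
    forwards[app_url] = new_target

print(resolve("app-name::blog"))                 # -> safe::public-name/blog
```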

I can try to answer this one from my understanding, at a higher level. An MPID, like any other key in the system, ultimately represents a hash, i.e. a unique location which, when queried for in the network, gets you whatever is at that address based on the type of message being queried for. So from a browser add-on, when you query for someone’s MPID, the system could essentially reply with the DirectoryListing object that corresponds to that person’s public share related to web content. That directory listing would then hold the DataMaps, which in turn hold the chunks for all the pieces of data within it, thereby getting you what you need.
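
Put as a rough data-structure sketch (type and field names here are my guesses at the shapes described above, not MaidSafe’s actual definitions):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class DataMap:
    chunk_ids: List[bytes]            # addresses of the encrypted chunks

@dataclass
class DirectoryListing:
    files: Dict[str, DataMap]         # file name -> its DataMap

def fetch_file(listing: DirectoryListing, name: str,
               get_chunk: Callable[[bytes], bytes]) -> bytes:
    """Reassemble a file by fetching every chunk its DataMap points to."""
    data_map = listing.files[name]
    return b"".join(get_chunk(cid) for cid in data_map.chunk_ids)
```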

3 Likes

The MPID address is the hash of a name, so Hash(David) will contain the public key of (hopefully me) David; it is first come, first served at the moment. There are options, but this one seems to mean whoever comes first is served.

It may be better to let folk use any name, though pass their MPID name along with it, where the MPID name is a 64-byte number that is unique to them. The latter way is more secure and allows duplicate names, as per Facebook etc. It is harder to find folk, though, as they must have passed their ID somehow, or you have been given it by somebody else.
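
A sketch of the two schemes described above (SHA-512 is my assumption for the hash, chosen only because it conveniently yields 64 bytes; the real network may differ):

```python
import hashlib, os

registry = {}                               # MPID address -> public key

def mpid_address(name: str) -> bytes:
    """Scheme 1: the MPID address is simply Hash(name)."""
    return hashlib.sha512(name.encode()).digest()   # hash choice assumed

def register(name: str, public_key: bytes) -> bool:
    """First come, first served: the first key stored at Hash(name) wins."""
    addr = mpid_address(name)
    if addr in registry:
        return False
    registry[addr] = public_key
    return True

# Scheme 2: any display name, passed alongside a unique 64-byte MPID,
# so duplicate display names are fine but you must be handed the ID.
contact = {"display_name": "David", "mpid": os.urandom(64)}
```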

4 Likes

Anything is possible, but here are some questions.

Which search engine should it be submitted to? I’d assume there would be multiple competing search engines.

After a site was submitted to a search engine, would it have to resubmit if a new search engine was created? How would it know?

I liked your idea of making submissions of public webpages automatic; I just don’t know how to go about making it work long term when things change, as they always do.

1 Like

I like this approach. De-anonymising oneself and making oneself findable should require effort on the part of the user. It can be made easier by apps, etc., but it shouldn’t be the default or something that can be done “accidentally”.

2 Likes

Mmmm, I think this would defeat URL readability, as each URL ends up with the 64 bit number where a domain is currently used. So…

Everyone would need an app/browser to translate readable names to 64-bit numeric MPIDs, and this mapping is either defined by the user or comes from some reference DNS-like directory. Lazy users would be bound to lap up the latter, re-creating the DNS centralisation risk that SAFE only just solved! Or have I misunderstood?

Quick correction: it’s a 64-byte string. That would be up to a 155-digit decimal number, or, best case for a human-readable format, an 88-character base64-encoded string. But yes… not great as a URL :smiley:
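
A quick check of those sizes (Python used here only to show the arithmetic):

```python
import base64, math, os

key = os.urandom(64)                        # a 64-byte identifier

bits = 64 * 8                               # 512 bits
digits = math.floor(bits * math.log10(2)) + 1
print(digits)                               # 155 decimal digits at most

print(len(base64.b64encode(key)))           # 88 characters (with padding)
```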

4 Likes

How about all of them? As soon as a page gets created it automatically submits to all available search engines.

Automatic submissions would be convenient.

Though we have to be careful about automatically listing webpages upon creation. I think requiring user “consent” to be listed on a search engine APP is respectful, even though the webpage is public. The user may still be debating details of the webpage’s construction, or weighing other factors, before they want it listed. At least give them the choice of when to list their page.

A one-click “Submit to Search Engines” button on a listing APP should work. If they remodel their website, they can just resubmit.

Some major search engines use spiders to crawl the internet, going from link to link without asking permission to index a website, so my ethic may be moot in this regard. But I still prefer personal consent, even in a public space. It’s just good manners.

1 Like

Yes, that is how I would do it. I would make it a checkbox, and if they tick “submit to search engines” then it would submit to them all. This way we never end up in the situation that Freenet, Tor, or the regular Internet had, where the majority of information isn’t indexed and is impossible to find.
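
A toy sketch of that checkbox behaviour (the engine names and the `submit` call are purely illustrative; no such API exists yet):

```python
# Hypothetical example: when the publisher ticks the box, announce the
# page's address to every known search-engine app rather than one of them.
known_engines = ["engine-a", "engine-b", "engine-c"]     # made-up names

def publish(page_address: str, submit_to_engines: bool, submit) -> None:
    """`submit` stands in for whatever API a given engine app exposes."""
    if not submit_to_engines:
        return                        # the user left the checkbox unticked
    for engine in known_engines:
        submit(engine, page_address)
```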

Brilliant post, thanks Viv. I’ve added these thoughts to a related post.

2 Likes

I’m not a huge fan of this solution. We’re putting ourselves back into one of the main complaints about Bitcoin addresses: they’re incredibly unwieldy. Five years in, we’re still trying to solve this problem.
I do see two upsides to this solution though. It prevents name squatting, and it is absolutely more secure.
It looks like I’m more torn on this solution than against it. I just think not being able to verbally communicate your “address” will be a huge barrier.

1 Like

The supposed method for website owners to indicate their preferences is by placing a robots.txt file in the root of the website, containing some geeky “index this but not that” directives.

However, search engines don’t always honor this.

Sure, you can submit, but they may not necessarily want to scan.
Hint: Namecoin.

Has the browser extension idea been taken up as yet?

1 Like

Has anyone pursued this?

1 Like

I sure hope so.

But even if nobody has yet, it’s inevitable that somebod(ies) will once the network is up and running and gains popularity.

So at least the groundwork is here for them to build upon and get started with :slight_smile:

We were discussing this again last week and are looking at a Firefox extension enabling that browser to run inside the SAFE Network. This will enable users to access SAFE using tools they are already used to, while increasing awareness within the Mozilla community and leveraging their huge user base. This could of course be done with other browsers; we are focussing on Firefox initially as we have contacts there and they have committed some engineering support to us. With tangible progress being seen on the NFS API design, this is something we will hopefully be able to start in earnest in the New Year.

9 Likes

Please, nothing modal, and let it block everything modal when in use (to the extent possible for a browser).

It’s nothing we don’t know, but there is irony in this effort, even if it’s necessary as a step. Even though other alternatives short of a SAFE browser are not present at scale, there is still irony, because Firefox and Mozilla are so compromised with the wishes and influence of ad/spyware groups. Basic interface stuff like back, forward, stop, scrolling: all of that has been corrupted. Surely it does some things right, but every time it upgrades it seems to be about disempowering end users and helping its sponsors beat ad block etc. And yet its group is always trying to suggest they are free and open.

Can you send through the links showing Mozilla is “compromised with the wishes and influence of ad/spyware groups”, @Warren? My understanding had been that they deliver search traffic to Google (and now Yahoo) to earn their funding, but your comment suggests something more sinister. While Firefox is targeted (because it’s popular) with unwanted extensions and adware, there is nothing to suggest that the Mozilla Foundation are involved.

2 Likes