NRS Brainstorming Megathread

Good idea. Each browser bookmark could capture this, or some other version-related metadata, to help support browser features that assist with the good and bad effects of versioning.

Maybe there’s scope for innovation and some cool bonus features?


There is still the big picture here, and that is that a web page relies on all its linked files (scripts/images/etc.) not being changed underneath it.

To me it is not sensible to take a web page three versions old and use the latest versions of its JavaScript files. Someone may have linked to a particular version of a web page because it does what was wanted.

Also, if one is going back through the versions of a web page, then each version needs to link to the correct versions of the files it loads.

All that to say, it is not so useful to go to a web page and then cycle back through previous versions of its images or JavaScript, since that would potentially break the web page. The user would typically only go back through the web page's versions, without interfering with the versions of the files loaded by each version of the page.

To my understanding, it needs to be spelt out to web developers that any link to a file the page relies on must include the correct version. And I'd expect this to become second nature to website developers, the same as learning URL/URI formats has.


There’s an argument here at least for a ‘developer mode’ in the browser which catches uncertain links which may cause a problem. And for a tool that automatically updates the version in links etc.


Going back to my earlier point (and again, please jump in and correct me if I'm off base here), one could be linking to either 1) a mutable/appendable link to a data blob, or 2) the data blob itself (xorurl).

In the case of 2), problem solved.

So would it not be as simple as telling developers "don't link to someone else's link to the thing, link to the thing itself" (unless you actually want to link to their current version)?

Should it not be as simple as including somewhere a "get ultimate resolution for this link" button (e.g. returning the final xorurl)?
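Resolution like that could just be a short loop in the browser or CLI. A rough sketch, where the lookup table, URL shapes, and the "xorurls start with hy" convention are all stand-ins for illustration, not the real Safe API:

```python
# Hypothetical sketch: follow a chain of link indirection until we reach
# a concrete xorurl. Everything below (resolver table, URL prefixes) is
# an assumption for illustration only.

def resolve_to_xorurl(url: str, lookup: dict, max_hops: int = 10) -> str:
    """Follow link indirection until the target is a raw xorurl."""
    seen = set()
    for _ in range(max_hops):
        if url.startswith("safe://hy"):  # convention here: xorurls start with 'hy'
            return url                   # reached the immutable data blob
        if url in seen:
            raise ValueError(f"link cycle detected at {url}")
        seen.add(url)
        url = lookup[url]                # follow one level of indirection
    raise ValueError("too many hops")

# Toy resolver table standing in for NRS resolution:
links = {
    "safe://mysite/logo": "safe://shared-assets/logo",
    "safe://shared-assets/logo": "safe://hyabc123",  # the final data blob
}
print(resolve_to_xorurl("safe://mysite/logo", links))  # safe://hyabc123
```

The browser's "get ultimate resolution" button would just run this loop and copy the result.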


At a stretch, some unique URL could be crafted that represents the collection of files with versions set… just a filename sort and a noting of versions would do that… unclear whether it would be as user-friendly as NRS though…


The browser could simply refuse to load any safe:// link inside a page without ?version=.

It would still load them from the urlbar without complaint.

I think that mostly solves the archived page-cohesiveness issue. But there could well be issues with non safe:// urls, and maybe other stuff I’m not thinking of.

If we really care about a “permanent” web, something like this will be necessary I think. Otherwise, pages can start to break quite quickly.
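The rule as stated is easy to express in code. A minimal sketch, assuming the ?version= query-parameter syntax discussed in this thread:

```python
from urllib.parse import urlparse, parse_qs

def may_load_subresource(url: str) -> bool:
    """Apply the proposed rule: a safe:// link loaded from inside a page
    must pin a version. Non-safe:// URLs (and urlbar navigation, which
    this check would simply not be applied to) are untouched."""
    if not url.startswith("safe://"):
        return True                      # the rule only covers safe:// links
    query = parse_qs(urlparse(url).query)
    return "version" in query

print(may_load_subresource("safe://mysite/app.js?version=3"))  # True
print(may_load_subresource("safe://mysite/app.js"))            # False
```

The browser would call something like this for every in-page fetch, and skip the check for URLs typed into the urlbar.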


Sorry, but I'm surprised that's coming from the Maidsafe direction… as it allows only for the literal use of URLs… there's more to the use of links than what is permanent; an equal interest, I would argue, in the art that is the ambiguity of links and how those relationships evolve.

If providers want to see their content respected then they have an option for that but there is no reason to impose on those who want to make use of the way links will change over time.

The option, perhaps, is that there could be a browser setting that enforces a permafrost, cold and brittle interpretation of links, but I would encourage flexibility up front.


I agree that often/usually you want to link to the latest version of a given website, or page within your own website. But also it is desirable that the present page remains cohesive/archivable. (debatable?)

In that light, perhaps a distinction could be made between “things that are used to build this page” vs “things that this page is linking to”. And allow unversioned or some kind of explicit (version=latest) for the latter.

So, to illustrate the idea, <a href> would allow unversioned links, but <img>, <script>, <link>, etc. would not.

Obviously some careful analysis would be needed here with regards to specific tags and also ajax type requests.
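To make the distinction concrete, here's a rough sketch of a checker along those lines. The tag list and the safe:// / ?version= conventions are assumptions from this thread, not a spec:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse, parse_qs

# Which attribute carries the URL for each tag that *builds* the page.
# <a href> is deliberately absent: interlinking may stay unversioned.
SUBRESOURCE_ATTRS = {"img": "src", "script": "src",
                     "link": "href", "iframe": "src"}

class CohesionChecker(HTMLParser):
    """Flag safe:// subresources that lack a pinned ?version=."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attr = SUBRESOURCE_ATTRS.get(tag)
        if attr is None:
            return                       # not a page-building tag
        url = dict(attrs).get(attr, "")
        if not url.startswith("safe://"):
            return
        if "version" not in parse_qs(urlparse(url).query):
            self.violations.append((tag, url))

checker = CohesionChecker()
checker.feed('<a href="safe://other-site">ok</a>'
             '<img src="safe://mysite/logo.png">'
             '<script src="safe://mysite/app.js?version=4"></script>')
print(checker.violations)  # [('img', 'safe://mysite/logo.png')]
```

A "developer mode" in the browser could run exactly this kind of pass and warn, rather than refusing to load anything.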

again, I am not proposing we do this, just brainstorming around the issues being discussed.


but not all use is as an endpoint. Much of the web is the interlinking… and perma makes for stale.

Consider that right now, there is no permaweb… so, you’re talking back to front from what is. What you’re adding is above and beyond the normal.


@davidpbrown I’m not sure I understand your point. Can you elaborate?

edit: In my previous post, I suggested specifically exempting the interlinking (<a href>) case from version requirement. ie:

I agree that often/usually you want to link to the latest version of a given website, or page within your own website

In that light, perhaps a distinction could be made between “things that are used to build this page” vs “things that this page is linking to”. And allow unversioned or some kind of explicit (version=latest) for the latter.

Defaulting to control of what others do, rather than defaulting to the normal they currently know?… Allowing everything is different from knowing what is best.

The option for those who want hard links is there. If you're keen to encourage it, and granted there is value in static content, then that can be encouraged with a dev tool that alerts to broken links etc.… or as an option they choose to adopt.

If you create something brittle, it's going to be a pain to work with. Defaulting to the latest version with the standard unversioned link seems the right way to go.

Aside from those who will want flexibility, I wonder whether many will want this when moving existing sites over from copy they already have… without the extra work of adding ?v=0 to all links; or, if it is automated, perhaps resenting the interference, for zero added value to their interest.

The more substantial reason perhaps is the unnecessary cost of this… if someone updates one element of their site, do they have to re-upload everything that links to it, just to update the version number in the links?

I would much prefer that these kinds of enhancements to what we know are made as options in the upload tools… turned off by default but with strong encouragement that yes, Safe Network is different and here is an option you might like to turn on.

You get better engagement when a new idea is owned by the user, rather than imposed from those who know better - for all the good reasons.



Every software protocol or specification defines what is valid or not, ie allowed, within its domain. This is why it doesn't work to speak HTTP to an SMTP server, for example. Software accepts or rejects input according to the spec. My use of the word “allow” is in that context.

Let’s take a step back for a moment and see if we can turn this conversation in a more constructive direction.

@mav raised the issue of loading pieces of a page (css, images, scripts, iframes) with unversioned links that can lead to the behavior of the page changing over time, even if one loads the page with a versioned URL. I’ve been calling this page-cohesion, or lack of it. ie, a fully cohesive page would not suffer this problem, but one with any unversioned links to page-components would be susceptible. I see this as a problem for the Safe Network’s goal of a permanent web.

So, questions to think about would be:

  1. Do we agree this is a problem?

  2. Is this a problem worth trying to solve?

  3. What technical solutions can we think of to solve the problem?


*Safe Network’s goal of encouraging a permanent web?..

Of course, but “allow” is like a yes/no question: it either allows or it does not.

Better to give the option of…

  1. Do we agree this is a problem?
    The internet as it stands, works.

  2. Is this a problem worth trying to solve?
    Given there are significant advantages in a permanent web, yes but it is better framed as an opportunity for the future - a feature - and one which sets Safe Network apart from what went before… there is no such simple option to fix the current internet for page-cohesion.

  3. What technical solutions can we think of to solve the problem?
    Above, I wonder, there were solutions… coupled with seeing all links versioned and then perhaps declaring the page as “stable”, there could also be some kind of unique reference for unstable sites that snapshots the versions of linked files as a collection, where the links themselves are not versioned.

  4. What are the downsides of imposing this feature by default?

  • Cost in time and money to the user: any change to an element also requires updating all the links to it.
  • Loss of choice for those who do not yet appreciate the idea of a permanent web.
  • The transition from what is to what will be is simpler for users where there is initially no difference.

  5. What is the best way of encouraging users towards a permanent web, if it creates more work and cost for them?
    Sell the idea and the benefit; get their buy-in relative to their use case. If they are storing data that will be referenced in any way, then they should expect that versioning it will greatly benefit everyone.
    For those transferring a copy of a site, some automated search and replace adding ?v=0 to URLs would do.
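That search-and-replace could be a few lines. A sketch, assuming the ?v=0 syntax mentioned above (the real parameter name may well differ):

```python
import re

# Hypothetical migration helper: pin every unversioned safe:// link in a
# page to version 0. The ?v=N syntax is an assumption from this thread.

SAFE_URL = re.compile(r'safe://[^\s"\'<>]+')

def pin_links(html: str, version: int = 0) -> str:
    """Append ?v=<version> to safe:// URLs that carry no query string."""
    def fix(m):
        url = m.group(0)
        return url if "?" in url else f"{url}?v={version}"
    return SAFE_URL.sub(fix, html)

page = '<img src="safe://site/logo.png"> <a href="safe://site/page?v=3">x</a>'
print(pin_links(page))
# <img src="safe://site/logo.png?v=0"> <a href="safe://site/page?v=3">x</a>
```

Already-versioned links are left alone, so the tool can be re-run safely over a whole site copy.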

I think we need to keep perspective here. Fact checkers, researchers, and writers of technical articles might be interested in previous versions of web pages. Most everyone else (95%?) couldn't care less; they just want to see the most current page. Optimizing and facilitating easy navigation for the masses is what I say.


We are building a new network infrastructure with no installed base of users. That is the best time to fix things we perceive to be broken/problematic that we inherit from older designs. Is this one such? Maybe… it seems worth discussing anyway.

Anyway, I think I’ve laid out my thoughts on this well enough. I’ll wait to hear from others.

edit: with regards to costs of keeping the links updated, web devs could easily run a tool that updates links to latest version.

This is actually pretty similar to how apt, cargo, and composer type build systems work today, where a manifest specifies the dependencies and then a tool grabs the latest matching version.
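For illustration, a toy version of that manifest/lock workflow might look like this. Here latest_version() is a stub standing in for a network query, and the ?version= syntax is assumed from this thread:

```python
# Hypothetical sketch of a manifest-driven workflow, loosely modelled on
# cargo/composer lockfiles: a site manifest lists dependencies, and a
# tool resolves each one to its latest published version, writing the
# pinned links back out.

PUBLISHED = {  # stand-in for the network's version history
    "safe://mysite/app.js": 7,
    "safe://mysite/style.css": 2,
}

def latest_version(url: str) -> int:
    """Stub: would ask the network for the newest version of this file."""
    return PUBLISHED[url]

def lock(manifest: list) -> dict:
    """Resolve each unversioned dependency to a pinned, versioned link."""
    return {url: f"{url}?version={latest_version(url)}" for url in manifest}

print(lock(["safe://mysite/app.js", "safe://mysite/style.css"]))
# {'safe://mysite/app.js': 'safe://mysite/app.js?version=7',
#  'safe://mysite/style.css': 'safe://mysite/style.css?version=2'}
```

The web dev edits only the manifest; the tool rewrites the pinned links in the pages before upload.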


Until one day they do care about something, and are very thankful for the wayback machine. (Or very sad they couldn’t find that thing they remembered from 5 years ago.)

  1. Do we agree this is a problem?

Need someone arguing for permaweb…

So, if I link to something saying “this is great” and it's mine, then that works; if I link out to another page, then that could be controlled with a sum of what versions were within that page.

The problem perhaps needs noting that if a page I link to is itself linking out, then there is a cascade of risk that it will change.

That reality is familiar to everyone.

The question perhaps should be: is the cost of fixing it worth the difference - which is those who want it linking out to those who do not care or do not want it? They could take a snapshot and own it for themselves. (Yes, there will be those who don't know they want it until afterwards, but that's a minor case?)

The cost of fixing, it seems to me, requires extra work and cost in uploads for ALL users, just to cater for those who appreciate it.

In 5 years' time the data will still exist… if the form of a website is not exactly the same, then the elements are still findable, even if that is a chore of trawling through the variety of them and the links out through other sites. The only loss might be confusion over which elements were used in parallel with others… (but if you needed that, you would also want timestamps too, or it's a minor case again.)

atm it strikes me that the cost in time and money of forcing everyone to conform outweighs the benefit.

By default links from a document to external materials should definitely be “immutable” (ie versioned), just as they are today in the Wayback Machine. A historical document is a snapshot of the past and if its elements are allowed to change in an ad hoc way it would be like looking at an old family photo in which, depending on when you happen to look at it, certain people have disappeared because they have since died. Of course the author should be able to change their mind and make links “mutable” (point to the latest version), but this would create a new version and maybe someone can create a tool for journalists and fact checkers to easily spot such changes (which would probably be quite unusual).


I’m not saying access to prior web pages should not be provided; I’m saying it should not be a priority when designing ease of use. It should be treated as an extra element of the Safe Network that sets it apart, a cherry on top. But let’s not sacrifice the main meal for the sake of dessert.


I don’t see it as antithetical to ease of use. Seems perfectly intuitive and doable to me.