NRS Brainstorming Megathread

Is one simple option an in-browser setting to ignore versions on links? The user would then be accepting the risk of any difference.

I wonder whether most users will want to view the latest version of everything. For all the interest in permanence, would insisting on it for everyone be a disadvantage to adoption?

Just trying to cover all bases, as it’s a different kind of experience. Won’t links go stale if they are not pointing to the latest versions?

Should the exception be that links that are only the domain need not have a version, because in that case there is no risk?

1 Like

Yes, of course. When we navigate to a site we see its latest version by default. It’s only when we select a past version that we need to see the content as it was in the past.

No risk, because if an item in the NRS map is modified, then its version will be modified. So navigating to a previous version will show the item from that previous version.

2 Likes

No.

Links never go stale on a permaweb.

Not sure what you mean here. Everything (meaning data/content) should have a version associated with it. As you mentioned before, if one isn’t explicitly stated then the only reasonable option is to fetch the latest version.

They do, if the intent is to link to the latest version and that is not an option.

but…

Perhaps the exception could be that links that are only the domain need not have a version, because in that case there is no risk?

That would be a simple exception to cater for cases where linking to domains is of interest. Everything else gets an explicit version.

What makes you think that could not or would not be an option?

I thought the discussion above, with tfa’s input, fairly well resolves that everything should be versioned, given the risk of embedded content.

The only exception being that links to a top-level domain can be unversioned, because there’s no associated risk.

A Modest Proposal

Motivation
It is useful to be able to link to the latest version of another website, so that if the target website changes, links to it from other sites do not have to be updated.

However, for purposes of viewing the web as it was in the past, that is something of an anti-feature. For example, let’s say I link to cnn.com, which updates at least every day, and I leave my site alone. Fast forward 10 years. Now let’s browse the permaweb starting at my site, the version from 10 years ago. We click on cnn.com, and we get the latest, current version. Or, if cnn.com stopped being updated 3 years ago, we get that version. Either way, we do not get the version from when I last updated my site 10 years ago.

By contrast, if my link contains cnn.com’s latest version at the time of publication, then one can step back in time and surf the web as it was when I published.

So it would be nice if we could combine both of these behaviors depending on the end-user’s present goal, i.e.: does the user want the current version of linked sites, or do they want to experience the web as it was 10 years ago when I last published?

Proposed Solution

  1. All links between sites must be versioned to be considered valid; and
  2. An optional attribute preferlatest may be added to the <a href> tag; and
  3. The user-agent (eg web browser) should offer two modes:
    a. normal browsing.
    b. history browsing.

Description
In normal browsing mode, the version number is ignored/dropped when an <a href> link is clicked, provided the link has the preferlatest attribute. If a link is missing ?version, then it is considered invalid and is not displayed as a link. (This is to prevent lazy authors.) If preferlatest is missing, then the version parameter in the URL is preserved. This enables linking to historic versions of URLs even in normal browsing mode.

In history browsing mode, the preferlatest flag is ignored and the version number is always preserved in the URL.
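As a rough sketch, the link-handling rules above could look like the following (the function name and the preferlatest/history-mode plumbing are hypothetical, not an existing Safe Browser API):

```python
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

def resolve_link(href: str, prefer_latest: bool, history_mode: bool):
    """Decide which URL to fetch for an <a href> link under the proposal.

    Returns the URL to fetch, or None if the link is invalid
    (every inter-site link must carry an explicit ?version).
    """
    parts = urlparse(href)
    query = parse_qs(parts.query)
    if "version" not in query:
        return None  # unversioned links are invalid and not rendered
    if history_mode or not prefer_latest:
        return href  # keep the pinned version
    # normal mode + preferlatest: drop the version, fetch latest
    query.pop("version")
    return urlunparse(parts._replace(query=urlencode(query, doseq=True)))

# normal browsing, author opted in to "latest": version is dropped
print(resolve_link("safe://cnn/news?version=42", prefer_latest=True, history_mode=False))
# history browsing always keeps the pinned version
print(resolve_link("safe://cnn/news?version=42", prefer_latest=True, history_mode=True))
# missing ?version -> invalid link, not rendered
print(resolve_link("safe://cnn/news", prefer_latest=True, history_mode=False))
```

Note that without preferlatest the pinned version survives even in normal mode, which is what allows deliberate links to historic versions.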

Summary
The combination of a) requiring that a version always be specified, b) providing an optional preferlatest flag, and c) implementing normal and historic browsing modes in the user-agent, allows the web to be viewed as a dynamic, ever-changing thing in the present, but also as global fixed snapshots of the past. This is powerful and has not existed before.

Final Note
An even better experience could be obtained for browsing historic versions of the web, e.g. on a specific day or even hour/minute in the past (without requiring linking sites to include any version attribute), if there were a way to associate each version of a page with a timestamp. However, afaik, the design of the Safe Network has no concept of universal time, or any timestamp mechanism. As such, usage of versions as described above may be our best technical means for historical snapshotting.

8 Likes

Bingo!!! :clap::+1::+1::star::star::star::star::star:

That’s what @tfa and I have been pushing, just not so eloquently stated.

There is no need for two modes, it should be a seamless experience based on the originating version.

Also, seamless private versioning (as an overlay to the public versions) would allow Safe Browser clients to snapshot dynamic content once per client, so their personal recollection is honored.

2 Likes

I do not follow your logic. Can you elaborate or put forth a counter-proposal?

I think he was suggesting it should be seamless when we browse about the safe internet.

Say you receive an article from May 1st, 2020. It is now 2023 and you click on the link. The link, being to a permanent historical site, will automatically take you into the May 1st, 2020 permanent web. The browser would indicate you are browsing historically, kind of like when you are browsing privately. There would be a visual indicator letting the user know they are browsing in the past.

Now you check your next email. It has a link from today’s present-day web. When you click on the link it takes you to the current Safe Network site, the browser again indicating that you are now browsing in current real time.

I think this is what he meant.

1 Like

I don’t really see a need for versioning all the links in the html unless a specific older version is desired. I think that when the new version of a page is published, all externally linked content could be queried to see what its latest version is at the time of publication, and that info could be added as metadata in the NRS map, or wherever the newly published page’s version info is stored. That way it’d always be possible to see what the state was at the time of publication via @danda’s history browsing mode.
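That publish-time snapshot idea could be sketched as below; `fetch_latest_version` stands in for a hypothetical network query and is stubbed here, and the metadata shape is illustrative only:

```python
def snapshot_link_versions(linked_urls, fetch_latest_version):
    """At publish time, record the then-current version of every
    externally linked resource, to be stored alongside the new
    page version (e.g. in the NRS map metadata)."""
    return {url: fetch_latest_version(url) for url in linked_urls}

# stub "network": pretend these are the current versions on the network
current = {"safe://cnn": 917, "safe://blog": 12}
metadata = snapshot_link_versions(current.keys(), current.get)
print(metadata)  # {'safe://cnn': 917, 'safe://blog': 12}
```

A history-mode browser could then consult this metadata to resolve unversioned links to whatever version was current when the page was published.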

Which part? Historical origin or private overlay?

If you are referring to normal vs history mode, then see Knosis comment, he understood exactly:

I was simply stating that the user shouldn’t need to activate different browsing modes in some clunky way. If they are on a page and want to check a previous version, then consider a small icon in the lower right corner of the browser where the version is selected. That sets the historical origin. As Knosis pointed out, browsing to different links from that origin is then a walk through memory lane. Each page visited from that point forward might show the specific version being viewed in the small icon on the lower right instead of a ‘latest’ icon.

If instead you were confused about “private versioning”, see the discussion above that describes the use of private versions to snapshot client-generated content resulting from edge cases involving dynamic or randomly generated content (i.e. Math.random()). This covers edge cases where content is different for each page view by each and every client.

If no one had suggested this, I was going to: store the version at the time the webpage is created, plus an attribute suggesting the latest as preferred, thus allowing the user to choose to see what the web designer saw, in case the new website reversed their views or whatever. I’m mainly thinking of images/media here, but it can apply to links too.

I think a cool pWeb browsing experience could be that all sites/content are current (newest versions), and there would be a calendar tab somewhere on the browser that would allow you to dial to a specific date and then see the whole web as it was at the end of that day.
As we saw in the mobile browser, there were back/forward arrows in the URL address bar that indicated whether there were previous versions and allowed you to go back and forth with ease on a per-site basis. I thought this was very simple and intuitive.

I can also see the vision @Knosis has as being really useful and novel as well.

1 Like

To do that you would need to timestamp each version when it is published. IIRC this would require section consensus on what the current UTC time is, although maybe someone knows a clever alternative.

I agree that being able to get a 1:1 correspondence between version history and calendar history would be rather nice and yield a much more intuitive experience. Maidsafe via dirvine have been pretty adamant about not using wall clock time as a parameter. Maybe they are fine with version timestamps?

1 Like

I believe this has been discussed before. Instead of some global timestamp, the local machine can provide a time. Not perfect, but better than nothing, and it avoids introducing network time.

Though that causes an issue, doesn’t it? Someone could lie about when something was published by altering the time and date on their machine. Meh

That’s fine. That is a nice UI shortcut for choosing historical browsing mode. Internally, there must still be the two modes though, so that the browser knows how to handle links. And of course there must be a way for user to get out of historical browsing mode. These are UI level details that would be up to implementors.

1 Like

Yes, I addressed this in the Final Note of the above proposal.

In a bit more detail: anyone who ever used CVS may remember that one could check out all source files by date. This is also possible in git, though a bit convoluted. Basically, the system takes a date as input and then finds the version of each file that corresponds to that date. In the Safe Network, when browsing from one page to another by date, the network would need to provide a way to map a given timestamp to a specific version, i.e. requested timestamp 234221341341 is between the publication timestamps of versions 5 and 6, so it corresponds to version 5.
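That timestamp-to-version lookup is just a binary search over publication timestamps. A minimal sketch (purely illustrative, since the network provides no such timestamps today):

```python
import bisect

def version_at(pub_timestamps, requested_ts):
    """Map a requested timestamp to the version live at that moment.

    pub_timestamps[i] is the (ascending) publication time of version i+1.
    Returns 0 if the resource did not yet exist at requested_ts.
    """
    # count of versions published at or before requested_ts
    return bisect.bisect_right(pub_timestamps, requested_ts)

# versions 1..6 published at these (fictional) times
pubs = [100, 200, 300, 400, 500, 600]
print(version_at(pubs, 550))  # between v5 and v6 publications -> 5
print(version_at(pubs, 50))   # before the first publication -> 0
```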

However, since the network presently provides no timestamping service, the most we could do is rely on locally generated timestamps provided by the publisher of each resource, which could be wrong, even deliberately wrong. So our browsing experience would not be truly what it would have been at that date/time.

Of course the attractive thing about the timestamp idea is that we could relax/remove the (proposed) version requirement for <a href> links, and yet provide an even better historical browsing feature.

I suspect that eventually something will be developed similar in spirit to opentimestamp.org that operates natively on the Safe Network. At such time, use of timestamp based browsing becomes more feasible/attractive.

1 Like

To be clear: anything loaded as part of the page (images, JavaScript, CSS, audio files, video, etc.) has to be versioned or it will not load.

Anything less and the whole purpose of the page could be changed. A new CSS could invert colours, and now the article you have shared with all your pals is suddenly less than savory; new JS starts mining crypto; a different video is shown; etc.
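As a sketch, a publisher-side check could refuse to publish a page whose subresource references lack an explicit version. The HTML scanning here is deliberately naive (a real check would parse the DOM); the function name is made up for illustration:

```python
import re

# naive scan for src/href attribute values in double quotes
ATTR_RE = re.compile(r'\b(?:src|href)\s*=\s*"([^"]+)"')

def unversioned_resources(html: str):
    """Return every linked resource URL that lacks an explicit ?version."""
    return [url for url in ATTR_RE.findall(html) if "version=" not in url]

page = '''
<link href="safe://site/style.css?version=3">
<script src="safe://site/app.js"></script>
<img src="safe://site/logo.png?version=7">
'''
print(unversioned_resources(page))  # ['safe://site/app.js'] -> refuse to publish
```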

1 Like

I was referring to this remark:

Which means that (versioned) JavaScript could load unversioned content, which goes against the perpetual web.

2 Likes