NRS Brainstorming Megathread

Many webcams in the past, and maybe still today, have the ability to update a file at some location with a current image of what they “see”. Those images can be part of a webpage’s code (with zero JS), so going to that webpage allows the person to see an image. Obviously an <img src="..."> for each image file referenced.
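For concreteness, something like this is what I mean; the filenames are made up purely for illustration:

    <!-- Zero-JS webcam page: the camera overwrites each file in place,
         so every fresh page load shows the latest snapshots. -->
    <html>
      <body>
        <h1>Site cameras</h1>
        <img src="cam1-latest.jpg" alt="Camera 1">
        <img src="cam2-latest.jpg" alt="Camera 2">
      </body>
    </html>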

I was using it purely as an example of what I have seen out there in years past; just not 100 images, but 6 or 8. But I was making a point.

Every time a new page has to be added, the complexity of the site increases, which also increases the chances of errors and increases the development time.

Simply having the browser recognise version=latest reduces the number of pages and may even remove the need for JS on simple sites.

The same would apply for a non-JS site (static in the common usage of “static”) that, say, shows a current price list which is updated by the store once a month and stored as a PDF or set of images read by the non-JS page.

Honestly, I am not looking for justification of why it will not be added (version=latest) because it’s only a little more work for the developer; what I really want is a yes or no on whether you will consider it or whether it’s outright rejected.

1 Like

You can link to documents such as a price list at the latest version via normal NRS, just not load them directly into the page (as things stand). We could add a version specifier for latest, but I think it leads to issues, as I said.

There is a balance we need to strike. I definitely agree just being able to link to “latest” anything is easier/simpler/cleaner.

But we cannot have that and a permanent web as far as I can see at the mo.

I view the JS route as an effective middle way at this point, especially with more established documentation. If you don’t know HTML, there’s not much difference between a one-line attr and three lines of JS (he sez as a developer… so I may well be off there :slight_smile: )


Instead of a permissive version for loading content, perhaps the browser needs to alert the user: “This page wanted to load unversioned content; this may be unsafe and may change the page’s functionality”, before allowing the load on this page.
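For concreteness, the check could be as simple as something like this (the ?version= convention and the prompt wording are just illustrative, not anything implemented):

    // Hypothetical pre-load hook: warn before resolving an unversioned in-page resource.
    function confirmUnversionedLoad(url) {
      const isVersioned = /[?&]version=\d+/.test(url);   // assumes ?version=N pins a link
      if (isVersioned) return true;
      return window.confirm(
        'This page wanted to load unversioned content. ' +
        'This may be unsafe and may change the page\'s functionality. Allow it?'
      );
    }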

I don’t think it’s ideal UX, though, and it may let devs off the hook for not versioning their links.

2 Likes

I’ll add a bit of joke code here, because if we’re going to endorse the two lines of javascript to load an image we may as well just be putting that string into an img src property. The joke code is a halfway compromise:

<img src="javascript:(function() { this.src=safe.fetch('safe://someAppendableDataWithlinkToImage.jpg') }()">

This kind of thing is really only meant for the href property on <a> tags, but… maybe we can repurpose it?!

Or option 2 to bring a bit of IE6 nostalgia back:

<iframe src="safe://someAppendableDataWithlinkToImage.jpg"></iframe>

Just some jokes. Please don’t do either of these.

Safe is a different network. We cannot have all the things from clearnet-land otherwise it wouldn’t be safe.

An ever-changing image goes against the idea of the permanent web. We’ve been building for that; if that’s not what we’re after… well, maybe we need to reconsider the fundamentals.

Two lines of JS seems to me like a reasonable compromise to get your ever-updating image onto the permanent web.
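For the sake of the discussion, roughly what I have in mind, borrowing the safe.fetch style and data name from the joke snippet above (the exact API shape is an assumption, not a confirmed interface):

    <img id="latest-image" alt="Latest image">
    <script>
      // Resolve whatever the appendable data currently points at, then show it.
      // Assumes safe.fetch() returns something usable as an image source.
      safe.fetch('safe://someAppendableDataWithlinkToImage.jpg')
        .then(src => { document.getElementById('latest-image').src = src; });
    </script>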

Can we do better? Maybe. More ideas are wholeheartedly welcome.

4 Likes

Surely permanence is an option for those who want it for their own data, but with JavaScript there is no exacting permanence, as a script can affect text and other content like images… just choosing a random image or tagline on each page load… and the network will not know how the browser actioned those either.

The Safe Network being able to cater for permanence is nice, but only to the point that what is there is always there… so individual files are permanent, but what they do depends on how they appear together?.. I can’t see how the network can sign off that it’s snapshotting anything beyond individual files.

1 Like

I agree with @joshuef’s points on the importance of the permanent web and that this can best begin with forcing all in-page links to be versioned, with a slight increase in the need for JavaScript in very limited cases, such as displaying different images at different times.

I don’t think we should begin by allowing an in page link to specify ‘latest’ because:

  • I think we’re agreed we want a permanent web, which requires that we force all in-page links to specify a version
  • that means we will need tools to be developed that assist with this, and the best way to ensure this happens is to disallow ‘latest’ for the time being. If not, we’ll find people skipping that tricky aspect (when building pages or tools) and will get almost all pages using ‘latest’ and we won’t have achieved the permanent web.

In theory, we could later relax and allow ‘version=latest’ once the tooling is there and people have learned how valuable the permanent web is, but I think even then it would severely undermine permanence, and it should only happen if we decide to abandon that.

There’s just too much incentive to skip finding the current version and to use ‘latest’ instead. We cannot allow that IMO, because it would cost us one of the fundamentals of Safe.

That same incentive - the inconvenience and difficulty of inserting and updating the current version in every link - is why it will be so important to have tools which automate this very early on.

A small point on what is a static site or not: I think it’s useful to use the existing definition because it makes it easy for people to understand how to get websites onto Safe, especially in the early days, and easy to identify what will be hard to do. We can just say: build a static website and it will work on Safe, any static web framework will work on Safe, and you can’t generate dynamic websites in the same way as on the clearweb because all pages are static, so you have to blah blah.

I think if we start calling static sites which have a dynamic appearance dynamic rather than static, it will confuse developers. Much better if they know everything is static HTML, and can go look at how dynamic appearance is created using static HTML.

Content Creation and Uploading Tools

We shouldn’t underestimate the difficulty of this, nor its importance. This requirement will be a major barrier for those building websites on Safe, and at this point we have no tools designed to do this.

Those tools will not be easy to create because it is notoriously difficult to parse HTML.

Given that we’re starting from scratch and will have only one browser, that gives us some time during which we can use the Safe Browser as the benchmark for what counts as HTML. This might make it easier to create suitable tools: for as long as there’s only one browser engine, we have only one HTML parser. So if that same parser can be extracted and used when building link-updating tools, we’re OK for the time being, although it is likely to be a clunky solution.
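To make that concrete, here is roughly the kind of step such a link-updating tool would need to perform. The lookupCurrentVersion function and the ?version=N convention are assumptions for illustration, and a real tool would use a proper HTML parser rather than a regex, given how hard HTML is to parse:

    // Sketch: pin every unversioned safe:// link in an HTML string to the version
    // that is current at publish time. lookupCurrentVersion() is hypothetical.
    async function pinLinks(html, lookupCurrentVersion) {
      const pattern = /safe:\/\/[^"'\s>]+/g;
      const links = [...new Set(html.match(pattern) || [])];
      // Resolve the current version for each distinct unversioned link first.
      const versions = new Map();
      for (const link of links) {
        if (!link.includes('?version=')) {
          versions.set(link, await lookupCurrentVersion(link));  // e.g. 23423
        }
      }
      // Then rewrite in one pass so each whole match is handled exactly once.
      return html.replace(pattern, link =>
        versions.has(link) ? `${link}?version=${versions.get(link)}` : link
      );
    }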

Whether that works out or not, we’ll need to figure out how we expect early content-handling apps and their developers to achieve this, and create a guide and ready-to-use libraries early on, because this is such a vital thing from day one!

I expect the team will have recognised this ages ago, but it hadn’t dawned on me until now.

2 Likes

In essence, though, from what is happening it has the same bad properties as using version=latest. The results are the same from the point of view of the permanent web.

They would both be getting the latest image. Exactly the same result, just achieved in different ways, but both break the permanent web as you are presenting the permanent web.

The idea of the permanent web is that you do not delete data, links always work, and the previous data is always available.

version=latest and your 2 lines of JS achieve the same result, and both change the part of the permanent web you are presenting, namely that you always get the same result when loading a page.

Banking sites, shopping sites and the like will all not give you the same results from the one page, because they have to. The shop or the bank are not going to generate a page just to show you your current balance; as you would agree, they will use some JS to show you your current balance when requested. Yet the webpage will be exactly the same; just the transactions/balance can change from one viewing to the next.

The permanent web cannot mean always the same result, for obvious reasons. Unless of course you want a static web, or sites that can only show the same data. But your 2 lines of JS show you do not want that.

So I am still at a complete loss as to why version=latest is any different from your 2 lines of JS as far as the “permanent web” goes.

The compromise would be to restrict version=latest to media like images, and not scripts or HTML or similar. I agree we don’t want to mix versions of scripts and webpages.

3 Likes

More important than permanence is utility: being most useful to most people.

One size doesn’t fit all…

The option is already there for those who want to create static reference data. That files are immutable is a big step forward. The difference can be understood. I don’t think a forced fix is the way to go; I always prefer flexibility.

1 Like

You’re taking the permanent web to mean that every page always looks the same, which I don’t think is useful or what most of us mean.

What’s useful is a web that doesn’t break because some link is no longer pointing at a thing, because the thing has been changed, or gone away, or a domain has expired, etc.

It’s completely fine if the content displayed is different if that was the original behaviour, and that this continues to vary over time as it always did, based on whatever inputs have always determined the appearance.

Maybe we need to clarify what we are aiming at when we say the permanent web. Actually, wasn’t the original term “perpetual web”? I think that’s more accurate and what it was called in the past.

5 Likes

Yes… worth highlighting… this thread seemed to drift to wanting more.

Aye. I cannot argue with this (and it wasn’t really my intention to suggest otherwise), just trying to clarify the lack of permanence in some fashion. Clearly I was not sooo clear :slight_smile:

I’d argue not, as the JavaScript being run is fixed at the same version. version=latest allows for arbitrary JavaScript changes to what you thought was one version of a page.

Exactly.

It’s not about disallowing dynamic content, so much as saying “when I visit version 4, this JavaScript will always be run”. It can do dynamic things, but it’s not going to do unexpected things (in terms of what JS is run on the page). This still allows complicated banking websites doing dynamic things.


Oops. That’s me just mixing up the two. Apologies. And I fully agree here. :+1:

5 Likes

@joshuef good stuff.

Can you clarify if ?version is required on <a href> links as well? If so, that really does create a “frozen” web, at least for static HTML sites. People will more commonly wish to link to the latest version of other websites, I would think. Yes, JS would be possible, but kind of gross for arguably the common case. A possible solution would be to allow something like <a href="safe://cnn.com?version=23423" preferversion="latest">. So a browser would typically load latest (or ask the user), but a scraper or someone looking at the site 10 years later might well choose to load the versioned one. Browsers could even enable a mode setting: “browse permaweb” vs “browse latest web”.
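A rough sketch of how a browser setting might interpret that markup; the attribute name and the mode flag are just the proposal above, nothing that exists today:

    // Illustrative only: decide which URL to load for
    // <a href="safe://cnn.com?version=23423" preferversion="latest">
    function resolveLink(anchor, browseMode /* 'permaweb' or 'latest' */) {
      const pinned = anchor.getAttribute('href');
      if (browseMode === 'latest' && anchor.getAttribute('preferversion') === 'latest') {
        return pinned.replace(/\?version=\d+$/, '');  // drop the pin and follow latest
      }
      return pinned;  // permaweb mode (or a scraper) keeps the recorded version
    }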

Perhaps the same or a similar mechanism could be applied to images as well, to e.g. address loading the latest weather map from static HTML. Though maybe that’s going too far, as we’ve crossed the line from “things this page links to” to “things that this page is made of”.

6 Likes

Stick to your guns @joshuef, don’t back down and let others convince you to look away from a true permaweb. Everything should be versioned… no unversioned content allowed. IMO, every page/publication should essentially be managed like a git repo on the back end, having every change documented and anyone able to revert back to a previous snapshot. The past is immutable.

version=latest or version=-1 is a nice shortcut.

Edit: Per the weather photo example, every time that image is updated should mean a new version or subversion for itself and for all pages that link to it. A u64 to store the version number is more than enough to handle a perpetual 1-second update interval.
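To put a number on that: 2^64 - 1 ≈ 1.8 × 10^19, so at one new version per second a u64 counter lasts roughly 1.8 × 10^19 seconds, which is on the order of 5.8 × 10^11 years.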

I would also recommend a three-digit versioning scheme (x.y.z == <major version>.<minor subversion>.<error correction>) to provide an intuitive measure for the relative weight of an update.

9 Likes

I am 100% with you. This topic was dangerously drifting away from Safe Network basics.

More precisely: everything public should be immutable or versioned.

7 Likes

No, I truly meant everything should be versioned, including private data. This has huge benefits. Consider a Safe Writer word processor app that automatically uploads changes to a document at regular intervals, or auto-commits after every sentence change.
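As a sketch of what that could look like (the Safe Writer app and the save API here are hypothetical, not anything that exists):

    // Hypothetical auto-versioning loop: every change becomes a new version and
    // old versions stay retrievable. saveNewVersion() stands in for whatever the
    // real storage API would be.
    function autoVersion(getText, saveNewVersion, intervalMs = 30000) {
      let lastSaved = null;
      setInterval(async () => {
        const text = getText();
        if (text !== lastSaved) {
          await saveNewVersion(text);   // appends version N+1; the past stays immutable
          lastSaved = text;
        }
      }, intervalMs);
    }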

3 Likes

Agreed.

I would add that in the beginning, equally important will be:

  • excellent tutorials explaining why and how to use versioning
  • vigilance on Stack Overflow when the inevitable ‘how do I embed the latest image’ questions come up - answering them with how but also why
  • fostering a culture that understands why versioning matters (and why the perpetual web more generally matters)

The consequence is the same, but the process matters. If the process boils down to ‘fix this apparently wrong thing by blindly copy-pasting from Stack Overflow’ then the problem perpetuates. If the process is ‘use this JavaScript and here’s why’ then the JavaScript solution is quite useful, despite the end result being equivalent.

2 Likes

You’re sounding like an event sourcing aficionado :blush:
(<= guilty of the same)

3 Likes

You can use versioned private data. But not all use cases need it; for example, temporary data doesn’t. It would be a waste of resources to force its usage in these cases.

And I am not describing a desirable evolution, just the current implementation.

1 Like

That needs qualifying, as control freaks will expect too much evidence of what occurred.
Obviously, perpetual immutable instances of the files as they were is not the same as evidencing what was presented to the user.

Surely the talk is only of links… and unversioned links then simply default to the latest copy.

Math.random() and a set of links could always spawn uncertainty about what occurred, but the files themselves should be immutable and versioned.

And the problem includes this JavaScript thing. Is everybody out of their mind? To be clear: to secure the permanent web, a script on a public site shouldn’t load an external unversioned resource.

This doesn’t mean a frozen web; it’s just that when a public site evolves, its version must be incremented.

Note that internal links do not need to be versioned, which means you don’t need to change all the pages of your site if one of your resources changes.

More precisely: the entry point of a site (index.html) belongs to an NRS map; every reference outside this NRS map should be versioned, but references inside this NRS map don’t need to be.
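For example (the paths and the ?version syntax here are illustrative):

    <!-- Inside my own NRS map: a plain relative link, no version needed,
         because the whole site moves forward together as one versioned map. -->
    <a href="/prices/current.pdf">Price list</a>

    <!-- Reference outside my NRS map: pinned to an explicit version. -->
    <a href="safe://othersite/report.html?version=12">Their report, as published</a>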

I don’t have time to find the link, but @Maidsafe agreed to these principles. I would like confirmation that they still intend to implement them in the Safe Browser.

Everyone will be free to fork the SafeBrowser and relax these principles, but it won’t be a safe browser anymore. It will be a browser for state actors or big corps who want to be able to rewrite history.

1 Like