NRS Brainstorming Megathread

You have not described what that could be that requires recording every damn thing at cost.

Coupled with @tfa’s idea of not allowing random, it is bizarre.

Your idea of recording data to a private cache is forcing an oddity (though I note you have suggested they could delete it, if they are sure. If they can delete it when they are sure, why could they not sign up for this feature when they are sure?).

tfa’s idea is “just prohibit usage of the random function in JavaScript”.
Common sense is breaking through :slight_smile:

Those are not choices or features that the user has signed up for.

Yes!.. that’s fine, expecting that necessarily includes the option for v=latest

My main concern is based on the inability of browsers to always maintain backwards compatibility over long periods of time. Can you really view web pages now, using Windows 10, the same way they appeared in the original version of Windows in 1985, or even in Windows 95 ten years later? If not, what is the purpose of clinging to the insistence on permanency in viewing? It will always be dependent on browser capabilities and the continued existence of the supporting tools and plugins required by the web page, won’t it?


:+1:

That’s a good thought, and insisting on web standards like HTTP headers might be a good idea; if you want a perpetual web, that’s rather important.

So, something like

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
or HTML equivalents.

You will understand that I don’t agree, because this goes against the perpetual web and allows history to be rewritten.

But this is not a reason not to implement the perpetual web. I guess there will be technical solutions to this problem, like a legacy mode in SafeBrowser.


You get that with a private/public versioning system.

For example:

Visit cat.food/version=1.0.0.0

The site has dynamically generated random content, and you clicked “Enable full permaweb” in your browser options, so an exact representation of the page you see gets versioned to cat.food/version=1.0.0.1 in your private account. A full 10 years later, cat.food/version=1.3.0.0 is the latest public version of the page. You want to see the page again exactly how you remembered it, so you select cat.food/version=1.0.0.1 for the recollection. Next you want to compare the old public page to the latest version, so you go back one version increment to cat.food/version=1.0.0.0, where the random function causes something new to happen. The outcome of this new randomness gets versioned to cat.food/version=1.0.0.2, so you can check back again in 20 years to see the two random images you previously generated.

Edit: An x.y.z.w versioning scheme is easier to follow than the w.x.y.z scheme I first described.
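
A minimal sketch of how that private snapshotting could work, assuming a hypothetical PrivateVersionCache and the x.y.z.w scheme described above; none of these names come from a real SAFE or SafeBrowser API:

```ts
// Hypothetical sketch only: PrivateVersionCache and these type names are
// illustrations, not part of any real SAFE or SafeBrowser API.

type Version = [number, number, number, number]; // the x.y.z.w scheme above

interface Snapshot {
  url: string;          // e.g. "cat.food"
  version: Version;     // the private version this snapshot is stored under
  renderedHtml: string; // exact representation of what the user saw
}

class PrivateVersionCache {
  private snapshots = new Map<string, Snapshot>();

  private key(url: string, version: Version): string {
    return `${url}/version=${version.join(".")}`;
  }

  // Store the rendered output of a visit to a public version (e.g. 1.0.0.0)
  // under the next free private slot (1.0.0.1, then 1.0.0.2, and so on).
  record(url: string, publicVersion: Version, renderedHtml: string): Snapshot {
    const [x, y, z, w] = publicVersion;
    let privateVersion: Version = [x, y, z, w + 1];
    while (this.snapshots.has(this.key(url, privateVersion))) {
      privateVersion = [x, y, z, privateVersion[3] + 1];
    }
    const snapshot: Snapshot = { url, version: privateVersion, renderedHtml };
    this.snapshots.set(this.key(url, privateVersion), snapshot);
    return snapshot;
  }

  recall(url: string, version: Version): Snapshot | undefined {
    return this.snapshots.get(this.key(url, version));
  }
}

// First visit to cat.food/version=1.0.0.0 is snapshotted as 1.0.0.1;
// a later revisit (with new random output) becomes 1.0.0.2.
const cache = new PrivateVersionCache();
cache.record("cat.food", [1, 0, 0, 0], "<html>random image A</html>");
cache.record("cat.food", [1, 0, 0, 0], "<html>random image B</html>");
console.log(cache.recall("cat.food", [1, 0, 0, 1])?.renderedHtml); // random image A
```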

Erm, not without reasoning I won’t.

Why should external links need versioning?
If you want to track that hard, then invoke Timeshift… and noting @jlpell is evolving to suggesting “Enable full permaweb” if you want, which is quite right… that’s a choice for the individual user.

The only reason I’m arguing this is that I don’t see that it’s reasonable to force aging on other people who have simple lists of websites, where they want to list links to the latest version.
If you are talking about what JavaScript links to not having the option of v=latest, then by inference you are demanding the same of non-JavaScript links… and for why?

I’m not evolving, just making what is obvious more explicit for you so you don’t complain that I left something obvious out.

It is now more apparent, and appreciated. :wink:

Just a thought, but the concept of linking to a domain is distinct from the concept of linking to an instance of it.

As far as I can see, we have good sense summed up above.

  • The option for a user to choose Timeshift can solve any fair interest.
  • Those who want to cast their own data as permanent can be careful about how they use links, supported by web tools for that.
  • It should be expected that a page will be stable and will therefore link to its local elements with an exact version that is not latest.
  • External links can be versioned, unversioned, or v=latest, with unversioned and v=latest producing the same outcome.

This, I wonder, will suit normal user expectations as they transition from the unsafe web, and should cater to interests going forward for perpetually stable pages that can be looked back to.

version=latest seems like a fine option.

Consider cat.food/version=1.0.0.0, where there is a link to the image mouse.jpg-v=latest. When the image is fetched, the browser notes that mouse.jpg-v=3.1.4.0 is the latest version of the image. This actual version number gets placed in cat.food/version=1.0.0.1, which gets saved to your private data as an exact record of what you viewed, for future reference.
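
Roughly, that pinning step could look like the sketch below; resolveLatest is a stand-in for whatever lookup the browser actually performs, not a real function:

```ts
// Hypothetical sketch: pin each "-v=latest" reference to the concrete version
// that was actually served, so the private snapshot is an exact record.

// Stand-in for the real lookup the browser would do against the network.
async function resolveLatest(resource: string): Promise<string> {
  // e.g. "mouse.jpg" -> "3.1.4.0"
  return "3.1.4.0";
}

async function pinLatestLinks(html: string): Promise<string> {
  // Replace each "<name>-v=latest" with the version that was actually fetched.
  const pattern = /([\w.-]+)-v=latest/g;
  const matches = [...html.matchAll(pattern)];
  let pinned = html;
  for (const m of matches) {
    const version = await resolveLatest(m[1]);
    pinned = pinned.replace(m[0], `${m[1]}-v=${version}`);
  }
  return pinned;
}

// Usage: the snapshot saved as cat.food/version=1.0.0.1 would then contain
// "mouse.jpg-v=3.1.4.0" rather than "mouse.jpg-v=latest".
pinLatestLinks(`<img src="mouse.jpg-v=latest">`).then(console.log);
```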

Ok… but I can’t see the added value of stating that… if the link has no version, then it defaults to latest.

The only advantage is if it’s thought simpler that the rule is “every link needs a version”, with version=latest as the get-out.

The downside is that it’s a pain for copy-pasting existing unsafe sites onto Safe… but fixable.

Seems fine too, but that should get replaced by an explicit version number when the private versioned snapshot is made, as per the cat.food and mouse example above.

if you have snapshots turned on… which I’m sure you will!

Or maybe I just won’t click the “Disable full permaweb” option. :wink:

You won’t be clicking that… because it will not exist. :stuck_out_tongue:

I like forgetting… and I’m too busy with the present to ever have time to wonder about what I saw in the past, forgot, and didn’t need. If I’d paid for it, perhaps I would have reason to wonder!

Small example:

  • in 2010:

    • NRS map A at version 0 contains index.html with a paragraph displaying text from NRS map B at the latest version
    • NRS map B at version 0 (the latest one at that time) contains the text “Global warming is not created by human activity”
  • in 2020:

    • NRS map A at version 10235 contains index.html, still with a paragraph displaying text from NRS map B at the latest version
    • NRS map B at version 1 (now the latest one) contains the text “Global warming is created by human activity but not by the oil industry”

I see the site’s new position on the matter, but I vaguely remember that they didn’t say that in the past. So I select version 0 of the site, but I am surprised to see the new position again.

In short, history has been rewritten, and this is not what I expect with the perpetual web.

The problem happens because of the reference to the latest version, and wouldn’t happen with a true numerical version.
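
To make that concrete, here is a toy model of the two lookup behaviours; the NRS map is just an in-memory array here, not the real data type:

```ts
// Toy model of the example above: NRS map B as a simple list of versions.
// This is not the real NRS data structure, just an illustration.

const nrsMapB: string[] = [
  "Global warming is not created by human activity",                         // version 0 (2010)
  "Global warming is created by human activity but not by the oil industry", // version 1 (2020)
];

// A "latest" reference re-resolves every time the page is rendered...
function renderWithLatestRef(): string {
  return nrsMapB[nrsMapB.length - 1];
}

// ...whereas a pinned numerical reference always returns the same content.
function renderWithPinnedRef(version: number): string {
  return nrsMapB[version];
}

// Viewing NRS map A version 0 in 2020:
console.log(renderWithLatestRef());  // shows the 2020 text: history rewritten
console.log(renderWithPinnedRef(0)); // shows the 2010 text: what was actually there
```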


Ok, so that’s a good example and makes the case for JavaScript needing an explicit version.

Still expecting that non-JavaScript external links could be without a version, or v=latest… or could that be used to create the same problem?

Could one answer be some form of URL link that allows only linking to the domain, where leaving that unversioned is ok? safe-domain://abc
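
Purely as an illustration of the idea (the safe-domain:// scheme is only the suggestion above, the safe:// prefix is assumed, and none of this reflects actual SafeBrowser parsing), the three link forms might be distinguished like this:

```ts
// Hypothetical sketch of the three link forms discussed: a pinned version,
// latest, and the suggested domain-only scheme. Not real SafeBrowser code.

type SafeLink =
  | { kind: "pinned"; domain: string; version: string }
  | { kind: "latest"; domain: string }
  | { kind: "domainOnly"; domain: string };

function parseSafeLink(link: string): SafeLink {
  const domainOnly = link.match(/^safe-domain:\/\/(.+)$/);
  if (domainOnly) return { kind: "domainOnly", domain: domainOnly[1] };

  const pinned = link.match(/^safe:\/\/(.+)\/version=([\d.]+)$/);
  if (pinned) return { kind: "pinned", domain: pinned[1], version: pinned[2] };

  // No version stated (or version=latest) resolves to the latest public version.
  const domain = link.replace(/^safe:\/\//, "").replace(/\/version=latest$/, "");
  return { kind: "latest", domain };
}

console.log(parseSafeLink("safe-domain://abc"));
console.log(parseSafeLink("safe://cat.food/version=1.0.0.1"));
console.log(parseSafeLink("safe://cat.food"));
```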


Yes, that could be used to create the same problem. The control has already been implemented for non-JavaScript resources and @joshuef was clear that this is definitive.

But he wasn’t so definitive for JavaScript, and this is what I’m trying to argue for.

Imo the cost would be negligible. In the example above, the only thing that would need to be saved to the private version cache is the link reference to the specific version of the mouse.jpg file. What is that… 64 bytes or so?

Do the math: 1000 pages visited per day @ 1 kB of data stored per private version is about 365 MB per year. That will probably cost less than $0.03 in 2020 USD. And the cost is always dropping.

These are very rough numbers, but they show the order of magnitude.

Do that for 80 years and you have a lifetime of digital experience stored for less than the price of a cup of coffee at a boutique coffee shop.
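
For reference, the arithmetic behind those figures works out as follows (the per-GB price is back-derived from the $0.03-per-year estimate above, not a quoted network rate):

```ts
// Back-of-envelope numbers from the post above. The storage price is an
// assumption (≈ $0.08/GB, implied by $0.03 for ~0.365 GB), not a real quote.

const pagesPerDay = 1000;
const bytesPerSnapshot = 1000;                  // ~1 kB of pinned link references per page
const bytesPerYear = pagesPerDay * bytesPerSnapshot * 365;

console.log(bytesPerYear / 1e6, "MB per year"); // 365 MB

const years = 80;
const gbLifetime = (bytesPerYear * years) / 1e9;
console.log(gbLifetime, "GB over a lifetime");  // 29.2 GB

const usdPerGb = 0.08;                          // assumed one-off storage price
console.log("≈ $" + (gbLifetime * usdPerGb).toFixed(2)); // ≈ $2.34, i.e. one fancy coffee
```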

Pay per visit, rather than the volume of data over a year… :thinking: