Old Web Hub: Open Access to Old Web Content

Secure access to web content without Tor, a VPN, etc. (inspired by http://twitter.com/Sci_hub):

This is an App that takes a URL from the old web and looks for it on SAFEnetwork. OK, it’s not there, right? But wait… it then offers to fetch it from the old web and store it on SAFEnetwork, where it will then be available to anyone else who tries to access it (using this App).
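For concreteness, here’s a minimal sketch of that lookup-or-fetch-and-store flow in Python. Everything SAFE-specific is an assumption: `safe_client` and its `get`/`put` methods stand in for whatever the real SAFE API provides, and keying content by URL hash is just one possible indexing scheme.

```python
import hashlib

import requests


def safe_key(url: str) -> str:
    # Assumption: content is indexed by a hash of the clearnet URL.
    # The real app might use public names or some other scheme instead.
    return hashlib.sha256(url.encode()).hexdigest()


def fetch_or_import(url: str, safe_client) -> bytes:
    key = safe_key(url)
    cached = safe_client.get(key)          # hypothetical SAFE GET
    if cached is not None:
        return cached                      # already on SAFE: serve it directly
    resp = requests.get(url, timeout=30)   # fall back to the old web
    resp.raise_for_status()
    safe_client.put(key, resp.content)     # hypothetical SAFE PUT (this is what costs coin)
    return resp.content
```

The interesting part is the side effect: the first requester pays the fetch-and-store cost, and everyone after them gets the SAFE copy for free.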

It’s not trivial - for example, who pays to store uploaded content?

Also, any dynamic content could quickly go out of date, so staleness might need to be detectable, with an option to have the stored copy updated (and to browse earlier stored versions).
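One shape that staleness detection and version browsing could take (a sketch only; the class and its in-memory dict are stand-ins for data that would really live on SAFEnetwork):

```python
import time


class VersionedStore:
    """Keeps every imported snapshot of a URL so earlier versions stay browsable."""

    def __init__(self):
        # In-memory stand-in; real snapshots would live on SAFEnetwork.
        self.versions = {}  # url -> list of (timestamp, content), oldest first

    def put(self, url: str, content: bytes) -> None:
        self.versions.setdefault(url, []).append((time.time(), content))

    def latest(self, url: str):
        snaps = self.versions.get(url)
        return snaps[-1] if snaps else None

    def is_stale(self, url: str, max_age_secs: float) -> bool:
        # 'Stale' here just means older than a caller-chosen threshold;
        # detecting that the origin actually changed would need a re-fetch.
        snap = self.latest(url)
        return snap is None or (time.time() - snap[0]) > max_age_secs
```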

Neat huh? :slightly_smiling:

(BTW the inspiration for this comes from http://twitter.com/Sci_hub, which does this for scientific papers that have been pulled behind prohibitively expensive paywalls, making them available free to everyone. Most papers are already available, but when one isn’t, Sci-Hub automatically goes and finds it using keys donated by people who happen to have access and want everyone to have access for free.)

10 Likes

The real open web. No more “walled gardens”. This could be really cool.

Just to be a little more subversive: I could see some browser/SAFE plug-ins for a WikiLeaks-type upload (or download, depending on how you view it) too :wink: But the places where those “secret” things are kept would never allow plug-ins of this kind.

1 Like

Wouldn’t it be possible for outproxies to be established? Of course a SAFE exit node could be seized and attacked in much the same way Tor’s exit nodes are, but that hasn’t stopped them so far. The greater level of security and the Sybil immunity make it, IMO, a much better option to do this with SAFE.

As for payment for your original proposal, it could be pooled together by interested parties. Say I want to import foopocks.com. I post my desire to have it imported by simply adding the URL to the relevant app. The app informs me of the estimated cost, and I tell it the maximum I’m willing to pay. As others add the same URL, everyone is informed of the growing pool of investors, the share each would have to pay, and an estimate of how many more requesters need to join the pool before each user’s share falls within what they can afford.

So it’s simple:

1. You open the app.
2. It tells you to paste the URL.
3. It submits the URL to the decentralized database.
4. It informs you of the URL’s import status (e.g. how many others have requested it, whether it has already been imported, which site elements need updating, how much it would cost to import, etc.).
5. It gives you the option of chatting/messaging the others who also requested the import, so that you may negotiate how much each will pay.
6. The app gathers the total coin necessary to import the site from the investor pool.
7. It downloads the site, then informs all interested parties.

It could also be designed to take a tiny portion of each transaction to compensate the developer (a rough sketch follows).
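Something like this, say (Python; `ImportPool`, the flat developer fee, and the equal-split rule are all illustrative assumptions, not a worked-out design):

```python
import math
from dataclasses import dataclass, field


@dataclass
class ImportPool:
    """One crowdfunding pool per requested URL (all names are hypothetical)."""

    url: str
    estimated_cost: float                        # estimated coin needed to import the site
    dev_fee_rate: float = 0.01                   # tiny cut for the app developer
    pledges: dict = field(default_factory=dict)  # user -> max they're willing to pay

    def total_needed(self) -> float:
        # Import cost plus the developer's cut.
        return self.estimated_cost * (1 + self.dev_fee_rate)

    def add_pledge(self, user: str, max_amount: float) -> None:
        self.pledges[user] = max_amount

    def share_per_user(self) -> float:
        # Simplest possible rule: an equal split among all pledgers.
        return self.total_needed() / max(len(self.pledges), 1)

    def is_funded(self) -> bool:
        # Funded once the equal share fits within every pledger's maximum.
        share = self.share_per_user()
        return bool(self.pledges) and all(m >= share for m in self.pledges.values())

    def more_needed(self) -> int:
        # Rough estimate of how many more pledgers are needed before the
        # split becomes affordable for the lowest pledger.
        if not self.pledges or self.is_funded():
            return 0
        lowest = min(self.pledges.values())
        if lowest <= 0:
            return 0
        return max(math.ceil(self.total_needed() / lowest) - len(self.pledges), 0)
```

The equal split is only one possible rule; shares weighted by each pledger’s maximum would work just as well.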

Sound good?

5 Likes

Could this copying of “walled garden” data be a way to port people over from existing apps like Facebook etc., until we got to critical mass and full adoption happened? My mind is in brainstorm mode… I’ll have to think on this one a bit.

2 Likes

Very nice line of thought @chadrickm. So good to have you back :slightly_smiling:

While pondering SAFEpress I was thinking about an App that could automatically copy sites from the old web, but since reading about Sci-Hub the idea seems even more attractive.

It would also be much easier to work on while so many things are unknown with regard to creating dynamic websites on SAFEnetwork.

Maybe we could get some momentum behind this @chadrickm? I have zero time right now, but if you have enough to keep the pot simmering maybe we can cook something up.

Anyone else inspired by this?
@joshuef @mvanzyl

2 Likes

Like I said in the other thread, this could be some huge marketing if we found a way to “sync things up”. I would be concerned with anonymity though. It’s something we would have to think long and hard about. I’m willing to put some brain power behind the idea though, if not actual code (but something like this might persuade me to put fingers to keyboard and put out some code).

1 Like

What are the anonymity implications? With voluntary outproxies importing the data, the requesters are SAFE. :smiley:

I’m thinking of how/if we would tie entries to users when they come over to the light side. I would think we’d want to keep access to these posts at the same level the users believe they have now.

Real privacy and security of the user’s data entries, if that makes sense.

1 Like

Hmm, I’m not sure what you mean. Anything a user does on the clearnet is subject to surveillance. If they want to upload a clearnet site to SAFE, they need only request it via the app suggested above. Once all of the previously stated conditions are met, the app proceeds to find a viable outproxy to grab the data from the clearnet. The outproxy assumes all of the risk but is of course protected by plausible deniability. So basically, anonymous users pay and the outproxy stores the copy of the clearnet site for them. Am I missing something? Kinda likely considering how I torture my brain. :stuck_out_tongue_winking_eye:

2 Likes

I’ll try to sit down tonight and get my thoughts straight. I’ll post it once it’s clearer.

1 Like

Outernet and MaidSafe seem like a perfect fit from this perspective.

1 Like

This could be the killer app if done right

2 Likes

Yeh, wouldn’t be too hard to pull off I guess.

Funding said server would be a thing. Initially I thought about having the external server do a full crawl, but as @Tonda notes, you’d have no way of knowing how much content you’d get (and therefore what it would cost).

So maybe just discrete URLs? (Then of course there is the question of images… you could leave them to be interpreted by the browser, but then that’s not SAFE.) Which is another cost concern. I guess you’d have to aggregate the PUT cost across all requests…?
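For illustration, one way the estimate and the split might be computed (both constants are made-up placeholders, not real network parameters):

```python
import math

# Made-up placeholders, not real network figures.
PUT_CHUNK_BYTES = 1024 * 1024    # assume cost is charged per fixed-size chunk
COST_PER_PUT = 1                 # assume a flat coin cost per PUT


def estimate_put_cost(resource_sizes: list[int]) -> int:
    """Estimate total PUT cost for a page plus its assets (images etc.)."""
    return sum(math.ceil(size / PUT_CHUNK_BYTES) * COST_PER_PUT
               for size in resource_sizes)


def share_per_requester(total_cost: int, n_requesters: int) -> float:
    """Aggregate the PUT cost across all requests, split evenly."""
    return total_cost / max(n_requesters, 1)
```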

I need to look at the API examples to see what’s what these days. I haven’t had a look since September, but I’ve been meaning to get back on the horse as the MVP approacheth.

I was wondering: would these sites need to reside with an owner? I initially thought yes, but then: who can prove they own any site?

Also: anonymity, as @chadrickm noted. To what level should SAFE avoid importing scripts (tracking… ad-blocking)? If we just GET the raw site, it’ll come with all that baggage. Though I suppose it’s not any different to just looking at it in your browser normally.

But if the point here is keeping it secure (and why else would you be checking this via SAFE?), this would be needed.
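A small sketch of stripping that baggage before the PUT (using BeautifulSoup; the tag and attribute lists are just a starting point, not a complete sanitizer):

```python
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4


def sanitize_for_safe(html: str) -> str:
    """Strip scripts and other executable baggage before storing a page on SAFE."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "iframe", "object", "embed"]):
        tag.decompose()                    # drop executable/embedded content
    for tag in soup.find_all(True):
        # Drop inline event handlers (onclick, onload, ...); starting point only.
        for attr in [a for a in tag.attrs if a.lower().startswith("on")]:
            del tag[attr]
    return str(soup)
```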

HMMMMmmm. ha. Interesting though. (sorry for train of thought post :stuck_out_tongue: )

Thanks for the ping, @happybeing; you’ve inspired me to get back at reading API examples!

2 Likes

Well worth it then! And thanks for your comments. Lots of issues to think about, aren’t there? I think it could start very simple - but with some ideal goals in the back of our minds - and then listen to feedback, wants, etc.

I’ve been playing with web scrapers recently so know there are some really good tools that would make it easy to prototype this aspect.

I ran a web scraper back in the day from Carnegie Mellon (a friend of mine went there). We were essentially pulling sites and extracting specific data after parsing (something like what Google was doing at the time). Eventually the site we started with blocked us because they thought we were attacking them. My friend didn’t get into trouble, but we were worried for a time. Fun days :wink:

1 Like

Yeh, there’s a real concern for whoever runs the scraper, even if it’s just doing a cURL for the page content.

What’s that content? Whoever runs the scraper will be liable for it.

I guess ideally it’d be proxied again through Tor or something, at least until it’s on SAFE.
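For instance, routing the scraper’s fetch through a local Tor client (a sketch assuming Tor’s default SOCKS port 9050 and the `requests[socks]` extra; everything else is illustrative):

```python
import requests  # needs SOCKS support: pip install "requests[socks]"

# Assumption: a local Tor client is listening on the default SOCKS port 9050.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h: resolve DNS through Tor too
    "https": "socks5h://127.0.0.1:9050",
}


def fetch_via_tor(url: str) -> bytes:
    """Fetch a clearnet page through Tor so the scraper's own IP isn't exposed."""
    resp = requests.get(url, proxies=TOR_PROXIES, timeout=60)
    resp.raise_for_status()
    return resp.content
```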

1 Like

Posted my thoughts as promised here. Wanted to make it its own post so we could wikify it… @happybeing if you want to do something else with it, feel free.

1 Like

Basically, we’re talking about something like archive.org on the SAFE network. One should be able to donate to such a mission automagically while farming, that is, by giving part of the farming reward to it…