The Demo App allowed me to put together some SAFE Network test pages. That was cool.
When I try to put up some HTML that includes a little JavaScript, the JavaScript won’t load. Even if I embed the JavaScript directly in the HTML, I get console warnings that the site’s security policy blocked running the code. I can see that the JavaScript loaded just fine in the source.
Has anyone had luck putting up a safenet page with JavaScript? Did you have to do anything to convince your browser to run it?
Related, non-expert suggestion: Cross Origin Resource Sharing (CORS) is something that is controlled, to reduce vulnerabilities, in a way determined by Content Security Policy (CSP).
So MaidSafe define the Content Security Policy to allow safer Cross Origin Resource Sharing (which in this case I think is largely same-origin). When the browser does its pre-flight CORS checks, they fail for inline scripts, since these can easily be used to defeat the same-origin CSP.
So your suggestion to put scripts in linked files is correct. Inline CSS is disallowed for the same reasons.
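A minimal sketch of what that kind of policy means in practice, assuming a same-origin-only policy along the lines of `script-src 'self'` (the exact policy the proxy applies isn’t shown in this thread, so treat the header value here purely as an illustration):

```html
<!-- Hypothetical page; the real policy is imposed by the proxy/browser.
     This meta tag just illustrates the effect of a same-origin-only CSP. -->
<meta http-equiv="Content-Security-Policy"
      content="default-src 'self'; script-src 'self'; style-src 'self'">

<!-- Blocked: inline script (would need 'unsafe-inline', a nonce, or a hash) -->
<script>document.title = "refused";</script>

<!-- Allowed: a script linked from the same origin -->
<script src="/js/app.js"></script>

<!-- Likewise, inline styles are blocked; link a stylesheet instead -->
<link rel="stylesheet" href="/css/site.css">
```

Under such a policy the browser refuses the inline `<script>` block but happily fetches and runs the linked same-origin file, which matches the behaviour described above.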
Probably not exact, but something like that!
Edit: Technically, I think this requirement goes away with SAFE Browser, because the CORS/CSP checks were only done by the proxy. But I think it’s good practice, so it should be continued; it is gradually becoming the default in all web browsers because it is needed on the clearweb.
I’ll give it another go: my original attempts did include both linked and inline JS. I hadn’t tried a relative directory reference (I usually prefer “/foo/bar” to “./bar”, for whatever reason).
Thanks for the tips, will follow up if I can confirm something working.
Looks like the problem is the proxy interfering with URLs. If I have a complete URL “http://foo.bar.safenet/js/script.js”, it strips the “http://” from the front, and the browser tries to find “foo.bar.safenet/foo.bar.safenet/js/script.js”. If I use “./foo/bar.safenet.js/…” the proxy actually adds an extra “foo.bar.safenet” to the front, which similarly breaks all relative links.
Those changes are visible in the difference between “view source” in the browser and the code I actually uploaded (or downloaded for viewing in a text editor). It seems like the above is guaranteed to break linking between sites. Going to make sure I have the most recent version of everything, because it seems unlikely that a bug like this would have been overlooked.
I saw the same as you on the test set: no evidence of my URLs being specifically altered by the proxy, and the http:// and safe:// protocols were handled (or not) as expected between Firefox and SAFE Browser.
After wiping everything and starting fresh, and further experimenting:
My first attempts to make things work were a bit haphazard, so my interpretations before I put the test set in place weren’t based on coherent data.
I have to re-map the SAFE DNS to the files every time I make a change in order for the changes to propagate.
At first I was poking at a lot of things at once, using a code generator, a CDN for jQuery, etc. Once I settled down to checking one thing at a time and paying attention to the results, it worked out. The proxy does not alter URLs. SAFE DNS has to be remapped whenever the directory contents change.