Competition on the Field

Could anyone explain the advantage a browser such as MaidSafe offers over traditional ones such as Firefox, Chrome and Internet Explorer? How is the user experience improved by having a “decentralized” network underpinning the surf?



It isn’t so much the browsing experience that is improved; it is more the privacy and security aspects. This is especially the case when it comes to saving data to the network. Your data will remain private unless you explicitly publish it. Contrast this with the existing web, where the website you are visiting:

  • Sees everyone who is visiting its site
  • Receives and controls any data that users of the site upload
  • Likely shares this data with third parties for profit
  • Is likely to be targeted for exploitation, since it holds so much juicy personal information

There is one area where the user experience should improve, however, which I should have mentioned: the network will keep all data in perpetuity. So your favorite site or app won’t just disappear one day (as Google is known to do with many of its less popular apps), and there will be a version history for each website, which provides greater transparency.


The more popular a site, the faster the network becomes… I’m not sure of the details behind this. On the current web, if a massive number of folks go to the same site, it will slow down and potentially crash the site.


This is achieved by caching. As a site/file becomes more popular, more nodes will be caching it closer to more people.
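To illustrate the idea (this is only a sketch, not SAFE’s actual code; the node names and route are made up): each node that relays a chunk keeps a copy, so the next requestor is answered from the nearest cache instead of the original vault.

```python
# Sketch of opportunistic caching along a request route. Each hop that
# relays a chunk keeps a copy, so later requests short-circuit the route.

class Node:
    def __init__(self, name):
        self.name = name
        self.cache = {}  # chunk_id -> data (unbounded here, for brevity)

def fetch(route, vault, chunk_id):
    """Walk the route toward the vault; return (data, hops travelled)."""
    for hops, node in enumerate(route):
        if chunk_id in node.cache:     # cache hit short-circuits the route
            return node.cache[chunk_id], hops
    data = vault[chunk_id]             # miss: go all the way to the vault
    for node in route:                 # populate caches on the way back
        node.cache[chunk_id] = data
    return data, len(route)

route = [Node("a"), Node("b"), Node("c")]
vault = {"cat.jpg": b"data"}
_, first_hops = fetch(route, vault, "cat.jpg")   # full trip to the vault
_, second_hops = fetch(route, vault, "cat.jpg")  # answered by the first hop
```

The second fetch never leaves the first node, which is the whole point: popularity fills the caches nearest the requestors.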


Hmm… How much faster are the cached sites?

If I want my site to be fast, does this create an incentive to hire a bot army to download it in vain, thus putting unnecessary strain on the network? Can there be situations where some very big companies hijack the capacity of the network just to make their site faster than a competitor’s site? If so, would it be possible or reasonable to artificially limit the maximum speed of cached sites in order to get rid of incentives to overload the network?

The reason it’s faster is that on the internet, as more requests per second occur, the slower the server is to respond.

On SAFE the nodes closer to the requestors will be caching the data and thus not getting slower. In fact, since the caching nodes are closer to the requestor than the vault holding the files, the response is faster too.

Thus the statement can be made that on SAFE, the more requests a site gets, the faster it gets, especially when compared to the internet and servers.
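A toy model of the contrast (the numbers here are purely illustrative, not measurements): a single server’s response time grows with its request load, while a cached copy near the requestor doesn’t carry that load at all.

```python
# Illustrative-only latency model: one server queues requests, a nearby
# cache does not. The constants are made-up values, not benchmarks.

def server_latency_ms(requests_per_sec, base_ms=20.0, per_req_ms=0.5):
    # queueing delay grows with load on a single machine
    return base_ms + per_req_ms * requests_per_sec

def cached_latency_ms(requests_per_sec, base_ms=10.0):
    # popular data is replicated across caches, so load spreads out
    return base_ms

quiet_server = server_latency_ms(10)     # lightly loaded server
busy_server = server_latency_ms(1000)    # heavily loaded server
quiet_cache = cached_latency_ms(10)
busy_cache = cached_latency_ms(1000)     # unchanged by popularity
```

The point isn’t the specific numbers, just the shape: one curve rises with popularity and the other stays flat.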

Well, that would be an expensive exercise to get maybe a slight speed increase on access. Your bot army would have to be spread worldwide across every section so as to speed it up for everybody. And what do you gain, a 10% increase in first-packet response time? There has been no testing to say definitively what the speed increase in first-packet response time would be. Maybe 1%, or 10%, or 20%, who knows, and is it really worth the expense of a botnet that has to position itself throughout XOR space?

But when you consider that SAFE is more of a parallel access model compared to the internet’s serial access, the speed improvement is only in the first-packet response time, as the rest will arrive much faster than you can process it.

You see, the current internet says give me one packet after the other through the whole file. On SAFE, files have at least 3 chunks, and a request for the file sees the requests for all 3 chunks sent at once; they all arrive approximately together, and it is then your link that is receiving the packets for all 3 chunks. With an internet server, you ask for a file and request-receive packet after packet of the file, and that speed is limited by the server’s load.
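The serial-vs-parallel difference described above can be sketched like this (a toy demo, not SAFE code; the 3-chunk count comes from the post, the 50 ms delay is invented):

```python
# Serial retrieval asks for chunks one after another; parallel retrieval
# fires all chunk requests at once and they arrive roughly together.

import concurrent.futures
import time

CHUNK_DELAY = 0.05  # pretend each chunk takes 50 ms to arrive (made up)

def get_chunk(i):
    time.sleep(CHUNK_DELAY)       # simulate network transfer time
    return f"chunk-{i}"

def fetch_serial(n):
    # one chunk after the other, like packet-by-packet from one server
    return [get_chunk(i) for i in range(n)]

def fetch_parallel(n):
    # all chunk requests in flight at once
    with concurrent.futures.ThreadPoolExecutor() as pool:
        return list(pool.map(get_chunk, range(n)))

start = time.perf_counter()
fetch_serial(3)
serial_t = time.perf_counter() - start

start = time.perf_counter()
fetch_parallel(3)
parallel_t = time.perf_counter() - start
```

With 3 chunks the serial version takes roughly three chunk-delays while the parallel one takes roughly one, which is why only the first response really matters.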


So is it that the sites most popular with me are faster for me? Not that sites which are popular in general are faster in general?

Well, a popular site or the next great cat pic will see people from all over requesting it, and on average caching will be occurring for all of them if there are enough people, approximately on the order of the number of sections in the network. Thus if you have 1000 sections and 5000 people requesting, then you can expect that most if not all of it will be cached eventually for anyone requesting it after those initial ones. Obviously, if that is 5000 requests over a day then caching might not be anywhere near as effective, but that is hardly a high-volume site/pic either.
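A quick back-of-envelope check on those numbers (assuming, for simplicity, that requests land uniformly at random across sections, which real routing won’t exactly do):

```python
# With 1000 sections and 5000 requests spread uniformly at random,
# what fraction of sections see at least one request (and so end up
# holding a cached copy)?

sections = 1000
requests = 5000

# probability a given section is missed by every one of the requests
p_untouched = (1 - 1 / sections) ** requests

expected_cached = sections * (1 - p_untouched)
# comes out around 993 of the 1000 sections
```

So under that simple assumption, 5000 requests are enough to seed a cache in nearly every section, which matches the “most if not all” intuition above.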


Most likely an LRUCache type. I doubt you will see this in Fleming, but soon afterwards.
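For anyone unfamiliar with the term, here is a minimal sketch of the LRU (least-recently-used) eviction policy mentioned above, using Python’s `OrderedDict`; it illustrates the policy only, not SAFE’s actual implementation.

```python
# Minimal LRU cache: bounded capacity, and when full, the entry that was
# used least recently is the one evicted.

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)          # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")        # "a" is now the most recently used
cache.put("c", 3)     # full, so this evicts "b", not "a"
```

The policy fits caching popular chunks well: data that keeps being requested keeps being refreshed, and data nobody asks for quietly falls out.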
