How fast is the SAFE Network?

I know this is like gazing into a crystal ball, but let's say when vaults from home are released, is the network expected to be faster than it is currently, or slower?
What is the expected latency in a worldwide network without caching (assuming only you access the data)? Is it 100–200 ms, or 1,000–3,000 ms?


Also, is there an explanation for how the network resolves data races? As an example, let's say there is an exchange on the SAFE Network and two people in the world click buy at very similar times. How does the network handle such a situation?


RE performance, this is probably a good thread to check out. Obviously testnets probably won't give results that compare closely to the final network at scale, but it's all interesting.

I’ll leave the second question to someone more technical, even though I kind of know the answer. I don’t want to confuse anyone with amateur interpretations :wink:


How fast? Fast enough.

So we can all stream 4k on safe network?

exactly what I wonder

Can’t speak to 4k, but I was certainly able to stream perfectly on all the testnets, even the earliest ones. I’d expect the network to perform better at scale and with optimisations that are constantly being added in each iteration of the network.

RE data race, since no one else has commented I’ll have a quick stab then :wink: Think of it less like competition to get an entry on a ledger and more like signing over ownership of the data. When a group of nodes declare that you own a piece of data (like SafeCoin) it is done in a fraction of a second. There is no blockchain/ledger to have to update and get consistent. The tx is instant with no confirmations required, no fees, no PoW and it is totally private because it is not recorded on any ledger. If you try to ‘double spend’ on SAFE the tx that gets there second will just fail instantly.

I’m sure there is a far clearer way to explain it for technical folks, but that’s kind of how I understand it as a layperson :wink:


Think BitTorrent without the dependency on a high number of seeders. 4K videos are huge, therefore so are the number of chunks. Latency from first click to actual stream start could be milliseconds to seconds, but from there the speed in most cases is only limited by your connection.


IIRC data races are handled as first come. Churn makes this difficult though. Section messages require aggregation of votes from pre-established nodes. Those just joining are left out until consensus is reached to validate their participation in the section. Data chains will likely use a similar mechanism for splits/merges. That’s a tough one though. :thinking:


This post and subsequent ones in this thread may help re race conditions. Seems that data chains will be part of the solution.


If your download link can handle 4K then there is no reason a properly setup viewer cannot play that and more.

Think of disk and internet links as essentially serial devices: when data is transferred over the interface, you can only read/get it sequentially, one byte after the other. SAFE is more like parallel access: it gets the requested chunks concurrently (in parallel), and since the chunks are coming from different parts of the world, they arrive at your ISP at around the same time rather than queueing down a single link. So any delay is in getting the first chunk; the rest should be at your ISP, waiting to travel the last "mile" down your internet link, which again is essentially serial.

Playing 4K is not dependent on SAFE, since SAFE will give you your chunks faster than you can handle them (after the initial delay); it comes down to your internet link speed and how the player requests data.
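The serial-vs-parallel argument above can be put into rough numbers. A minimal sketch, where the file size, latency, and link speed are all assumed figures for illustration, not measurements of any real testnet:

```python
# Back-of-envelope timing for a 100 MB file split into 1 MB chunks.
# All figures below are assumptions chosen for illustration.
chunks = 100
rtt = 0.2                   # seconds of request latency per GET (assumed)
link_mbps = 100             # client downlink in megabits/s (assumed)
chunk_secs = 8 / link_mbps  # time for 1 MB (8 megabits) on the link

serial = chunks * (rtt + chunk_secs)   # one GET at a time: pay latency 100x
parallel = rtt + chunks * chunk_secs   # all GETs in flight: pay latency once

print(f"serial: {serial:.1f}s, parallel: {parallel:.1f}s")
# → serial: 28.0s, parallel: 8.2s
```

With these (made-up) numbers, the per-request latency dominates the serial case, while in the parallel case it is paid only once and the client's own link becomes the bottleneck.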


When you look at a large torrent file downloading, you see it getting chunks from random locations of the file, in random order. I assume SAFE will work the same way?

For streaming, you don't want to waste time downloading anything that's more than a few seconds in the video's "future" from your current location in the video. Is there a way to tell SAFE to load the data file front to back, or does it do that automatically? Bonus points if there's a way to actually pause downloading after you have 10 or 20 seconds buffered, so that the network isn't required to send the whole video to people who may choose to pause or stop it midway. More bonus points if there was some incentive for people to write their apps like this. Currently, I think app developers would just download all the data, since there's no disincentive to deter them from loading the network unnecessarily.


I doubt it very much. For one, you (or rather the client) need the chunks in order so the file can be decrypted.

Torrents are just trying to download a file and don't care about the order, so the torrent software requests the file in multiple sections to speed up the process. That's still a form of parallel retrieval.

For SAFE, the client video player app will be requesting the file in order and, if it's done right, requesting blocks in advance, which is what will give you the "parallel retrieval" operation. A good video player buffers the video so that any delays on the internet link will not interrupt playback. So for 4K playing, the video player will request quite a number of chunks initially and then keep requesting more so that it stays ahead of the part being played.

So maybe it requests 10 blocks upfront (the client GETs chunks to fulfil them) and then, when one block is played, the player requests the next block.

Now, if you were just copying a file from SAFE to your disk, then you could have a torrent style of getting multiple sections of the file at once and decrypting it at the end. But a video player is buffering sequential blocks, and if all goes well, only the initial buffering will see or need the "parallel effect", since it might request 10 or 20 blocks, and that equates to 5 to 40 chunks.
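The buffer-ahead behaviour described above can be sketched in a few lines. This is a hypothetical illustration, not SAFE client code: the window size and the `get` function (standing in for a chunk GET) are assumptions.

```python
from collections import deque

def stream_playback(total_chunks, window=10, get=lambda i: f"chunk{i}"):
    # Fill a `window`-deep buffer ahead of the playhead; each time a
    # chunk is "played", request the next one so the buffer stays full.
    # `get` is a stand-in for a network chunk GET (simulated here).
    buffered = deque(get(i) for i in range(min(window, total_chunks)))
    next_idx = len(buffered)
    played = []
    while buffered:
        played.append(buffered.popleft())   # play the oldest buffered chunk
        if next_idx < total_chunks:         # top the buffer back up
            buffered.append(get(next_idx))
            next_idx += 1
    return played
```

Only the initial fill of the window happens in a burst; after that, the player requests exactly one new chunk per chunk played, which is the steady-state behaviour described above.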


That was exactly my point. How do I ensure it will be delivered sequentially? Or if that is indeed the default, will there be a way to allow BitTorrent-style downloading for faster speeds?

That is up to the app. If it asks for one chunk at a time, then it gets the chunks in that order. If it asks for 10 chunks at once, then it's up to the app to keep its buffers in order. This is the same for file download apps that ask for more data than fits in one packet: the application and/or protocol stack has to place the packets into the buffer in the correct order. This happens with current protocol stacks all the time, including in browsers. The SAFE client is just a protocol stack as well, so if you ask for 10 chunks to be placed in 10 MB of buffer space, the chunks will be placed in that order.

Now, a torrent-style file downloader would split the file up into, say, 5 or 10 sections (like torrents) and treat each separately. Then, when the file is downloaded, it asks the client to decrypt the received data.

A video app, and most apps, will simply rely on the client to provide the decrypted data into their buffers, and the client will handle all the organising, just like the TCP/IP protocol stack does for packets that arrive out of order.
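The out-of-order reassembly being described works like this sketch (the `(index, data)` arrival format is an assumption for illustration):

```python
def reassemble(arrivals, total):
    # Chunks arrive out of order as (index, data) pairs; place each at
    # its slot and deliver only the contiguous in-order prefix, the way
    # a TCP stack hands ordered bytes to the application.
    slots = [None] * total
    delivered = []
    next_needed = 0
    for idx, data in arrivals:
        slots[idx] = data
        # Flush everything that is now contiguous from the front.
        while next_needed < total and slots[next_needed] is not None:
            delivered.append(slots[next_needed])
            next_needed += 1
    return delivered
```

For example, `reassemble([(2, "c"), (0, "a"), (1, "b")], 3)` delivers `["a", "b", "c"]`: chunk 2 is held back until chunks 0 and 1 have arrived.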


Sorry to be thick, but again, how? Saying “asks for one chunk at a time” is just a fancy way of saying “read sequentially”. It doesn’t really add to my knowledge.

You have an mp4 file that I want to watch. You saved it as a normal file. If I attempt to read that file with the SAFE browser in an HTML5 player, I do not believe I will be able to begin watching it until it is more or less completely downloaded. Please correct me if I'm wrong.

So, I need to write my own javascript player, that will, what? Get the datamap and then request each chunk sequentially, and then pass that to the HTML5 player somehow as data attributes or something? Just trying to nail down exactly what needs to happen.


The first request for any file on the safe network is to get the ‘datamap’ file. This contains the list of 1 MB chunks the file has been split into.

The app can then choose to either

  1. download the first chunk, then the second chunk, then the third chunk, etc., sequentially, each only after the prior one has completed
  2. download the first chunk, and simultaneously the second chunk, and simultaneously the third chunk, etc., until the client bandwidth is reached; then the app waits for any chunk to finish to free up some bandwidth, and starts downloading the next chunk
  3. simultaneous download but with chunks in a random order rather than sequential (which sort of naturally happens from point 2 anyway, since chunks will finish downloading in different orders)

The motivation for an app to download chunks in a random order isn’t really clear to me, and simultaneous sequential download seems close to an optimum strategy to me.

So to absolutely ensure sequential download a chunk must be completely downloaded before the next chunk is started. But this would greatly restrict performance. Some degree of parallel downloading will certainly be a better strategy for performance.

This is separate from the question of ‘can the app use the file in a partial state’. Some can, some can’t. Depends on the file.
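Strategies 1 and 2 from the list above could be sketched like this. The chunk GET is simulated and the function names are hypothetical; only the ordering behaviour is the point.

```python
from concurrent.futures import ThreadPoolExecutor

def get_chunk(name):
    # Stand-in for a network GET of one chunk listed in the datamap.
    return f"<data:{name}>"

def download_sequential(datamap):
    # Strategy 1: one chunk at a time, strictly in order; each GET
    # starts only after the prior one has completed.
    return [get_chunk(name) for name in datamap]

def download_parallel(datamap, max_inflight=4):
    # Strategy 2: keep up to max_inflight GETs running at once.
    # pool.map still yields results in datamap order, so the file is
    # assembled sequentially even though the fetches overlap in time.
    with ThreadPoolExecutor(max_workers=max_inflight) as pool:
        return list(pool.map(get_chunk, datamap))
```

Both return the chunks in datamap order; the only difference is how many requests are in flight at once, which is exactly the performance trade-off described above.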


I would rather add a "4." option: downloading in order from the least-distributed chunk to the most-distributed chunk (+ random tiebreak for chunks with equal distribution). That's the way BitTorrent works. This maximises the probability of using all of the available bandwidth.
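That rarest-first ordering could be sketched as follows, assuming (hypothetically) the app knows a replica count per chunk; the seeded RNG is just to make the tiebreak reproducible here:

```python
import random

def rarest_first_order(replica_counts, seed=0):
    # Order chunk indices from least- to most-replicated, breaking ties
    # randomly, as in BitTorrent's rarest-first piece selection.
    rng = random.Random(seed)
    indices = list(range(len(replica_counts)))
    rng.shuffle(indices)  # random tiebreak among equally rare chunks
    # sorted() is stable, so the shuffled order survives within ties.
    return sorted(indices, key=lambda i: replica_counts[i])
```

So with counts `[3, 1, 2, 1]`, chunks 1 and 3 (rarest) come first in some random order, then chunk 2, then chunk 0.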