If the data chunks are distributed what is the overhead of downloading files?
How does it compare to conventional servers?
My understanding is that large files should download quickly, since many chunks can be fetched in parallel, but throughput on any individual chunk is likely to be limited by the serving node's upload speed, which is slow on most residential ISP connections.
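To make that concrete, here's a back-of-the-envelope model (my own illustration, not how the SAFE Network actually schedules transfers): assume each chunk is served by one peer whose upload bandwidth caps per-chunk speed, while multiple chunks can be fetched at once. The chunk size, peer upload speed, and parallelism numbers below are all made up for the example.

```python
import math

def download_time_s(file_mb, chunk_mb=1.0, peer_upload_mbps=1.0, parallel=8):
    """Rough wall-clock download time estimate in seconds.

    file_mb:          total file size in megabytes
    chunk_mb:         chunk size (illustrative value, not a protocol constant)
    peer_upload_mbps: assumed upload bandwidth of a single serving peer (MB/s)
    parallel:         how many chunks we fetch simultaneously
    """
    chunks = math.ceil(file_mb / chunk_mb)
    time_per_chunk = chunk_mb / peer_upload_mbps  # one peer's upload caps this
    waves = math.ceil(chunks / parallel)          # batches of parallel fetches
    return waves * time_per_chunk

# A 100 MB file served by slow home uploaders:
print(download_time_s(100, parallel=1))  # one chunk at a time
print(download_time_s(100, parallel=8))  # parallel fetches divide the wait
```

The point is just that aggregate speed scales with how many chunks you can pull at once, even though any single chunk crawls along at one uploader's pace.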
Another thing to take into consideration is how popular a piece of content is. If content is extremely popular, its chunks will be cached all over the SAFE Network, making it extremely quick to access. If it's a file only being accessed by one or two people, its chunks won't be spread across the network, which reduces download speeds.
I know that doesn’t answer the question, but here is one recent topic for a possible glimpse of what’ll come with speeds - Benchmarking Testnets
If you search around the forum, I believe there are other topics with more numbers. If I come across any more I'll update this response.