I have some beginner questions about the storage mechanics. I've probably got parts of this wrong, so feel free to correct me.
(1) Let's assume I have a 10 MB file that I want to store on the SAFE network.
- The file is encrypted and split into 10 chunks of 1 MB each.
- Each chunk is stored on 4 different vaults, which means my initial file is now distributed across 40 different vaults (sketched in code below).
- Whenever I send a GET request, the file is reassembled from the 10 chunks, fetched from whichever vaults are online.
My question here is: what happens if all 4 vaults that hold the exact same chunk go down? In that case I'd expect it to be impossible to retrieve the file. Or is the chunk copied to another vault as soon as one of its vaults goes down?
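To check my own arithmetic, here is a minimal sketch of how I currently picture the chunking and replication. The fixed 1 MB chunk size and the 4 copies per chunk are just my assumptions from above, not confirmed network parameters:

```python
# Naive model of chunking + replication as I understand it.
# CHUNK_SIZE and COPIES_PER_CHUNK are my own assumptions.
CHUNK_SIZE = 1 * 1024 * 1024   # assumed fixed chunk size: 1 MB
COPIES_PER_CHUNK = 4           # assumed replication factor

def chunk_count(file_size: int) -> int:
    """Number of chunks, rounding up for a partial last chunk."""
    return -(-file_size // CHUNK_SIZE)

file_size = 10 * 1024 * 1024   # the 10 MB file from my example
chunks = chunk_count(file_size)
print(f"{chunks} chunks x {COPIES_PER_CHUNK} copies = "
      f"{chunks * COPIES_PER_CHUNK} stored copies on the network")
# -> 10 chunks x 4 copies = 40 stored copies on the network
```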
(2) If it's true that the entire file becomes unavailable once all vaults holding one of its chunks are down, does that mean that large files are exponentially more dependent on a stable network?
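Here is the back-of-the-envelope model behind that worry. Again, the per-vault uptime `q` and the 4 copies per chunk are only my assumptions:

```python
# If each vault is online independently with probability q, a chunk is
# unreachable only when all 4 of its copies are down, and the whole file
# needs every one of its chunks - so availability decays exponentially
# with the chunk count. (q and the copy count are my assumptions.)
q = 0.9                        # assumed per-vault uptime
p_chunk = 1 - (1 - q) ** 4     # at least one of the 4 copies online
for n_chunks in (1, 10, 100, 1000):
    p_file = p_chunk ** n_chunks   # every chunk must be retrievable
    print(f"{n_chunks:>4} chunks -> file retrievable with p = {p_file:.4f}")
```

So the dependence is exponential in the number of chunks, though with 4 copies per chunk the base is very close to 1; that is exactly what makes me wonder whether the network actively re-replicates chunks when vaults drop out.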
(3) I have a 100 KB file. How is it chunked?
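Under my naive fixed-size model from above, the whole file would fit in a single chunk, which seems to defeat the purpose of splitting in the first place, so I assume the real scheme handles small files differently:

```python
# Applying my naive 1 MB fixed-chunk assumption to a 100 KB file
# (the chunk size is my assumption, not a confirmed parameter):
CHUNK_SIZE = 1 * 1024 * 1024
file_size = 100 * 1024
print(-(-file_size // CHUNK_SIZE))  # -> 1 chunk, which can't be the whole story
```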