I thought I might make a topic on this because it has been bothering me for a while now.
The current internet works using centralised servers which have high bandwidth. The idea of the SAFE Network is that data is stored on the computers of the farmers. However, compared to those servers, bandwidth will probably be really bad.
Sure, because it’s decentralised the number of requests per farmer will be lower, so less bandwidth per farmer is required compared to centralised servers. But given that the global average upload speed is a whopping 12Mbps, I can’t imagine the network being fast (even when a file is stored across multiple farmers). At the very least it will make the internet very unstable (sometimes fast when you connect to a fast farmer, sometimes slow when you are connected to a slow one).
Well, kind of, but fetching parts in parallel is the key, so any initial latency is spread across many machines. Here we ensure a minimum number of machines hold any piece of data, so there are no missing pieces. It’s like BitTorrent, but where you are guaranteed seeders are online, if that makes more sense.
When you load a page smaller than 1MB, you get it from the fastest of the vaults holding a copy (there are 8 copies). If the fastest one only has 10Mbps, that takes about 1 second.
If the site is 10MB, you connect to 10 farmers in parallel and it will still take about 1 second to load all the content, as long as the fastest vault for each of the 10 chunks (the 1MB parts of the whole site) has about 10Mbps or more and you have about 100Mbps download speed.
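The arithmetic above can be sketched in a few lines. A minimal, purely illustrative calculation, assuming 1MB chunks, 8 copies per chunk, and the speeds quoted (these are the post's example numbers, not actual SAFE network parameters):

```python
# Back-of-the-envelope sketch of the parallel chunk-fetch arithmetic.
# Assumed numbers (chunk size, copy count, speeds) come from the post
# above and are illustrative only.

CHUNK_MB = 1   # each chunk is 1 MB
COPIES = 8     # replicas held per chunk

def fetch_time_s(chunk_mb, fastest_copy_mbps):
    """Time to pull one chunk from the fastest vault holding a copy."""
    return chunk_mb * 8 / fastest_copy_mbps   # MB -> Mbit, divide by Mbps

# A single 1 MB page, fastest copy uploading at 10 Mbps:
print(fetch_time_s(CHUNK_MB, 10))            # 0.8 -> "about 1 sec"

# A 10 MB site split into 10 chunks fetched in parallel: the total time
# is governed by the slowest chunk, not the sum of all chunks, provided
# our own 100 Mbps downlink can absorb 10 x 10 Mbps at once.
chunk_speeds = [10] * 10                     # fastest vault per chunk, Mbps
parallel_time = max(fetch_time_s(CHUNK_MB, s) for s in chunk_speeds)
print(parallel_time)                         # still 0.8 s
```

The point of the `max` is that parallel fetches overlap, so ten chunks take no longer than the slowest single chunk.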
And if millions of users load the same page at the same time as you, there could be a bottleneck, but because other vaults get a reward for immediately making another copy in their own cache based on that high demand, there is no bottleneck even in that case.
The farmers race to deliver you the same copy as fast as possible, so the lowest latency to you and the highest bandwidth wins.
With better global internet speeds, loading times will be lower than they would be today.
That number surprises me. In the USA, with Comcast, which is one of our major internet providers, I get 100Mbps download speed … and only 5Mbps upload speed. I don’t think there is any option to increase the upload speed, for any amount of money, on any of their residential plans. (They do have a business plan that provides symmetrical download/upload speeds for $300 a month, I believe, but nobody uses it, for the most part, except actual offices.)
I live on the East coast of the US and get 12 Mbps upload from Comcast. This was true back when I had 150 Mbps download and remains true now that I’ve upgraded to 350 Mbps. I wonder if you could upgrade your modem and get a little more out of it? I did recently and went from 250 to 350 Mbps download without a change in service, just a better modem.
To be honest I would not care if the write speed of SAFE was 10X slower. As long as read speed is within human tolerance. I suspect it will be pretty fast, but I am much more interested in it being secure, private and accessible to all.
Then we can get into speed tests and competition. Overall the price is worth it. I tend to think of it like this: it’s faster to go in and out of doors that have no locks, but you wouldn’t want to put valuable stuff in there. Right now it’s worse: the big companies have the locks and we are only sometimes allowed in.
Don’t forget that with caching vaults the number of available copies could be very high, and offer low latency. Caching (for read and write) has the potential for huge performance optimization. Imagine the difference between using a page cache vs no page cache on your local OS.
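A toy sketch of why caching matters so much here: once a popular chunk has been fetched, subsequent requests are served from the cache instead of going back to the origin vault. This is purely illustrative and not SAFE's actual caching design; the chunk IDs and counter are made up for the example:

```python
# Minimal illustration of opportunistic caching: a popular chunk only
# hits the origin vault once; every later request is a cache hit.
# Not SAFE's real caching mechanism -- just the page-cache analogy above.

from functools import lru_cache

fetch_count = {"origin_trips": 0}   # counts round-trips to the origin vault

@lru_cache(maxsize=256)             # the "caching vault"
def fetch_chunk(chunk_id):
    fetch_count["origin_trips"] += 1
    return f"data-for-{chunk_id}"   # hypothetical chunk payload

# A popular chunk requested 1000 times by downstream users:
for _ in range(1000):
    fetch_chunk("chunk-abc")

print(fetch_count["origin_trips"])  # 1 -> 999 of 1000 requests were cache hits
```

The same effect in the network means the 8 stored copies are only the floor; hot data can be replicated into many more caches near the demand.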
Mate, I understand you may get frustrated at people at times, but please keep it classy. You are running a team, and supporters (like me, who are lurkers rather than big posters) like to see you keeping it calm, collected and in control. Keep crushing it and know there are more supporters than you may see from those who post on this forum. Can’t wait to see what you do in the coming months. Those that post too often pressuring don’t understand the pressure to build, and even more how hard it is to deliver. My 2 cents.
I do a lot of home hosting for myself and services I run for customers, and I have way more bandwidth and hardware than I will ever need. I have 2 network lines running separately (each individually backbone cabled directly from the switches to a colocated rack), and they’re mostly redundant. All of the cables support 10gb/s (the speedtest shows around 1gb/s because the internal rack networking is 1gb/s), so I should be able to add 10gb/s of network capacity at launch by utilising the redundant line.
Perhaps this is a “britishism” but his comment was perfectly classy, I didn’t really sense any frustration. Sometimes people say “to be honest” as a retort, sometimes people are just being honest.
Plus: it’s perfectly valid, the upload speed isn’t all that important. The majority of users are downloading almost exclusively, the current internet’s upload/download numbers are incredibly asymmetric and I wouldn’t expect the SAFE network to fall far from that trend.
Content creators don’t mind waiting an extra 10 seconds to upload a website, or a blog post, or whatever because it’s small fry in terms of how long the content actually takes to produce. As long as the download speeds are palatable, we’ll be fine.
Yes, but isn’t one man’s download another man’s (or men’s) upload? Shouldn’t we still be concerned with how fast the farmers can upload the requested info? Caching, I understand, will help this situation. I suppose only by thoroughly testing and measuring the entire data itinerary will we get a good idea of how efficiently the requester/farmer/cache system is working, and a true test probably won’t be available until the network has some maturity under its belt. I expect a lot of tinkering with caching capacity in the beginning.
It’s important that we distinguish between upload/download speeds and write/read speeds.
It can take a while to write something to the network (because you have to wait for the data to be dispersed around several nodes and for the nodes to agree that it’s a valid write, etc).
This is independent of upload speed, which is the raw speed those nodes can deliver the data at once they agree it’s a valid piece of data.
It’s possible to have an incredibly fast upload speed, where people receive data within a second, but for the write speed (how long it takes for new data to be available to the network as a whole) to be quite slow, based on the algorithms used for reaching quorum.
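The distinction can be made concrete with a toy model, which is an assumption of mine rather than SAFE's actual consensus algorithm: a read races the replicas and the fastest copy wins, while a write must wait for a quorum of acknowledgements before it counts as done.

```python
# Toy illustration (not SAFE's real algorithm) of why read and write
# latency differ: a read is served by the single fastest replica, but a
# write only completes once `quorum` nodes have agreed it's valid.

def read_latency(replica_latencies_ms):
    # Fastest copy wins the race to deliver the chunk.
    return min(replica_latencies_ms)

def write_latency(replica_latencies_ms, quorum):
    # The write is "done" when the quorum-th fastest ack arrives,
    # so one slow node can't stall it, but the majority must answer.
    return sorted(replica_latencies_ms)[quorum - 1]

nodes = [30, 45, 120, 200, 500, 80, 60, 350]  # hypothetical ack times, ms
print(read_latency(nodes))                    # 30 ms  -> reads feel instant
print(write_latency(nodes, quorum=5))         # 120 ms -> writes wait on agreement
```

Same set of nodes, same network speeds, yet the write takes four times as long, purely because of the agreement step.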
Hopefully, the early adopters will resolve that problem. Most of the people here will be running home vaults (because who doesn’t want to earn safecoin at launch?) and providing far more bandwidth than they use in actual browsing.
Most people with a home fibre connection are generally incapable of saturating that connection for more than a few minutes at a time. Take the case of the 36mb/s UK home fibre packages offered by BT:
A home user has a 36mb/s network connection
The home user decides to stream a TV show at 1440p (which is unlikely anyway because most browsers are still 1080p, but this is hypothetical) at a bitrate of 6mb/s over SAFE
They’re also browsing Reddit on their phone and downloading photos and articles at a bitrate of 1mb/s
They’re currently using ~7mb/s
They’re able to upload 29mb/s from their vault
They’re able to provide data for ~4 people (as a worst case scenario, most people don’t stream 24/7) even when using the network themselves.
So you can see here, the average early adopter will be able to contribute more than they take.
The issue comes when we get an influx of brand-new users because of hype; at that point we have to rely on the power vault users. That’s people like me who can supply a 10gb/s upload connection, but let’s assume the power vault users (who will realistically be the people providing the majority of uploaded data) only have a 1gb/s home internet connection (which, with Hyperoptic / Telcom in the UK and things like Google FI in the US, is quickly becoming the reality). The equation looks like this:
A power vault user has a 1000mb/s connection
The user is streaming in 4k at 60fps at a bitrate of 25mb/s via the SAFE network
The user is downloading a game via steam on the clearnet at a bitrate of 30mb/s
The user is browsing Reddit at a bitrate of 1mb/s
The user is currently using 25mb/s on SAFE and 56mb/s in total
The user is able to contribute 944mb/s back to the SAFE network
They’re able to provide data for ~157 people streaming at 6mb/s (as a worst case scenario, most people don’t stream 24/7) even when using the network themselves.
In a best case scenario of people mostly browsing webpages at ~1mb/s, they’re able to provide data for ~944 people even when using the network themselves.
The reality is probably somewhere in the middle, where power vault users can earn safecoin by providing data continuously for around 500 people on average, given a 1gb/s connection.
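The power-vault case follows the same arithmetic. A sketch with the post's assumed numbers, where the worst-case divisor of 6mb/s per served viewer is carried over from the earlier home-fibre example:

```python
# The 1 gb/s "power vault" arithmetic, using the post's assumed figures.

link_mbps = 1000
safe_stream = 25     # 4k 60fps stream over SAFE
clearnet_steam = 30  # Steam game download on the clearnet
browsing = 1         # Reddit

total_use = safe_stream + clearnet_steam + browsing
spare = link_mbps - total_use

# Worst case: every served viewer streams at 6 mb/s (as in the earlier
# home-fibre example); best case: they all just browse at 1 mb/s.
worst_case = spare // 6
best_case = spare // 1

print(total_use)   # 56
print(spare)       # 944
print(worst_case)  # 157
print(best_case)   # 944
```

The "somewhere in the middle, around 500 people" estimate is just the midpoint of those worst- and best-case figures.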