The economics of storage

So the longer you store something, the more it should cost. But it is very likely that the series of costs “for all time” diverges to infinity. And then what? That is simply not a sustainable model; it’s ‘wishful thinking’.

Downloading data costs some resources. Storing data to be accessed later costs some resources. Believing that “number go up” economics will solve everything is just kicking the can down the road. And isn’t this network supposed to be self-sufficient and sustainable for potentially centuries?

Even if the value of the network token goes up, that doesn’t solve the underlying scarcity. Induced demand will just bring more data to be stored, and it will vie for space on the network much as transactions vie for blockspace while gas fees skyrocket on a blockchain. The most expensive operation on a blockchain is arguably SSTORE (“store this for all time”). It’s a money loser, and it keeps driving up the cost of running any blockchain as it accumulates more data to store. Just because the SAFE network is much more sharded doesn’t mean it won’t run into the same economic issues.

Nobody is saying that or even considering it a valid approach.

I think this is the confusion. The $ value of the token has no impact on storage costs. They are influenced purely by the minimum storage people are prepared to put on the network. Those costs are balanced against how much people are prepared to pay in tokens. If the token value goes up, you pay fewer tokens and farmers receive fewer tokens. The $ value is only relevant when you consider the total value of the tokens you get: 100 $1 tokens or one $100 token, that type of approach.
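A toy illustration of that decoupling (purely illustrative; the real pricing is internal to the network and not denominated in $ at all):

```python
# Toy illustration only: the network does not price storage in $.
def tokens_charged(resource_cost_usd: float, token_price_usd: float) -> float:
    """Tokens paid for an upload whose underlying resource cost is fixed in $ terms."""
    return resource_cost_usd / token_price_usd

# Same upload, token price rises 10x: uploaders pay 10x fewer tokens and
# farmers earn 10x fewer tokens, so the underlying cost is unchanged.
print(tokens_charged(1.00, 0.10))  # 10.0 tokens
print(tokens_charged(1.00, 1.00))  # 1.0 token
```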

What scarcity? This is about data availability, not scarcity. Making data scarce is tantamount to an attack on humanity IMO.

It’s actually the opposite, more data, more farmers, cheaper store costs. The model is not the same as blockchain models.

I suspect this trips up a lot of blockchain-focused people.


Also, storage costs continuously fall so data that was uploaded previously gets cheaper to maintain every year, and is paid for with a fraction of the payments to store new data. Similarly demand for storage increases all the time.

And as David said, the fiat value of the token is not relevant because it is decoupled from the storage cost by supply (rewards) / demand (fees) which adjust automatically to ensure supply meets demand, or that demand reduces to available supply.


The less it will cost. The cost for all time increases, but the longer you store it, the lower the average cost per unit time.

If you consider the range of infinite series, there are some that diverge to infinity, but there are others that converge. Safe Network is an experiment (as all new things are), so which of these is true isn’t clear as this equation is far too complex to compute. Storage costs will have ups and downs, but longer term if we are going to continue forward as a technological species, they should keep going down for some time yet and reach some fixed level in 100 years or so. The remaining cost is access, yet this cost is subsidized by the network users accessing the data, who are also farmers. Network data deduplication on Safe Network will improve over time as there are more and more chances for data to overlap with existing data mapped to xor addresses. So, over the long run there are good reasons to be optimistic that perpetual, one-time fee storage is possible.

It’s not just economics though, as explained, deduplication and tech innovation are also major factors.

yep!

The cost of storage will be proportional to demand, and the “farmers market” (pardon the pun) will ensure the costs are being negotiated with the network at all times. But this cost is always going to be related to underlying fundamentals, not the token price (as there is competition in markets). The token price going up in value should make it cheaper for those who hold the token early, but many will simply sell the token to those who want to store data; again, markets collectively working to establish an equilibrium. There are no ‘gas’ fees on Safe Network or transaction fees for the token itself, so there is just the cost to put data.
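As a purely illustrative toy (not the actual Safe Network pricing algorithm), that negotiation can be pictured as a cost curve that responds to spare capacity:

```python
# Purely illustrative toy, NOT the actual Safe Network pricing algorithm.
def store_cost(used_space: float, total_space: float, base_cost: float = 1.0) -> float:
    """Cost per chunk rises as the network fills, falls as farmers add capacity."""
    fill_ratio = used_space / total_space            # 0.0 (empty) .. 1.0 (full)
    return base_cost * fill_ratio / (1.0 - fill_ratio)

print(store_cost(10, 100))   # lots of spare space -> cheap to store
print(store_cost(90, 100))   # nearly full -> expensive, which attracts more farmers
```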

Farmers can and will be adding space, as they will be earning a profit. If more and more people put data into the existing cloud (Amazon’s cloud, for example), then Amazon just adds more servers and hardware. There is no blockchain on Safe Network and no block size limits either, so your comparison to Ethereum is apples and oranges.


The less it will cost. The cost for all time increases, but the longer you store it, the lower the average cost per unit time.

No, by definition, storing something for time intervals A and B costs more than for just A. Now if the series diverges to infinity, then you’ll have to admit that for sure the whole scheme is unsustainable.

But let’s say that storing any given piece of data converges to some number because the costs of storing it drop. What is to prevent some jokers from flooding the network with huge amounts of useless data? They have to pay to store it, but only once. And from there they impose a cost on everyone, regardless of whether that data is ever used.

Now, Freenet basically scales back the amount of redundancy of the data that wasn’t accessed recently to make room for data that has been accessed. But it’s cheap for people to just access the data once to “keep it around”. As scarcity appears, competition will occur and it therefore has to cost something to access data.

@dirvine my proposal is to keep the costs for initially storing the data, but also add costs to access the data and thereby “upvote” it to be marked as important to keep in the case of having to free up scarce resources. Now:

  1. if the case of having to free up scarce resources never happens, then no data is ever discarded. But what if there IS scarcity and competition? We will have a Freenet-like way to know what to discard (or scale back redundancy for), and by the same token, if something grows massively popular, more nodes will store it. (The latter is already done on SAFE; so what if something subsequently becomes less popular? This is that mechanism, sketched just after this list.)

  2. If you believe that data should be free and available to everyone (or to poor people), the correct approach is to subsidize it. Not to build in “wishful thinking” economics (ignoring the ongoing costs of storage), but rather to model the costs correctly and THEN have subsidies like Amazon’s “free tier”. It turns out that under adversarial conditions, the “free tier” can and will be abused (such as by miners using free-tier EC2 instances). So then you will have to play a game of cat-and-mouse “means testing” to make sure those you are subsidizing are really individual people, and have quotas for them.
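A rough sketch of the kind of mechanism I mean (hypothetical names and logic, nothing from the actual SAFE codebase):

```python
# Hypothetical sketch of the proposal: paid accesses act as "upvotes" that
# decide what loses redundancy first IF space ever becomes scarce.
# Nothing here exists in the SAFE codebase.
from dataclasses import dataclass

@dataclass
class StoredChunk:
    address: str
    upvotes: int = 0                 # accumulated paid accesses

    def access(self, fee: int) -> None:
        self.upvotes += fee          # paying to GET marks the data as worth keeping

def eviction_order(chunks):
    """Least-upvoted data is scaled back first, and only under scarcity."""
    return sorted(chunks, key=lambda c: c.upvotes)

a, b = StoredChunk("xor:aaa"), StoredChunk("xor:bbb")
b.access(fee=3)
print([c.address for c in eviction_order([a, b])])   # ['xor:aaa', 'xor:bbb']
```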

But giving everyone unlimited amounts of access for free is unsustainable, sorry if I am a bit strident in my writing here — it is because in my opinion this can be a fatal flaw economically for the network, which can otherwise be economically resilient.

It’s not a blockchain, and not nearly as limited in space as one. But induced demand is still a real phenomenon in economics, so the project will have to have a plan for it.


There will most likely be a Safe copy where people will pay to download data. Time will tell which model is better. It is not bad to test different things in parallel…


Privacy. Security. Freedom

The cost. There will be reasons to try and sabotage the network, but flooding it with data is both costly and ineffective. The network doesn’t care; it doesn’t know the value of data. Data that is paid for but never accessed ends up subsidising data that is accessed, making the network more attractive to farmers and users, and more robust.

If you want to harm the network there will be more effective and cheaper ways, so I don’t believe there’s much risk of the above happening, nor of it causing a problem if it does.

If you are correct that in the end some data will need to be dropped, then the design could be amended, so IMO it’s not worth worrying about this at this stage. There are too many unknowns to know how all this will pan out so we must run experiments. So why not aim high? Trying for a perpetual network is surely a worthwhile experiment even if you don’t expect it to work.

I take the same attitude when people criticise Pay the Producer (PtP) on economic grounds, with ideas IMO based in the past rather than a vision for something original, which can change the rules of the game for the better. I can imagine benefits and I want us to see what they might be. It can always be disabled if it is found not to further the goals we signed up for (the Safe Network fundamentals).


Not really. Say storage costs halve every year. You end up with the series 1 + 1/2 + 1/4 + 1/8 + … and some may recognise that series. It for sure does not tend to infinity; in actual fact, it tends toward 2. It never gets to 2, but then we must also consider this:

Day 1, year 1, it’s $1
Day 1 million, it is not yet $2

However, on day 1 million, is a $ worth the same as a $ on day 1? :slight_smile: So the answer is that this cost converges, downwards in real terms. I think many miss this point and assume some tendency towards infinite costs. There isn’t one.
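That series is easy to check numerically (assuming, as above, that the cost of storing a given piece of data halves each year):

```python
# The halving-cost series: 1 + 1/2 + 1/4 + ... Partial sums approach 2 but never reach it.
total = 0.0
yearly_cost = 1.0                # $1 in year 1, assumed to halve every year
for year in range(50):
    total += yearly_cost
    yearly_cost /= 2

print(total)                     # ~2.0 (1.9999999999999998): the all-time cost converges
```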


Alright. By the way for the record, I meant “increasing” as in it costs more to store data for intervals A + B than just for A…

All I am saying is, in programming, I have encountered over and over the question of “should I do X, or Y” and the correct answer was almost always “you should design a system that implements both, and then implement a configuration that lets people interpolate between X and Y”.

You can’t know how something will be used, so I was just advocating to model the costs correctly. (Just like in UBI vs bank loans, people introduce money into the economy by actual needs in the moment instead of an underwriter trying to predict solvency and cashflows of a business 10 years out.)

And then if you want to subsidize something costly, then explicitly do so with a “free tier” and deal with adversarial attacks on that free tier explicitly. That way SAFE network will be impervious to economic attacks, regardless of what the prices of storage become. Each concept will have been modeled correctly and addressed with a mechanism explicitly designed for it.

I have seen decentralized systems paint themselves into a corner with early design assumptions (Ethereum with Proof of Work blockchains, Black Lives Matter by not including the word Also, as in BLAM), and this has led to a lot of derailment (Ethereum’s high gas fees cause pushback preventing adoption; natural rejoinders like All Lives Matter cause pushback preventing adoption), etc. So I am trying to highlight it early.

But this is an experiment and the F in SAFE does mean Free, so :man_shrugging:t2: Maybe the costs will work out!

Edit: wait no, the S means Secure. F just stands for For.


Deeper diving or investigating. There is no argument and no winner here, just intellectual curiosity, and that’s magnificent.

Even this is not necessarily the case. Say to store B you need to prove you have also stored A, by way of a partial or full order, i.e. to prove B you need A (there are more ways, see below).

Or: blockchains need to have stored all states before the current block (PoW or PoS etc.), so storing B without A makes storing B worthless (this is linked to a totally ordered system).

So, taking that even deeper, we may say: OK, I meant A and B are totally independent, as a way to find a case where storing A+B is cheaper than just storing A. Then we need to ask what is independent of what? Turtles all the way down.

The answer, as with much of nature, is that it depends on the surrounding assumptions and functions which is cheaper, workable or possible. So a blanket statement won’t work.

In the case of Safe, storing A shows you are capable and trusted to be paid to store B.

The answer is generally deeper and more interconnected with the whole organism/network or project than it first appears.


It seems like the cost for perpetual storage would end up about 5x the current annual cost. See this post for the data and calculations. Historically every 10 years there’s about a 10x reduction in cloud storage cost. Storage costs seem to be convergent, not divergent.
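The rough arithmetic behind that ~5x figure, assuming the 10x-per-decade trend holds:

```python
# Rough arithmetic behind the ~5x figure, assuming a 10x price drop per decade.
annual_factor = 0.1 ** (1 / 10)                # ~0.794: each year costs ~79% of the previous
perpetual_multiple = 1 / (1 - annual_factor)   # geometric series 1 + r + r^2 + ...
print(round(annual_factor, 3))                 # 0.794
print(round(perpetual_multiple, 2))            # 4.86 -> roughly 5x one year's storage cost
```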


On a bit more of a social angle to free GET, I’m currently reading The Gifts Of Athena [2004] and there are some pretty relevant quotes

Progress in exploiting the existing stock of knowledge will depend first and foremost on the efficiency and cost of access to knowledge.

When the access costs become very high, it could be said in the limit that social knowledge has disappeared

If access costs are low, the likelihood of losing an existing “piece” of knowledge is small.

Access costs thus determine how likely it is that new discoveries and knowledge will be added [to the overall knowledge base] because the lower access costs are, the more knowledge will be cumulative.

Access costs, however, depend not just on technological variables. They also depend on the culture of knowledge: if those who possess it regard it as a source of wealth, power, or privilege, they will tend to guard it more jealously.

Nature poses certain challenges and constraints that matter to the human material condition, and overcoming these constraints is what technology is all about.

To be sure, engineering knowledge during the age of the baroque had achieved some remarkable successes, and besides Leonardo a number of brilliant engineers and inventors are known to have proposed precocious devices: one thinks of Cornelis Drebbel, Simon Stevin, Giambattista Della Porta, Robert Hooke, Blaise Pascal, and Gottfried Wilhelm Leibniz, among many others. Yet obtaining access to their knowledge remained very difficult for subsequent rank-and-file engineers and mechanics, because it was often presented to a selected audience or never published. The Enlightenment began a process that dramatically lowered these access costs.

The significance of the information revolution is not that we can read on a screen things that we previously read in the newspaper or looked up in the library, but that marginal access costs to codified knowledge of every kind have declined dramatically. The hugely improved communications, the decline in storage and access costs to knowledge, may turn out to be a pivotal event.

Another relevant source is the safenetwork.tech fundamentals; free access is one of the fundamental values of the network.

Allow anyone to have unrestricted access to public data: all of humanity’s information, available to all of humanity.

Let anyone browse content anonymously and free of charge

It is crucial that the new decentralised web is without barriers. One of the most important foundations for a global, collaborative platform is that anyone can access public content for free at any time without the need to create an account.


There have been many discussions about how it’s ‘not economically feasible’ to have free GET. foreverjoyful and the ‘sustainability concerns’ topic in particular come to mind.

The thing that especially provokes me on this point is that projects like Filecoin can’t give a simple link to ‘try out Filecoin’, because to try it out you need the coin, so there’s a huge amount of friction compared to clicking a link or opening an exe. I’ve never seen a Filecoin link in the wild, but I’ve definitely seen IPFS links and onion links. Free GET is essential for adoption. Economics is about creating and delivering value; it’s not just about having a token that enables everything to be precisely accounted for.

I agree bandwidth isn’t free. Egress costs for cloud providers are currently the major bottleneck for most services. I was recently chatting with a guy who works on genetic sequencing, and once we both realized we were ‘computer people’ the first thing he brought up was cloud egress costs. It’s a real and legitimate problem.

Zooming way, way out and asking whether there could possibly be an arrangement of humans and atoms that drives information transfer costs so low that they’re effectively free, I’d say yes: it’s absolutely possible we could lay enough fibre, put up enough satellites, do enough maintenance and do enough lawyering that we can deliver information to everyone for free.

This is an interesting angle on pay for GET.

I guess one way is to have all GET include a fee but some requests set the value to 0.

Or another way is some requests have no fee included, and some do.

Either way it seems like adding a fee for GET makes quite a bit of extra work for nodes: either extra bandwidth to include a 0 fee, or extra computation to filter fee vs no-fee requests.

The simplicity of no fee for the most popular and basic action seems pretty nice to me. I’m not really convinced that pay for GET is a good option.
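For concreteness, a hypothetical sketch of those two options (these types and functions are illustrative only and do not exist in the actual Safe Network code):

```python
# Hypothetical sketch of the two GET-fee options discussed above; nothing
# here exists in the actual Safe Network code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class GetRequest:
    chunk_address: str
    fee: Optional[int] = None    # option 1: always carry a fee field (possibly 0)
                                 # option 2: omit it entirely (None) for free GETs

def handle_get(req: GetRequest) -> str:
    # Either way, every node has to branch on the fee before serving the chunk,
    # which is the extra per-request work mentioned above.
    if req.fee:                                       # covers both None and 0
        print(f"credit farmer with {req.fee} units")  # stand-in for a reward step
    return f"serving chunk at {req.chunk_address}"

print(handle_get(GetRequest("xor:abc123")))           # free GET
print(handle_get(GetRequest("xor:abc123", fee=2)))    # paid GET
```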


Free speech and free knowledge… I personally think a pay wall would be very harmful for both adoption and the goal of the network (at least what I think the goal should be…)

… And even if it turns out at some point that the assumption of converging storage costs really was wrong… Wouldn’t an upgrade still be possible later on…?..
… No need to solve problems that most certainly don’t exist… Just to be sure…


When it comes to storage, it would be fun if it were a challenge for scientists to create better DNA storage. SAFE storage should push the usage of all the dormant hard drives and new innovative storage tech. Storage means uploads, which hopefully will result in faster internet connections. Ideally you want to fork data from the clearnet to the SAFE Network, skipping uploads entirely for clearnet data.

I think with exabyte-scale storage, SAFE directly becomes useful for enterprises and scientific research. The priority should be to get as much data onto the network asap, because this will make it valuable for humanity. Storage in a way is also a balancing act: use as little SAFE as possible to battle-test the SAFE Network and prevent early testers in the first few years from losing significant amounts of coins due to the unforeseen… It can become an attack vector if SAFE storage is too low, because then it depends on exchanges or fiat to measure it, while it should preferably be measured in scientific apps and pushing innovation.

Just my clueless pov