Tragedy of the commons

Has there been much discussion on this? I couldn’t find anything specific when searching, so my apologies if I missed it! Given that the Safe Network storage fee would be one-time and permanent, how does this prevent a tragedy of the commons from occurring?

If I, as a farmer, store some data today and receive one safecoin, then stop farming tomorrow, the overall load for tomorrow’s farmers will have increased (i.e. existing data needs to be replicated immediately to any new vaults coming online). Existing data will already have been paid for, so new farmers would face a greater upfront cost before they could accept new PUTs and earn safecoin.

Now, I know we will have deduplication, and resources are predicted to keep getting cheaper over time, but I don’t think that completely removes the problem if there are no persistence renewal costs?

I understand the point you’re making and there are other discussions about this elsewhere, but the quoted sentence indicates something you don’t seem to have quite right.

Safecoin is awarded to farming vaults on the basis of GETs rather than PUTs. So a new vault coming online and acquiring data to store for the network is immediately able to fulfill GET requests, and thus has a chance to receive safecoin. I’m not sure it will be immediate for brand-new vaults, what with node ageing factoring in, but the point is that the more data a node is storing, the more chance it has to earn, even if that data was acquired by relocation from other vaults.
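
To illustrate the point (this is just a toy model, not the actual Safe Network reward algorithm): if each GET lands on a uniformly random chunk, a vault’s chance of serving a given GET scales with the number of chunks it holds.

```python
import random

# Toy model (not the real Safe Network reward algorithm): if each GET
# lands on a uniformly random chunk, a vault's chance of serving that
# GET is proportional to how many chunks it stores.
def simulate_gets(chunks_per_vault, n_gets=100_000, seed=1):
    rng = random.Random(seed)
    vaults = range(len(chunks_per_vault))
    hits = [0] * len(chunks_per_vault)
    for v in rng.choices(vaults, weights=chunks_per_vault, k=n_gets):
        hits[v] += 1
    return hits

# A vault holding 3x the chunks serves roughly 3x the GETs,
# and so has roughly 3x the chances to earn.
hits = simulate_gets([100, 300])
print(round(hits[1] / hits[0], 1))  # ~3.0
```

So even data relocated to a vault, rather than freshly PUT, still contributes to that vault’s earning chances.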

All is profit-driven. It is more profitable to play by the rules than to “attack” (quit). So the whole thing will find its balance quite organically, failing only if there is no benefit to be realized from using SAFE.

Wouldn’t this lead to higher churn rates, though? If I’m a farmer, I don’t want to be stuck with a bunch of ancient “junk” chunks that nobody ever accesses, since I’m paying to store them with no financial return. Naturally, the amount of junk chunks on the network would keep growing over time, and each would only ever be paid for once. How can we quantify a cost that lasts forever without truly knowing when, or if, we will hit a point where resources getting cheaper can no longer outpace it?

Couldn’t this be solved easily by mandating a get request for data every couple of decades or the data expires?

Also, I think the network adjusts the payment for a GET based on how much reserve space is mandated … it also adjusts the cost of a PUT to keep safecoin recycling going.

So if farmers go into short supply, the rate paid for GETs and the price of PUTs go up.

But doesn’t this just mean that over time the cost of network resources would go up and up? Assuming we hit a point where resource advances can no longer outpace it, at least.

The cost of PUTs may go up … it depends on many things, including the value of safecoin itself, but also the costs of storage space and bandwidth … and then there is deduplication …

It’s complicated, and it is just another experiment in the end.

There are good arguments for allowing data to die after a period of time. I don’t know that anyone has the answers here though about what will definitely work or not work. Please feel free to run some numbers. People have looked at this many times on the forum. Do a thorough search and you will find a fair amount of discussion.

Persistent data has never been offered before, so no-one has any clue what the value will be. It could be that in a decade no-one is stupid enough to pay for non-persistent data.

True, but it would be nice to have that as an option. For example, there is some data I wouldn’t mind losing within a couple of days, and I wouldn’t want to pay whatever the permanent cost economically comes out to be if I could pay less to have it accessed a few times and then forgotten about.

There are many types of data on safe network. Only one is persistent AFAIK.

The price drop function is continuous so long as Moore’s law applies, but eventually Moore’s law will hit limits set by the laws of physics. We have some promised advancements like graphene-based storage or even quantum computing, but those aren’t feasible yet, and what comes after that? Nobody is going to pay 32x more; even businesses that need to maintain records forever are not going to think that far ahead, as it wouldn’t be economical.

I’m not attacking the Safe Network; it’s by far my favourite decentralised/cryptocurrency project out there and I think it’ll be HUGE. I’d just like to see no stone left unturned: why risk it when we can protect the network from these unknowns? Would you mind paying again if your data could be stored for a century? Probably not, but at least it sets a quantifiable boundary.

Interesting: https://en.wikipedia.org/wiki/Mark_Kryder#Kryder's_law_projection

I don’t know if rates are holding at 40% per year - I am skeptical … but for the next few years I suspect we will be good. … After that we may slow down a fair bit unless the global economy turns around.

edit - It looks like cloud storage prices aren’t keeping up with Moore’s law … although cloud pricing reflects more than just storage and processing speed. But considering that Safe Network will use most aspects of cloud storage servers, it’s possibly a good metric to look at.
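
To put rough numbers on the compounding (both growth rates here are assumptions for illustration, not predictions):

```python
# Compound storage-density growth: the historical ~40%/yr Kryder-era
# rate versus a pessimistic 15%/yr slowdown. Both rates are assumed
# for illustration only.
def density_multiple(annual_rate, years):
    return (1 + annual_rate) ** years

print(round(density_multiple(0.40, 10)))  # 29 -> ~29x denser in a decade
print(round(density_multiple(0.15, 10)))  # 4  -> only ~4x if growth slows
```

The gap between those two outcomes is exactly why the question of whether decline rates hold matters so much for one-time pricing.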

There has been talk on other threads about mechanisms to archive data which is infrequently accessed. There have also been discussions about temporary data storage.

These things have been considered and will no doubt be considered again. However, until the network launches, we can’t quantify how big an issue it is. It is best to try it and see, I think.

It seems to me you can make a self-adjusting model that balances the limited space of the network against the incentives people have.

Do we allow people to pay a lot to store tons of new data? Do we ever discard old data to make space for the new data?

I think the fundamental problem is that the people who paid to store the old data are not around anymore to engage in a bidding war with those who want to store new data. The price has been paid and that’s it.

Look at Reddit’s and Hacker News’ weighting algorithms. Stories have an exponentially decreasing weight so new stories can get voted to the top. I am not saying we should use that particular model. What I want to say is that they achieve this by taking the log of everything so the log-weight of old stories is CONSTANT. We need something like that in the SAFE network too. The log-price paid upon data submission during the PUT is constant, that’s the constraint. The question is what’s the evolving formula going into the future.
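
For reference, here is a sketch of the commonly published form of Reddit’s “hot” ranking (from the old open-source reddit code), which shows the log-votes-versus-linear-time trade-off being described:

```python
import math
from datetime import datetime, timezone

# Commonly published form of Reddit's "hot" ranking. The vote term is
# logarithmic, so an old story's vote contribution is effectively a
# constant, while the time term grows linearly and lets newer stories
# overtake it.
def reddit_hot(ups, downs, post_time):
    s = ups - downs
    order = math.log10(max(abs(s), 1))
    sign = 1 if s > 0 else (-1 if s < 0 else 0)
    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    seconds = (post_time - epoch).total_seconds() - 1134028003
    return round(sign * order + seconds / 45000, 7)

# One log10 unit of votes buys only 45000 s (12.5 h) of head start:
t0 = datetime(2020, 1, 1, 0, 0, tzinfo=timezone.utc)
t1 = datetime(2020, 1, 1, 13, 0, tzinfo=timezone.utc)  # 13 h later
print(reddit_hot(1000, 0, t0) < reddit_hot(100, 0, t1))  # True
```

The analogy to SAFE would be that the one-time price paid at PUT plays the role of the constant log-weight, while whatever evolving formula the network uses plays the role of the time term.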

The other thing is that, when you take the log or other function of the weight and price, that means safecoins actually decrease in value over time as more and more are minted. This would make sense given there is more storage and so on.

Basically you need to:

  1. Model the price someone would pay to store new data as a function of the network’s current capacity and how saturated it already is

  2. Make a reverse function f (analogous to log being the inverse of the exponential) and then have the constant prices already paid be points on the graph of f

Then you get a proper pricing scheme that reflects what you want forward in time, and no need to guess about future hardware capacity.
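
As a purely hypothetical sketch of step 1 (the function shape, the name `put_price`, and all constants here are made up for illustration; the real network’s economics are undecided):

```python
import math

# Hypothetical self-adjusting PUT price: cheap while the network has
# spare capacity, rising steeply as saturation passes a target fill
# level. All names and constants are illustrative assumptions.
def put_price(base, used, capacity, target=0.5, k=4.0):
    saturation = used / capacity
    # Exponential response to the deviation from the target fill level.
    return base * math.exp(k * (saturation - target))

print(round(put_price(1.0, 10, 100), 2))  # 0.2  (mostly empty: below base)
print(round(put_price(1.0, 90, 100), 2))  # 4.95 (nearly full: above base)
```

A curve like this makes new storage self-rationing: as the network fills up, new PUTs must outbid the scarcity premium rather than compete directly with data whose price was already paid.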

Disclaimer: I have a math master’s from NYU

This is a great summary. Data storage always tends to get cheaper and data requirements always tend to get larger. If this relationship starts to break down, then maybe the network needs to be more discerning. Until that day, this is unlikely to be a problem, IMO.

At worst, the network will be too expensive to store temporal data. However, we have mutable data types for that purpose.

And where in the world do you come up with a guarantee of this assumption for all time?

I didn’t say that people didn’t pay. I said they only paid once. This simply doesn’t take into account the future trade-off between NEW people storing data and OLD people storing data. You are not properly considering the economics and simply “hand-waving” through the cost over time analysis.

There is a cost to KEEP storing something that needs to be analyzed and modeled if the system is going to work. You need to reach an equilibrium that satisfies the vast majority of people or risk the system being blown up.

My disclaimer that I have a math background is information that lets you know where I am coming from, what I care about, how I think. It’s not a weakness. Avoiding economic and mathematical analysis and hoping it will all just work out is weakness.

The cost to KEEP storing something is pretty negligible, though. Capacity will keep increasing, driving down cost per byte forever.

For example, someone paid a premium for a TB of storage, which is a decent amount of space right now. The standard storage capacity will likely be an exabyte in 30 years, which would make storing a TB not really a big deal. After all, people are walking around with 100x-1000x that amount of free space in their pocket, on their phone.

That’s a very handwavy analysis. Suppose the cost goes down by a smaller and smaller factor each year. Or suppose it even goes down by a constant factor forever (which is physically impossible), say r = 0.9.

Then guess what: the INTEGRAL / SUM of the cost function still converges to 1/(1 - r) times the first year’s cost, i.e. 10x for r = 0.9 and 100x for r = 0.99. And if the decline slows over time, the geometric progression formula only gives a lower bound.
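
Concretely, assuming a constant annual cost ratio r (the function name `lifetime_cost` is just illustrative):

```python
# Total storage cost over time when the yearly cost shrinks by a
# constant ratio r: the geometric series sums to first_year / (1 - r).
def lifetime_cost(first_year, r, years=None):
    if years is None:
        # Closed form over an infinite horizon (requires 0 < r < 1).
        return first_year / (1 - r)
    return first_year * (1 - r ** years) / (1 - r)

print(round(lifetime_cost(1.0, 0.9), 2))      # 10.0  -> 10x first-year cost
print(round(lifetime_cost(1.0, 0.99), 2))     # 100.0 -> ratio nearer 1 blows up
print(round(lifetime_cost(1.0, 0.9, 30), 2))  # 9.58  -> most cost falls early
```

The finite-horizon case shows that almost the entire lifetime cost is incurred in the early decades, which is why the assumed decline rate matters so much.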

That totally depends on what the original price is relative to the storage utilized. Since you don’t know the relative price of space, you can’t make that assumption. If the average free space on a hard drive today is 500 GB, and the average storage people want/need is 50 GB, we essentially have infinite space, which will only keep growing. That’s not even counting those who will choose to farm with dedicated hardware and/or datacenters.