It seems to me you can make a self-adjusting model that balances the network's limited storage against the incentives people have.
Do we allow people to pay a lot to store tons of new data? Do we ever discard old data to make space for the new data?
I think the fundamental problem is that the people who paid to store the old data are not around anymore to engage in a bidding war with those who want to store new data. The price has been paid and that’s it.
Look at Reddit’s and Hacker News’ weighting algorithms. Stories have an exponentially decreasing weight, so new stories can get voted to the top. I am not saying we should use that particular model. What I want to say is that they achieve this by taking the log of everything, so in log-space the weight contributed by an old story’s votes is CONSTANT while time enters additively. We need something like that in the SAFE network too. The log-price paid upon data submission during the PUT is constant; that’s the constraint. The question is what the evolving formula looks like going into the future.
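To make the "constant in log-space" idea concrete, here is a sketch of Reddit's published hot-ranking formula (the epoch constant and 45000-second divisor come from Reddit's open-sourced code; the variable names are mine):

```python
import math
from datetime import datetime, timedelta, timezone

# Reddit's reference epoch from its open-sourced ranking code.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(score: int, submitted: datetime) -> float:
    """Reddit-style hot rank: votes enter through log10, time enters linearly."""
    order = math.log10(max(abs(score), 1))         # votes only move the rank logarithmically
    sign = 1 if score > 0 else (-1 if score < 0 else 0)
    seconds = (submitted - EPOCH).total_seconds()  # fixed bonus locked in at submission
    return sign * order + seconds / 45000

now = datetime.now(timezone.utc)
# A 3-day-old story would need roughly 5.76 extra orders of magnitude of
# votes (3 * 86400 / 45000) to keep pace with a brand-new one.
old_hit = hot(1000, now - timedelta(days=3))
fresh = hot(10, now)
```

Each story's time bonus is a constant fixed at submission, and because the vote term is logarithmic, the gap between old and new stories in log-space never has to be "decayed" after the fact. That is the same shape as a constant log-price fixed at PUT time.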
The other thing is that, when you take the log (or some other concave function) of the weight and price, safecoins actually decrease in value over time as more and more are minted. This would make sense, given there is more storage and so on.
Basically you need to:

1. Model the price someone would pay to store new data as a function of the network's current capacity and how saturated it already is.
2. Make an inverse function f (analogous to log being the inverse of exp), and have the constant prices already paid be points on the graph of f.
3. Then you get a proper pricing scheme that reflects what you want going forward in time, with no need to guess about future hardware capacity.
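The steps above can be sketched minimally. Everything here is an assumption of mine for illustration: the exponential price curve, the names `put_price` and `log_price`, and the parameters `base` and `k` are not anything the network defines; the only point is that the price function is invertible, so each payment becomes a fixed point on the graph of its inverse:

```python
import math

def put_price(saturation: float, base: float = 1.0, k: float = 4.0) -> float:
    """Hypothetical PUT price that grows exponentially as the network fills.

    `saturation` is used storage / total capacity, in [0, 1). The exact
    curve is an assumption; what matters is that it has an inverse.
    """
    return base * math.exp(k * saturation)

def log_price(price: float) -> float:
    # f = log, the inverse of the exponential curve above. A price paid at
    # submission time maps to a fixed point on the graph of f: it never
    # moves, no matter how capacity evolves afterwards.
    return math.log(price)

# A PUT on a nearly full network costs more than one on an empty network,
# but each buyer's log-price is locked in at payment time.
cheap = put_price(0.1)
dear = put_price(0.8)
```

With a shape like this, future pricing is driven entirely by measured saturation rather than by guesses about hardware trends: as capacity grows, saturation falls and new PUTs get cheaper automatically, while the points already on the graph of f stay put.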
Disclaimer: I have a math master’s from NYU