RFC 57: Safecoin Revised

There’s no best price, only a price one is willing to pay and a price one is willing to accept. The agents doing the negotiation will be the vaults, but I don’t agree that it should happen through one specific algorithm forced upon vaults across the world. As I said at the start, the payments offered or demanded depend on the economic preferences of the vault owners, and a rigid algorithm is unlikely to be able to express that.

A solution for price negotiation is establishing a free market, as I explained a few posts (and weeks) ago: vaults in a close group would bid for the next chunk, and only the best N out of the M in the close group would get to store it and receive payment. On the other side of the transaction, uploaders would specify a maximum price they are willing to pay, and the chunk would only be accepted if the winners (the best N) in the close group demanded less.
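To make the proposal concrete, here is a minimal sketch of one such auction round. All names (`run_auction`, the bid tuples) are illustrative assumptions, not anything from the SAFE Network codebase:

```python
# Hypothetical sketch of the close-group bidding round described above.
# Names and data shapes are invented for illustration only.

def run_auction(bids, n, max_price):
    """Pick the n cheapest vault bids; accept the chunk only if every
    winner asks no more than the uploader's max_price."""
    winners = sorted(bids, key=lambda b: b[1])[:n]  # (vault_id, ask) pairs
    if len(winners) < n or any(ask > max_price for _, ask in winners):
        return None  # chunk rejected: not enough affordable storage
    return winners

bids = [("vault_a", 3), ("vault_b", 1), ("vault_c", 5), ("vault_d", 2)]
print(run_auction(bids, n=2, max_price=4))  # the two cheapest asks win
```

The key property is that the uploader never pays more than their stated maximum, while vaults compete on price alone.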

3 Likes

I think there can be a simple algo that balances things.

as simple as:

“need space” > “give more reward for GETs” > “people will hear that they can get good money for joining as a node with storage, and will have an incentive to join”
“have lots of space and lots of new nodes trying to connect and offer more storage” > “put joining nodes on hold and decrease the PUT cost, so: 1. the storage doesn’t grow bigger 2. people get an incentive to load more data onto the network”

Then, with this simple algorithm, things always balance, because the PUT and GET prices will always make people do what the network needs.
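The two rules above can be sketched as a single feedback step. The thresholds and step size here are made-up parameters, not anything specified by the network:

```python
# Minimal sketch of the balancing loop described above; 0.8, 0.3 and
# `step` are arbitrary illustrative constants.

def adjust(put_cost, get_reward, fill_ratio, step=0.1):
    """Nudge prices toward whatever the network needs more of."""
    if fill_ratio > 0.8:            # running out of space
        get_reward *= 1 + step      # attract new storage nodes
        put_cost *= 1 + step        # slow down uploads
    elif fill_ratio < 0.3:          # too much idle space
        get_reward *= 1 - step      # hold off joining nodes
        put_cost *= 1 - step        # encourage more uploads
    return put_cost, get_reward
```

Run periodically, this keeps nudging prices until the fill ratio settles in the middle band, which is the "always balances" claim in loop form.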

1 Like

I disagree on that; the network should be autonomous in its pricing.

Where there could be a free market is in selling safecoin: if someone wants to PUT 100 GB and it costs 100 safecoins, he might go to the free market for safecoin and find the best price he can get.

But the PUT price and GET price, I argue, need to be fully autonomous and balance automatically based on network needs.

edit: that’s because only the network knows the cost of the operations, and it has to keep a balance so it maintains the incentive for nodes to join when it needs storage and, when it has too much space, gives people a reason to pay the PUT price and make the network bigger.

edit2: why a bigger network? Because the bigger the network, the faster, more secure, and more powerful it gets in serving its intended goals.

1 Like

I propose that there is a UBI for the vaults plus rewards for GETs.

The algorithm deciding how big the UBI and the GET rewards will be is based on the expectation that, as the network grows, PUTs will grow at a crazy rate. Once we get to the point that the whole world uploads data to the network, the PUT income would be enough to cover a calculated UBI and the GET rewards, because we know the internet moves and uploads huge amounts of data.

edit: I propose a UBI because it gives anyone the incentive that they will get a set amount of safecoin just for offering storage. That is essential, because to offer storage one needs to invest in hardware and a stable internet connection, and with a UBI they know that in time they will get their money’s worth back. The GET rewards are the incentive that, in a random way, they may make money faster.

edit2: A UBI is also an incentive to keep the vault available 24/7, while the GET rewards incentivise the user to get the fastest hardware and internet connection. There should also be some kind of reward for caching for the network, because the network needs caching too, for rapid response speeds.
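The proposed payout model boils down to two terms: a flat UBI scaled by uptime, plus a variable per-GET reward. A toy sketch, with all numbers invented for illustration:

```python
# Hypothetical payout model for the UBI-plus-GET-rewards proposal above.
# The default amounts are arbitrary placeholders, not a real schedule.

def vault_payout(uptime_fraction, gets_served, ubi=100, get_reward=0.05):
    """Pay the UBI only for time actually online, plus per-GET rewards."""
    return ubi * uptime_fraction + gets_served * get_reward

print(vault_payout(1.0, 2000))  # 24/7 vault serving many GETs earns both terms
```

Scaling the UBI by uptime captures the 24/7 incentive, while the per-GET term rewards fast hardware and connections.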

1 Like

That is a weird way to look at it. No computer network will ever get to tell me how much my storage or bandwidth is worth. In other words, there’s a real world outside. It’s not the network but the person who gets to decide what a fair price is. Without a way to express that as a number (practically, computed by an algorithm running on the vault but freely chosen by the user), they are left with a binary choice: stay or leave.

3 Likes

You don’t understand that you will not get more money if there is a free market on the PUT and GET side of things.

A person with the best bandwidth and fastest storage at the lowest cost will crush a common user with normal bandwidth and a slow storage/PC. End result: you will not make ANY money if PUTs and GETs work the way you propose.

Does that sound logical?

edit: also, let’s assume that 8 vaults hold a chunk of data and we use your method of bidding on the GET rewards. Do you think that the network, or the user who wants to fetch some data, will choose your higher price, or will it choose the cheaper bid among the 8 vaults that hold the chunk?

Again, the only one that knows the demand for storage, or the price it should have, is the network, so it’s profitable for all vaults and incentive enough for people and organisations to load data by paying for PUTs.

1 Like

This has been discussed a lot, and it’s not quite right. Where in the world you are will have a lot to say about who is fastest.

1 Like

I’m a bit confused by that sentence but I do want the vaults with the lowest price to win.

Vaults that are getting full will demand a higher price because they know other vaults will too, and they wouldn’t want to miss out on selling their remaining storage at a higher price by selling it cheaply now. On the other hand, if somebody is more interested in short-term profits, or they are new and their vault is just getting filled up, they can instruct it to offer below the market price.
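One way an owner could express that preference as a number is a simple scarcity formula. This is purely illustrative; the function name, the linear formula, and the `greed` parameter are all my assumptions, not part of any SAFE Network design:

```python
# Illustrative only: an asking price that rises as the vault fills up,
# with an owner-chosen coefficient for short-term vs long-term profit.

def ask_price(base_price, used, capacity, greed=2.0):
    """Scarcity pricing: ask more as free space runs out.
    `greed` lets an owner prefer quick sales (<1) or holding out (>1)."""
    fill = used / capacity
    return base_price * (1 + greed * fill)

print(ask_price(10, used=90, capacity=100))             # nearly full: asks high
print(ask_price(10, used=10, capacity=100, greed=0.5))  # eager newcomer: asks low
```

The point is only that the number is computed on the vault but the curve is freely chosen by its owner, as argued above.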

As you can see, the network itself doesn’t need to know anything about how much resource is available, since the market would take care of that far more efficiently than even the best hand-crafted algorithm ever could.

3 Likes

From the point of view that the user needs to keep up with the correct price, I think it’s counterintuitive.

But if you want this bidding to exist as an option, then I think I would consider it.

But shouldn’t we make the network work as simply as possible? The human factor may lead to people playing the network, and people with high IQ and insight may manage to extract more rewards from it, directly leading to losses for people who just use the default network rewards.

1 Like

Two things:

  • work: we want an absolutely fool-proof and un-gamable method and, fortunately, a free market is where “gaming it” is the game, making it a perfect solution to avoid… gaming it?
  • simple: if there is a free market, there’s no need for the network (that is, the core software) to care about setting the price

Let’s face it: the human factor is just there. The question is, are we happy with users just leaving if they are unhappy with the current price, probably never to return, or are we willing to give them more choices?

A few things about this:

  • what’s wrong with that? (ignoring the fact that IQ is bullshit pseudoscience)
  • people would play the network anyway, so turning that into the very mechanism by which it would set the price makes sense (and it has already worked in real life)
  • the better the players, the higher the liquidity, and the better the price would approximate the “real price”

It isn’t a loss if you get what you’re asking for. Moreover, different people have different priorities. As I already mentioned, some may want quick and steady money, while others may hold out and wait for the times when scarcity pushes the price up some more. Both groups would leave if they couldn’t achieve their goals, and some generalized “ideal” solution somebody clever put together while disregarding such things does not exist.

2 Likes

10 posts were split to a new topic: Debate about central planning vs. free market for negotiating the PUT price

I almost always agree with an approach that enables common people’s decisions and disables central authority. The problem I see here is that data are stored forever. Any pricing model should account for future all-time storage costs. This is the reason I do not think any direct pricing made by a deal between vaults and users can work: none of them care about the future costs.

I even do not think vaults should be paid for direct uploads. Who will pay them for storing all the data? How can they survive on mining if new data uploads are too low? This is why I think we should stay with the original idea, where vaults have to store data for free and earn on GETs. Any upload costs are paid to the network, not to vaults. A network with enough free coins can handle paying vaults when there aren’t enough uploaders for some time.

Pricing should be the same for everyone. The cost of an upload can easily be computed from the number of coins in circulation. Yes, maybe some vaults can operate at cheaper costs, or maybe some people do not want to pay so much. But vaults are here for a short time; the first vault to store some data should not get to decide the cost of storing it for all time.

We have a good example in Bitcoin. One simple algorithm can handle it. Yes, it is not perfect, but miners are rational and react to every possible situation. Their behavior gives feedback to the algorithm, which adjusts the difficulty. The feedback is slow, taking days or weeks, but it works. Long term, those small weekly fluctuations in hashrate and difficulty look smooth.
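A toy version of "the cost of an upload can be computed from the number of coins in circulation" could look like the following. The formula and constants are my own illustration, not the actual RFC pricing function; only the 2^32 coin cap comes from Safecoin's design:

```python
# Toy sketch: PUT cost derived from coin scarcity. As more of the fixed
# supply is in circulation, each upload costs more. Not the real formula.

TOTAL_COINS = 2**32  # Safecoin's designed supply cap (~4.3 billion)

def put_cost(coins_in_circulation, base=1.0):
    """Cost rises as fewer coins remain unissued; undefined at full issuance."""
    scarcity = coins_in_circulation / TOTAL_COINS
    return base / (1 - scarcity)
```

Because the input is a single network-wide number rather than per-vault asks, this gives the "same pricing for everyone" property the post argues for.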

5 Likes

Those are good points and I’m becoming rather clueless about the economics at this point.

On the one side, and this is why I came up with the free market idea in the first place, I don’t believe in hand-crafted control of supply and demand. Get it wrong just a little and everything will fall apart with extreme prejudice. Or not. The point is, it’s impossible to tell and that’s scary.

On the other side, you’re correct in that the forever aspect of storage introduces a serious flaw with regard to bidding for the first upload, and the first upload alone.

Effectively, it either doesn’t make sense or it should be carried forward to relocations as well, but then things become complicated for joining/leaving sections. Should we demand or force refunds for the chunks that are no longer stored by a vault before we pay it for the new ones in its new section? Should we just not pay it more until it has reached the same number of chunks it was already paid for? Or, … but I think I’ll just give up at this point and hope for the best.

1 Like

If you serve a GET, you get a reward, so people are incentivised to provide stable, 24/7 access. If you leave for an hour, all those chunks you hold will not produce you any reward!

1 Like

I know what you are talking about. I don’t believe in any handcrafted control of any human behavior, including the algorithmic pricing. As you already said, it is a complex problem with all the unknown unknowns. But let’s face reality: this network is flawed and handcrafted in many ways.

Data stored forever: how much should it cost? Should public data cost more or less than private data? Public data will be heavily used, private seldom, so I suggested making them cheaper. Dirvine thinks public data are important for society, so they should be cheaper than private. Who is right? Whose pricing model is better? We don’t know; we just know it is handcrafted. Downloads are free, no payments. What the hell, FREE? Again handcrafted. There is PtP. I think it is a broken model, which will increase costs and reward the wrong guys, who will misuse it. Again a handcrafted social model. Node ageing: handcrafted. Measuring connectivity, evaluating the hardware properties of a vault: handcrafted.

The network is broken from the beginning. It has so many artificial components that it would have to collapse on a free market instantly. And this is the point: there is no free market for such a network. It will take years until some clone appears. The old internet is not a competitor, and storage solutions aren’t either. This network is a unicorn. We just need to make it big enough to make it hard to fail.

I can’t imagine how to do a free market on all-time storage. It is not possible, or very complicated. A free market requires punishing those who do not set the correct price. But any price negotiation between current vaults and the network, or between vaults and uploaders, does not account for the future vaults storing the data. So the free market punishes not the actors with the wrong pricing, but the network itself, which has to pay vaults for the data in the future.

Unknown unknowns can be both positive and negative, so we should increase our exposure to the positive and decrease the negative. The team is increasing positive exposure by creating so many platform supports, libraries, CLIs, etc. The coin itself will be a unicorn, which heavily increases the probability of positive events. For the negative ones, I just hope my arguments for a mining algorithm that can handle extreme cases will be implemented.

We have an example: the worst blockchain technology among the alts is Bitcoin. Any clone is technically better and has cheaper transaction costs. But still, there are other measures that make it so worthy. That is why I do not care much whether storage costs will be perfectly optimised. But I do care that there is feedback to the network when there are too many or too few free hardware resources. With such feedback the network can adjust the pricing, and even if it takes weeks to adjust, it will work. We just need the algorithm to keep enough coins to support very extreme situations.

It is sad, but we are trying to fix the handcrafted, broken internet with a handcrafted, broken decentralized network. But it will evolve, and soon we will have something better. New networks will come, and this one will evolve too, or die.

4 Likes

What will cause less damage?

An imperfect algorithm, or an imperfect free market?

1 Like

What will not work is trying to control the market with network algorithm

4 Likes

Many good points, but we also must realise that the network will evolve, as optimizations are made and obstacles are avoided.

In the spirit of agile development, you talk with stakeholders regularly and iterate through changes to suit their needs. We cannot know everything from the start - we can just make a good stab at it and then see how it goes. As long as the team is nimble, where there is a problem, a solution can be found.

You can easily end up with paralysis through analysis, which results in a never launching network. There has been a great deal of analysis already and it is time to see how that works out. There is little point worrying about unknowns, when we simply can’t guess how they may materialise - start simple, then tailor to suit.

10 Likes

Thanks @neo, you’ve hit the nail on the head. All through these discussions my intuition has been warning me away from what felt like the temptation (of various bells and whistles) to see them as better than something imperfect but much cleaner and understandable. I think you’ve encapsulated what that nagging was trying to say to me.

EDIT: I’m also reminded of the temptation to over-engineer things. I wonder if that is driven by the pursuit of perfection, which ends up hiding flaws rather than arriving at a solution with fewer flaws.

4 Likes

This is key: everything must evolve and nothing is born perfect. Nothing ever gets to perfect anyway, but as long as it survives and does its job we are all good. Key for me these days is to release with all required features using the minimum code and complexity possible (and here I mean take the time to remove old, complex code; that cost is 100% worth it to me).

17 Likes