RFC 57: Safecoin Revised

I personally like predator-prey because it seems accurate and fits the bio-mimicry of natural systems that is often found in MaidSafe’s engineering.

1 Like

Predator-prey can lead to this:

More specifically: Chaos in Low-Dimensional Lotka-Volterra Models of Competition

I would prefer something less prone to chaotic behavior.

5 Likes

They’re looking at competing species though, so I’m not sure that fits: farmers aren’t really a competing species but a single species that follows the same rules for the limited resource (Safecoin). I guess you could even think of farmers as a pack in a way. It seems to me it’s a purer predator-prey model simply because there is only one species each of farmer and uploader.

I’m not qualified on the maths so my assessment is purely high level, just my take on it.

1 Like

How about:

consumer/worker --> shopkeeper --> consumer/worker

Maybe not abstract enough?

Otherwise I prefer either the tides or the spring-mass-damper model you listed. The others have negative connotations for me.

2 Likes

I still feel the original idea in the safecoin algorithm was a reasonable way to address the issue.

  • It was somewhat based on the free space
  • It had a scarcity component: the more coins in existence, the fewer coins the algorithm would create

My thoughts were to add some parameters to that

  • A lower limit. The 2^-63 lower limit was not going to be good, since any reasonable amount of free space meant a near-zero cost to PUT and near-zero farming rewards
  • An upper limit. 1 safecoin per PUT was also somewhat high, though maybe not as bad as the 2^-63 lower limit. Maybe the minimum price should be around 1000 nano safecoin and the maximum could still be 1 safecoin, but only when free space is virtually nothing (which hopefully never happens)
  • Rather than an inverse law for determining the safecoin price, something between a linear and an inverse function, so that the price rises from the lower limit sooner but climbs more gently as space becomes less plentiful, instead of the inverse function where the price only really rises when space is in very short supply (see the rough sketch after this list)
  • The original idea put the difference between the (near) minimum and (near) maximum price into a very small range of % free space. To say the previous point differently: the range of % free space covering the minimum-to-maximum PUT price needs to be much larger
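
To make the third point concrete, here is a minimal sketch of such a curve. The floor and ceiling follow the limits suggested above, but the power-law shape and the EXPONENT value are my own illustrative assumptions, not part of the original algorithm:

```rust
/// Illustrative PUT-price curve sitting between a straight linear law and a
/// hard inverse law. EXPONENT = 1.0 gives the linear case; larger values move
/// the curve toward the inverse-like shape where the price only spikes when
/// space is nearly gone. All constants are assumptions for illustration.
const MIN_PRICE_NANOS: f64 = 1_000.0;          // ~1000 nano safecoin floor
const MAX_PRICE_NANOS: f64 = 1_000_000_000.0;  // 1 safecoin ceiling
const EXPONENT: f64 = 2.0;

/// `free_fraction` is the section's free space as a fraction in [0, 1].
fn put_price_nanos(free_fraction: f64) -> f64 {
    let used = (1.0 - free_fraction).clamp(0.0, 1.0);
    MIN_PRICE_NANOS + (MAX_PRICE_NANOS - MIN_PRICE_NANOS) * used.powf(EXPONENT)
}

fn main() {
    for free in [0.9, 0.5, 0.1, 0.01] {
        println!("free {:>5.1}% -> {:>12.0} nanos", free * 100.0, put_price_nanos(free));
    }
}
```

With EXPONENT = 2 the price already starts moving away from the floor at moderate fill levels, instead of staying flat until the network is nearly full.
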
3 Likes

I think best would be starting out with the most basic criteria:

  • the network should never run out of coins
  • the network should never fill up

NOTE: As @neo has noted, I messed up the order of these points but you’ll figure it out :slight_smile:

The first point could be guaranteed if the PUT price approached infinity when running out of space, so we’d want “free space” (or a suitable proxy) in the denominator for the price function.

Similarly, the second point requires that rewards approach zero when running out of coins, so we’d want “free market cap” (or a suitable proxy) in the numerator and/or “coins in circulation” (plus a constant) in the denominator for the reward function.
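
As a minimal sketch (assuming simple rational forms; the constants k_price and k_reward and the exact shapes are placeholders, not a proposal):

```rust
/// PUT price with "free space" in the denominator: the price grows without
/// bound as free_space approaches zero.
fn put_price(k_price: f64, free_space: f64) -> f64 {
    k_price / free_space
}

/// Farming reward with the un-issued supply ("free market cap") in the
/// numerator and coins in circulation (plus a constant) in the denominator:
/// the reward approaches zero as the un-issued supply runs out.
fn farming_reward(k_reward: f64, total_supply: f64, in_circulation: f64) -> f64 {
    let free_cap = total_supply - in_circulation;
    k_reward * free_cap / (in_circulation + 1.0)
}

fn main() {
    println!("price near full: {:.1}", put_price(1.0, 0.01));
    println!("reward near fully issued: {:.6}", farming_reward(1.0, 100.0, 99.0));
}
```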

We’ll just need to shoehorn in whatever desired behavior is not achieved by the above, though they already do most of what we need, I think.

Then, some parameters, e.g. what should the theoretical maximum reward be (when there are no coins in circulation) or how flat the price / reward rise should be when we’re not near the extremes (very empty or very full network, very few or almost all coins used).

Then, some technical details: smoothing out changes (averaging over time, Kalman filtering, etc), including some headroom for the storage space (just in case), and so on.


The big question is, do we need to connect the storage cost and the farming rewards at all, or could they work as two independent sides? Could it lead to chaotic oscillations and, if so, can it be stopped? (For example, one side could be dampened using a much longer period than the other.)

7 Likes

Actually that is your first point.

The original algorithm used a method to ensure this virtually never happened.

It did this by first calculating whether a coin should be issued, and then using the resulting coin address: if a coin already existed at that address, the coin would not be issued.

Thus on an average basis the coin creation would succeed in (max possible coins - coins existing)/(max possible coins) cases.

Thus when 90% of coins were issued, only 10% of coin creation attempts would succeed.
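
A rough sketch of that mechanism, assuming a section keeps a set of already-issued coin addresses (the toy RNG and the small address space are only for demonstrating the ~10% success rate):

```rust
use std::collections::HashSet;

/// Returns true if a coin is actually created: the coin is only minted if no
/// coin already exists at the candidate address. On average this succeeds in
/// (total - issued) / total of the attempts.
fn try_issue_coin(issued: &mut HashSet<u64>, candidate_address: u64) -> bool {
    if issued.contains(&candidate_address) {
        false // address already holds a coin: the attempt fails
    } else {
        issued.insert(candidate_address);
        true
    }
}

fn main() {
    // Toy address space with 90% of coins already issued (the real network
    // would use 2^32 coin addresses).
    let toy_total = 10_000u64;
    let mut issued: HashSet<u64> = (0..toy_total * 9 / 10).collect();

    let mut state = 0x2545F4914F6CDD1Du64; // xorshift64 toy RNG state
    let attempts = 1_000;
    let mut created = 0;
    for _ in 0..attempts {
        state ^= state << 13;
        state ^= state >> 7;
        state ^= state << 17;
        if try_issue_coin(&mut issued, state % toy_total) {
            created += 1;
        }
    }
    println!("{created} of {attempts} attempts minted a coin (expect roughly 10%)");
}
```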

In a sense they need to have a common driving force, and that is spare space.

But maybe some sort of moving average could be used by both (kept separate though) so that massive swings do not occur: the price/reward calculations would be much smoother, approaching the instantaneous calculations only slowly and really tracking the average or mean of the instantaneous prices.
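
A minimal smoothing sketch, assuming an exponential moving average; the alpha values (and the idea that the price side moves more slowly than the reward side) are illustrative choices, not part of any RFC:

```rust
/// Exponential moving average: alpha is the weight given to the newest
/// instantaneous value, so a smaller alpha means a longer effective period.
struct Smoother {
    alpha: f64,
    value: Option<f64>,
}

impl Smoother {
    fn new(alpha: f64) -> Self {
        Smoother { alpha, value: None }
    }

    /// Feed the latest instantaneous calculation, get back the smoothed value.
    fn update(&mut self, instantaneous: f64) -> f64 {
        let next = match self.value {
            None => instantaneous,
            Some(prev) => prev + self.alpha * (instantaneous - prev),
        };
        self.value = Some(next);
        next
    }
}

fn main() {
    let mut put_price = Smoother::new(0.05);   // slow-moving side
    let mut farm_reward = Smoother::new(0.20); // faster-moving side
    for instantaneous in [1.0, 1.0, 10.0, 1.0, 1.0] {
        println!(
            "price {:.2}  reward {:.2}",
            put_price.update(instantaneous),
            farm_reward.update(instantaneous)
        );
    }
}
```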

3 Likes

That’s true. But we can’t use that method anymore, can we? So, a new method is necessary.

Maybe the farming reward needs to include something about free space too, just less aggressively than the PUT price.

Should the PUT price also include something about the amount of coins in (or not in) circulation though?

Kalman filtering may help here. We (well, a section) would make a number of imprecise observations over time and use them to calculate a good estimate of the actual value. Something like that.
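
For the record, a textbook one-dimensional Kalman update looks like this; the process/measurement noise values are placeholders and the "free space fraction" being estimated is only an example quantity:

```rust
/// One-dimensional Kalman filter: blend each imprecise observation with the
/// running estimate, weighting by their respective uncertainties.
struct Kalman1D {
    estimate: f64,
    variance: f64,          // uncertainty of the current estimate
    process_noise: f64,     // how much the true value may drift between observations
    measurement_noise: f64, // how imprecise a single observation is
}

impl Kalman1D {
    fn observe(&mut self, measurement: f64) -> f64 {
        // Predict: the true value may have drifted, so our uncertainty grows.
        self.variance += self.process_noise;
        // Update: blend in the new observation according to the Kalman gain.
        let gain = self.variance / (self.variance + self.measurement_noise);
        self.estimate += gain * (measurement - self.estimate);
        self.variance *= 1.0 - gain;
        self.estimate
    }
}

fn main() {
    let mut free_space = Kalman1D {
        estimate: 0.5, // initial guess: 50% free
        variance: 1.0,
        process_noise: 0.001,
        measurement_noise: 0.05,
    };
    for observation in [0.42, 0.47, 0.40, 0.45, 0.44] {
        println!("estimated free space fraction: {:.3}", free_space.observe(observation));
    }
}
```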

2 Likes

How about getting rid of the term “coin”?

I think it is bad in Bitcoin too, especially nowadays, when the value is far beyond the “coin range”. It also sounds very un-digital, and can create confusion now that we have the common-coin algorithm as a part of PARSEC.

I don’t have time right now to give better alternatives, though.

3 Likes

EDIT: this was a progression of ideas and the first part was musings.
Maybe we can. Since the section is responsible for a portion of the 2^32 coins, it could keep a bitmap of issued/spent coins. So with 1 section the bitmap is 2^32 bits in that section; with 1024 sections the bitmap is 2^22 bits per section, etc.

Now this would allow the section to, say, issue milli coins and have a bitmap 1000 times the number of coins it is responsible for.

But really a bitmap is not needed for implementing a scarcity function. Just knowing the number of “coins” existing in the section could allow this.

Say the section is responsible for 2^24 “coins” and 2^23 of them are issued (50% of the coins); then every 2nd farming payment that was due would actually result in a payment. Another way is to multiply the determined payment by 50% every time. Of course you could extend this to smaller units such as nano, micro or milli.
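
A sketch of those two equivalent forms, assuming the section simply tracks how many of its `responsible` coins are `issued` (the names and the uniform `roll` parameter are illustrative):

```rust
/// Probabilistic form: pay the full amount, but only with probability
/// (responsible - issued) / responsible. `roll` is a uniform random value in [0, 1).
fn probabilistic_payment(reward: u64, issued: u64, responsible: u64, roll: f64) -> u64 {
    let p_success = (responsible - issued) as f64 / responsible as f64;
    if roll < p_success { reward } else { 0 }
}

/// Proportional form: always pay, but scale the amount by the same factor.
fn proportional_payment(reward: u64, issued: u64, responsible: u64) -> u64 {
    let scale = (responsible - issued) as f64 / responsible as f64;
    (reward as f64 * scale) as u64
}

fn main() {
    let responsible = 1u64 << 24; // section responsible for 2^24 coins
    let issued = 1u64 << 23;      // 2^23 already issued (50%)
    // With 50% issued, the proportional form halves a 1_000_000 nano reward.
    println!("{}", proportional_payment(1_000_000, issued, responsible)); // 500000
    // The probabilistic form pays the full 1_000_000 nanos about half the time.
    println!("{}", probabilistic_payment(1_000_000, issued, responsible, 0.3)); // 1000000
}
```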

2 Likes

Just trying to get my head around this.

So each section would be paying out slightly different amounts over time?

How do sections get a balance … do sections divide over time and hence have their safecoin balance divided as well?

1 Like

It’s not so much denying it as a possibility; rather, what would the point be? Suddenly there would be a need to store a bitmap for no other reason than something that can be done much more easily in another way, just as you described later on. Coins as network objects was a great idea, I love it, but there’s no point in retrofitting its remnants into a fundamentally different concept.

If we had nanos, there may be no need for probabilistic rewards even. They would be a useful fallback for when safecoins are expensive though.

They get half of the original section’s when there’s a split. Other than that, they are the ones who maintain the balance, so they know exactly how much is free and how much is used.

That’s unavoidable, yes.

3 Likes

Each section divides in 2 and takes responsibility for 1/2 of the XOR address space the original section had. The same applies to the 2^32 possible safecoins (or, you could say, the 2^32 * 10^9 nanos).

So if a section is the result of 8 divides, it is then responsible for 1/256th of the XOR space and 1/256th of the coins.

3 Likes

Which is why I moved on to the progression of the idea and removed the need for a bitmap.

EDIT: and I agree there is no need for the notion of the previous implementation. Maybe we could go to the max of u64 and have 18,446,744,073,709,551,616 nanos == just over 18,446,744,073 coins, and then issue 4,294,967,296 nanos (approx 4.3 coins) per MAID. No inflation and no loss for anyone. But it increases the total supply for the future and allows @anon86652309 to realise full use of all the bits. The only downside is people’s perception of imaginary inflation.

The idea though was to account for coin scarcity in the section when making a payment. So as more coins are issued, the calculated payment amount is slashed by the percentage already issued.

So if the reward payment is determined to be 22 milli safecoin then

  • 10% issued already then payment is 19800 micro safecoin
  • 50% issued already then payment is 11 milli safecoin
  • 90% issued already then payment is 2.2 milli safecoin
  • 99% already issued then payment is 220 micro safecoin

So as the issued coins approach the 2^32 safecoin limit, the actual payment is reduced more and more below the calculated reward amount. In the end it is almost impossible to ever issue all the coins.
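
A quick check of the table above, assuming payment = calculated reward * (1 - fraction already issued):

```rust
fn main() {
    let reward_nanos: f64 = 22_000_000.0; // 22 milli safecoin, in nanos
    for issued_fraction in [0.10, 0.50, 0.90, 0.99] {
        let paid = reward_nanos * (1.0 - issued_fraction);
        // 19.8 milli, 11 milli, 2.2 milli and 220 micro respectively.
        println!(
            "{:>4.0}% issued -> {:>10.0} nanos ({} micro)",
            issued_fraction * 100.0,
            paid,
            paid / 1_000.0
        );
    }
}
```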

3 Likes

I know: “… just as you described later on”

I was just referring to a recurring theme in this thread in general.

3 Likes

It is not going to reach near-zero cost, because the price of Safecoin (and the farmers’ rewards) would rise to xxx, and uploaders would take advantage of the cheap PUTs for sure. It is easier to upload 100PB than to increase the available space by 800PB.

1 Like

The old algorithm allowed this whenever spare space was over the amount of used space.

Yes it would get used quickly, but it still would exist. Let’s say early on people are adding 2 to 4 TB per vault and there are 80,000 vaults. So that is about 30PB, and so 15PB would need to be stored before the price rose above that 2^-63 safecoin per PUT.

It will take time to upload 15PB, and remember that PUTs are used for a lot more than straight 1MB chunks: messaging, updating ADs, creating accounts, all of which are way less than 1MB.

The low price affects more than just storage

BUT in saying all that, I agree that the 2^-63 price will not happen that often; early on though it is really easy to hit in comparison to later on.

2 Likes

Maybe these two posts can help. We both reach the same conclusions in different ways.

4 Likes

Is there any plan for sections to share information generally, and specifically regarding total PUTs and existing storage?

I guess that such info sharing might be useful, but also becomes an attack vector. As would re-balancing safecoin holdings between sections. So I’m guessing this info would not be available.

If it is sharable though, then perhaps this info could be useful for modifying farm payouts.

2 Likes

I think the proposed farm reward (FR) algorithm in RFC-0057 is quite useful as a starting point.

FR = 2*SC

FR is split between all farmers (weighted by age) and devs/producers etc. I’m only going to discuss the total FR here, but the changes would keep the split mechanism in place.

The change that might make this more sustainable would be to replace 2 with a dynamic multiplier. It would be very similar to the idea of Farm Rate but to avoid overloading the term I’m calling this the Health Multiplier (HM).

The formula would become

FR = HM*SC

where

HM = 2/(4^FC) and FC is the fraction of farmed coins, between 0 and 1.

[Image: plot of the Health Multiplier HM = 2/(4^FC) against the fraction of farmed coins]

So to expand it fully out,

FR = 2/(4^FC) * (1/G + F/N)

which makes the reward depend on a) total farmed coins and b) portion of full vaults.

For 0% of coins farmed, HM = 2, ie same as the current RFC-0057 proposal.
For 50% of coins farmed, HM = 1
For 100% of coins farmed, HM = 0.5

For 10% of coins issued (eg network start with ICO allocated) HM = 1.74
For 90% of coins issued (eg a lot of uploading has happened suddenly) HM = 0.57

HM = 1 means that FR=SC, ie the reward mechanism acts as a direct transfer from uploaders to farmers with no bonus or reduction from the network.

HM = 0.5 means that farmers only get half the upload fee, the other half is kept by the network for future use.
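
Here is a small sketch of the idea; the parameter names (fc for the fraction of coins farmed, group_size, full_fraction for full vaults over all vaults) are my own reading of the expanded formula above rather than anything fixed in the RFC:

```rust
/// HM = 2 / 4^FC: runs from 2 (no coins farmed) down to 0.5 (all coins farmed).
fn health_multiplier(fc: f64) -> f64 {
    2.0 / 4.0_f64.powf(fc)
}

/// FR = HM * SC, taking SC as (1/G + F/N) as in the expanded formula above.
fn farm_reward(fc: f64, group_size: f64, full_fraction: f64) -> f64 {
    health_multiplier(fc) * (1.0 / group_size + full_fraction)
}

fn main() {
    // Reproduces the HM values quoted above: 2, 1.74, 1, 0.57, 0.5.
    for fc in [0.0_f64, 0.1, 0.5, 0.9, 1.0] {
        println!("FC {:>3.0}% -> HM {:.2}", fc * 100.0, health_multiplier(fc));
    }
    // Example reward: 10% of coins farmed, group size 8, 5% of vaults full.
    println!("FR example: {:.3}", farm_reward(0.1, 8.0, 0.05));
}
```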

It’s worth clarifying my perspective: safecoin is primarily a tool for manipulation. It’s meant to encourage or discourage certain behaviours by uploaders and resource providers depending on the condition of the network. The purpose of the manipulation is to optimise network health and minimise wasted resources.

I use the fairly hostile word ‘manipulate’ because the network is an entity that’s going to need to defend itself from malicious behaviour, and the coin is the main tool it uses to protect itself.

The maximum potential for manipulation is when 50% of coins are issued. More than that reduces the remaining coins available to encourage new farmers, and less than that reduces the supply of coins for uploaders to use. So my feeling is the network should be aiming toward 50% supply with some considerable elasticity during periods of extreme activity. HM*SC should hopefully achieve that.

It’s still a very rough idea but it’s a small step in a certain direction. A lot of words for what is really a fairly simple idea of changing constant 2 into a variable.

I think the disallow rules for new joining nodes would need some further consideration. And so would the StoreCost algorithm since I think it’s probably too restrictive in RFC-0057. Interested to hear any thoughts about the Health Multiplier idea.


This is (imo) inevitable but I feel you’re starting the conversation many months too soon! I’m happy to see this comment though!!

8 Likes