Digging more into the RFC, I feel the farming rewards formula is also likely to need revision. Aside from my previous criticism that response speed (and bandwidth) are unrewarded, it also appears that provided storage has no impact. Thus a vault that provides the minimum to meet the resource test, and throttles its bandwidth thereafter, gets the same reward as a multi-TB array at the same uptime. Incentivizing more vaults adds decentralization, but at the expense of increasing the expected number of hops, and thus likely network responsiveness for messaging and other applications. I thus posit that there should be some counterweight, at least rewarding either offered or used size. Perhaps a multiplier of ln(offered_size/minimum_for_resource_test) applied to the weighting.
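As a rough illustration, that multiplier could be sketched as below. The 50 GB resource-test minimum and the floor at 1.0 (so a minimum-sized vault keeps its base weighting rather than being zeroed out by ln(1) = 0) are my own assumptions, not part of the RFC.

```rust
/// Weight multiplier: ln(offered_size / minimum), floored at 1.0 so a
/// minimum-sized vault keeps its base weighting (assumption, not RFC).
fn size_multiplier(offered_gb: f64, minimum_gb: f64) -> f64 {
    (offered_gb / minimum_gb).ln().max(1.0)
}

fn main() {
    // A vault offering exactly the minimum keeps the base weighting.
    assert!((size_multiplier(50.0, 50.0) - 1.0).abs() < 1e-9);
    // A 1 TB vault against a 50 GB minimum: ln(20), roughly 3x.
    assert!((size_multiplier(1000.0, 50.0) - 20f64.ln()).abs() < 1e-9);
}
```

The logarithm keeps the incentive gentle: a 20x larger vault earns only about a 3x weighting, so decentralization into many vaults is still favoured over a few huge arrays.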
It may add a bit of complexity, but should be fairly simple to monitor… and I suspect the storage reporting is necessary for an efficient and stable StoreCost algorithm anyway. If a node advertises x amount of space, then it should never reject an (x−1)-sized request to store, or it should be punished.
The target should be to tune the algorithm so that, in the early stages, roughly 1 TB costs about 1 Safecoin.
I’m thinking of a couple of alternative formulations, but suspect store cost should take the form min_SAFE_divisibility*(1+calculated_network_premium), e.g. the minimum cost is whatever SAFE’s equivalent of a satoshi is (nanoSAFE sounds good to me).
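In code, that shape might look like the following sketch. The premium is taken as a plain input here (how the network would actually calculate it is left open), and 1 nanoSAFE as the minimum divisibility is just the suggestion from this post.

```rust
// Minimum cost unit: 1 nanoSAFE (an assumption from the post above).
const NANO_SAFE: u64 = 1;

/// Store cost in nanoSAFE: minimum divisibility scaled by
/// (1 + premium), rounded up to a whole unit.
fn store_cost(network_premium: f64) -> u64 {
    (NANO_SAFE as f64 * (1.0 + network_premium)).ceil() as u64
}

fn main() {
    // With zero premium the cost bottoms out at the minimum unit.
    assert_eq!(store_cost(0.0), 1);
    // A 50% premium rounds up to the next whole nanoSAFE.
    assert_eq!(store_cost(0.5), 2);
}
```

The appeal of this form is that the floor is built in: no matter how the premium is calculated, a PUT can never cost less than the smallest representable unit.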
I have to agree. This seems to be a weird relic of the past, almost like some unholy marriage between the old type of farming (discrete coins) and the new type of storage (balances).
Talking about the basic units of safecoin, I’m not sure how I feel about this particular detail:
The parts field represents a multiple of “250 pico-safecoins”, i.e. the number of 250 * 10^-12 -th parts of a single safecoin. The total value of the parts field will be required to be less than a single safecoin, i.e. it will always be less than 4 billion.
250 pico-safecoins, seriously? What sort of black magic is that? Four billion is the largest round number that can be represented in 32 bits, right? So it’s just a random artifact, elevated to a position it clearly doesn’t deserve. Ugly.
But I get it. It must be an attempt to ensure the total market cap remains as advertised, right? I say, screw that. A single number with 2^64 units makes a lot more sense and is easier to work with, and I bet my uncle’s top hat that we’ll end up with it even if we start out with the split balance, because the MaidSafe folks have always been ready to throw away far less idiosyncratic things with far more work already put into them, and I don’t think this will be the exception.
MaidSafeCoins could simply be exchanged at a rate of 2^32 unit-safecoins each to preserve their value. Some children will throw a tantrum; so what.
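A sketch of that single-number alternative, assuming the 2^32-units-per-coin exchange rate suggested above (the names here are illustrative only):

```rust
// 1 old coin exchanges for 2^32 indivisible unit-safecoins
// (the rate proposed in the post above, not anything official).
const UNITS_PER_COIN: u64 = 1 << 32;

/// Convert a whole-coin MaidSafeCoin balance into base units.
fn to_units(whole_coins: u32) -> u64 {
    whole_coins as u64 * UNITS_PER_COIN
}

fn main() {
    // One old coin becomes 2^32 unit-safecoins of equal value.
    assert_eq!(to_units(1), 4_294_967_296);
    // Even the maximum whole-coin supply fits in a u64:
    // (2^32 - 1) * 2^32 = 2^64 - 2^32, just under u64::MAX.
    assert_eq!(to_units(u32::MAX), u64::MAX - (UNITS_PER_COIN - 1));
}
```

That last assertion is the point: the entire 2^32-coin cap, at 2^32 units per coin, fills a u64 almost exactly, so no second field is needed.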
With all due respect, that’s still just a stupid artifact, not a conscious design decision. I mean, nobody wakes up and goes, “Hey, wouldn’t it be great to set our basic unit of value at 250 pico-something?”
I dunno. It has a nice ‘Old Money’ nostalgia about it for Brits of a certain age. I can just about remember this.
Money was divided into pounds (£ or l in some documents) shillings (s. or /-) and pennies (d.). Thus, 4 pounds, eight shillings and fourpence would be written as £4/8/4d. or £4-8-4d. The “L S D” stands for the Latin words “libra”, “solidus” and “denarius”.
20 shillings in £1 - a shilling was often called ‘bob’, so ‘ten bob’ was 10/-
12 pennies in 1 shilling
240 pennies in £1
Pennies were broken down into other coins:
a farthing (a fourth-thing) was ¼ of a penny
a halfpenny (pronounced ‘hay-p’ny’) was ½ of a penny
three farthings was ¾ of a penny (i.e. three fourth-things). There was no coin of this denomination, however…
I should add that if this RFC were implemented it would provide some very interesting and valuable data, so even though I say I think it needs changes, I’m not excluding an implementation and test of the existing ideas… it may actually work quite well as it stands.
All the vaults of a section will hold, more or less, the same amount of data, because data is distributed randomly through the XOR space and each chunk is copied to the Vaults closest to its own XOR address. Only when a Vault is surrounded by saturated nodes will it store relocated chunks. This makes a multi-TB Vault usually totally oversized and economically inefficient.
We must never trust what a vault declares, only the work it can prove. We can use a proof of storage, but in any case the ideal is to confirm that a Vault has the necessary space without rewarding those who have surplus storage.
To me it still seems slightly clearer, but the point of putting together the example and linking to the generated docs rather than the code was to make it clear that the internals of Coin are an implementation detail. You can’t tell from looking at the docs (which represent the API) whether it’s implemented using two u32s or a u64. If I were to redo this with the latter, the generated docs and API would be identical.
Ah, this is maybe pointing to the root cause of our difference of opinions. You’re proposing that front-end/app devs (users even?) are to use fixed point integers? I do generally like passing the buck to front-end where we can, but in this case I think it would be better to just provide fundamental safecoin types which “do the right thing”. I feel I’ve used the type system to make it safe for anyone, front-end or back-end to use. If someone wants to exclusively work with nano-coin because it makes more sense to them, then they can use the provided NanoCoin to do that.
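A minimal sketch of that point: callers see only the public surface, so whether the balance is one u64 or two u32s is invisible to them. `Coin` and the 10^9 nano-per-coin factor here are illustrative stand-ins, not the actual types from the linked example.

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
pub struct Coin {
    // Private: could equally be (whole: u32, parts: u32) with no
    // change to the public API or generated docs.
    nano: u64,
}

impl Coin {
    pub fn from_nano(nano: u64) -> Self {
        Coin { nano }
    }
    /// Whole-coin part of the balance.
    pub fn whole(self) -> u64 {
        self.nano / 1_000_000_000
    }
    /// Fractional remainder, in nano-coins.
    pub fn remainder(self) -> u64 {
        self.nano % 1_000_000_000
    }
}

fn main() {
    let c = Coin::from_nano(2_500_000_000);
    assert_eq!(c.whole(), 2);
    assert_eq!(c.remainder(), 500_000_000);
}
```

A front-end that prefers to think purely in nano-coins never touches `whole()`/`remainder()` at all; one that wants display-friendly values gets them without knowing the storage layout.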
Yes, we expect the formula will indeed need to be refined as we get more data from testnets. We’re aiming to get a very basic formula in place for now and keep iterating (probably via the RFC process) to get towards something fairer and more useful.
I do agree with your points though. With the proposed divisibility of the coin and the ability of elders to accumulate micro-rewards for all farmers until they pass a threshold before an actual coin transfer is done, I don’t see much stopping us from easily rewarding every little aspect in the future.
I will be blunt: that is bad thinking. It is not clearer for back-end storage, and it will mean more programming and a potential avenue for bugs to creep in, now or later when someone else is upgrading.
It seems, @anon86652309, you are trying to keep the concept of actual coins and then subdivide them. Once you brought in the balance, you shot the idea of separate coins with a 12 gauge and threw it in the furnace, which is quite good, as it allows division.
Just do what everyone has done for six decades: store the WHOLE balance in one variable, treat it as fixed point, and nuke that avenue for bugs dead. You also save a number of extra instructions otherwise needed to handle under/overflow when adding or subtracting a balance stored as a split.
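A minimal sketch of the fixed-point suggestion, assuming a single u64 balance in nano-coins: `checked_add`/`checked_sub` handle overflow in one step, with no carry logic between a coin field and a parts field.

```rust
/// Credit an amount to a balance; None on overflow.
fn credit(balance: u64, amount: u64) -> Option<u64> {
    balance.checked_add(amount)
}

/// Debit an amount from a balance; None on underflow (overdraft).
fn debit(balance: u64, amount: u64) -> Option<u64> {
    balance.checked_sub(amount)
}

fn main() {
    let b = 1_000_000_000u64; // one whole coin, in nano-coins
    assert_eq!(credit(b, 500), Some(1_000_000_500));
    assert_eq!(debit(b, 2_000_000_000), None); // overdraft rejected
    assert_eq!(credit(u64::MAX, 1), None);     // overflow rejected
}
```

Compare the split-balance version, which needs to carry between the parts and whole fields on every add or subtract, and to check two fields for overflow instead of one.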
Ooh, why would we want different structs for representing the same thing in different sizes? That’s what numbers were invented for (measuring), and if you do it like this then, depending on the amount someone wants to send, I need to divide the amount by different factors and call different API functions from Python, I assume.
Just having a number and a fixed factor would be way more intuitive, plus the high-level API can provide the different units; it’s easier there anyway, I assume…?
PS: ooooooooh, so the C API is, in your opinion, the ‘high-level interface’… Well… Did you try to use it? IMHO every additional API function that is not required just complicates things… The function interface needs to be precisely reconstructed on the other side, callback functions need to be created, and what precisely is returned and how it’s handled needs to be defined… So one safecoin interface where you just throw in different numbers looks like a clean solution to me (you can still turn it into different sizes internally if that makes sense).
I’d be fully in favour of representing a safecoin balance as a single 64-bit integer if we didn’t have divisibility of the coin. Had we, back then, said the upper limit on the number of safecoin will be 2^64 and they’re not divisible, this wouldn’t be an issue. That’s not the case. I believe we’re in a position where we need to handle fractional parts of a safecoin.
Did you get a chance to look at the docs for the example code I wrote? I’m failing to see how that’s particularly hard to use.
I’m not overly strong on keeping that level of divisibility. I don’t see it as something that many people would ever need to care about or even be aware of. It might only ever be something the backend uses. I just don’t like wasting bits.
No, the level of divisibility is unrelated to the total number of possible safecoin.
Well, never say never. But in this case, I’d be surprised.