Is quasi-infinite divisibility useful?

The Safecoin divisibility thread is getting quite long, spanning nearly 2.5 years of discussion with many offshoots and interesting ideas. I thought that it might be good to consider a specific aspect of the discussion here. This is just a little brainstorming exercise for probing some extremes. If you don’t want to read the entire thought-train, or if you’re too practical, just scroll to the bottom where I’ve asked a few questions to instigate some opinions.


In previous conversations, forum members have indicated a preference for having a rather fine granularity of safecoin at launch. Reasons for this have been described by others to include the following:

  • The use of the network by the burgeoning field of IoT devices, in both their current and future forms.
  • Improving the network’s ability to handle micro-transactions and/or micro-tipping without the need to create utility coins or exchange for alt-safecoin denominations.
  • Allowing farmers to earn smaller rewards more frequently, rather than waiting longer periods of time for a unit safecoin payout.
  • Improving the network’s ability to handle hypothetical future edge cases involving large amounts of lost coins.
  • Improving the network’s ability to adapt easily to human behavior and external fiat market forces.
  • Allowing the network to take advantage of technological improvements that will reduce costs over time.

Already, @neo and @polpolrene have both presented ideas and methods for employing a local “balance” or ledger-sheet counter (using a uint32 or uint64) to handle fractional coin amounts efficiently, without the need to follow the “classic safecoin” route to divisibility described in the appendix of the Project Safe whitepaper. The approach is similar to having an account PUT balance, but is intended for direct use of fractional safecoin instead.

Recently, @oetyng presented some basic figures for the case of 1 safecoin being divisible into approx. 10^11 parts, as might be the case following @neo’s recommendation for a “balance method”. The argument was reasonable, but one question that arises from this exercise is, “How do we know how much divisibility is enough for all possible situations in a network that needs to persist data forever?” One way is to be a realist and be practical: since one can’t possibly consider all situations, just pick something and fix it often in light of new information. The other option is to identify some sort of consensus on the bounds of divisibility that will be suitable for long periods of time, in order to improve robustness. This is challenging; we are only able to prognosticate so far into the future, and requirements vary greatly since users are willing to accept different degrees of divisibility depending on the application or one’s subjective point of view.

For example, I am a big fan of classic safecoin and like the simplicity of an approach that slowly divides coins as the network evolves. So my subjective take on the matter has mostly been, “eh, let the network divide classic coins as size allows… you IoT guys can pay for bots or likes in SafePUTS til then.” However, more and more I recognize that there are drawbacks to this gut reaction. The technological and marketing opportunities that open up when one considers, for example, a forum or social media app (ex. Decorum) directly charging something on the order of 10^-18 safecoin for every thumbs-up or “like”, or a micro-robot that farms for itself in order to pay for charging its batteries as dirvine has described on his blog, are indeed truly vast. Applications and use cases like these will only bring in more user/investor support for the network.
The same goes for monetizing other aspects that one might want to be cost-free, but where fear of rampant abuse of those mechanisms would otherwise require more complicated security solutions. In other words, it may just be much, much easier to increase security simply by having a finite but negligible fee associated with most network features.

Considering @oetyng’s argument, rather than speculate on how much divisibility is adequate or required based on real world analogies, I think it may be worthwhile to take divisibility to its rational limit, i.e. infinite divisibility. Since infinity is a tricky thing, let’s settle for a common definition of approximately infinite, such that a quasi-infinite divisibility can be defined for safecoin. Although intuitive and familiar, I think that attempting to use real world analogies in order to come to agreement on some measure of quasi-infinite divisibility is fraught with peril. For example, one might attempt to consider two extrema at opposite ends of observable reality, i.e. the volumetric ratio of the “future visibility limit” of the observable universe to that of the volume of a Planck cell, the limiting resolution of scientific comprehension (~2.0026937174703272e+185 divisibility, by the way). This may seem reasonable to some and impractical to others, so going this route will never lead to a consensus. I would therefore propose to keep it quasi-simple and agree to use one of the definitions we are also all familiar with, i.e. IEEE 754. This standard provides a set of clear and reasonable definitions of computational infinity, as well as maximum finite values which can be used to represent various levels of quasi-infinity. Consider the following maximum values from the standard for floating point numbers:

  • binary32_max : ~3.4028235×10^38 (just below 2^128)
  • binary64_max : ~1.7976931×10^308 (just below 2^1024)

Since floating point values are not applicable for use in lossless accounting due to round-off, let us consider the bit depths required to achieve similar divisibility with unsigned integers:

  • uint32_max : 4294967295
  • uint64_max : 18446744073709551615
  • uint128_max : 340282366920938463463374607431768211455
  • uint256_max : ~1.1579208924×10^77
  • uint512_max : ~1.3407807930×10^154
  • uint1024_max : ~1.7976931349×10^308

Although 32-bit and 64-bit unsigned integers both appear to provide nice granularity, note the one-to-one correspondence between binary32_max and uint128_max, as well as between binary64_max and uint1024_max. I view this direct correspondence as a general consensus that the minimum granularity required to achieve quasi-infinite divisibility exists somewhere between 128 and 1024 bits. Although I said I wouldn’t resort to real world analogies, consider the Planck scale example provided above, which suggests a hypothetical 616-bit divisibility for the universe. Although 1024 bits does make a pretty 32bit x 32bit square…

Relating this back to finances now: if we follow @oetyng and take the current max estimated world debt supply of $1200 trillion, we find the different tiers of quasi-infinite divisibility yield the following at the ultimate limit of economic market share:


Yes, this discussion has been more than slightly ridiculous. If you’ve made it this far, please consider the following questions:

  1. Considering hypothetical future scenarios, does a single uint64 per coin provide enough granularity for all use cases for all time? @neo describes a 64-bit depth as a balance between programming ease and reasonableness… he’s right. Even so, can you think of a scenario where uint64 would not provide enough divs?

  2. The “balance methods” efficiently employ a local wallet/purse parameter for holding fractional safecoins or “divs” to enable constant-time transactions. However, they could also represent any level of divisibility as a simple array of one or more 64-bit unsigned integers (ex. safebalance_256bit_divs = new uint64_t[4]) or 32-bit unsigned integers (ex. safebalance_96bit_divs = new uint32_t[3]). Multi-precision libraries make addition and subtraction easy too. Do you think that including levels of divisibility beyond uint64, more than one would see a need for at launch, adds much of a burden to network resources and a hypothetical implementation, considering the long-term goals of the network?

  3. The most appealing and simple solutions discussed in the forums for SAFE network edge-case challenges seem to keep coming back to pro-growth strategies, seeking ever greater storage capacity while also requiring greater divisibility. It would appear that the levels of divisibility provided by >uint64 balance ledgers transition us to a notion of fluidity rather than granularity. Can you think of ways where the fluidity of at least 2^96 divs per safecoin could give benefits that would help solve any other security, app design, farming, or PUT balance related constraints/issues which you have come across in your work/reading?

  4. Is the use of >uint64 divisibility unreasonable?

  5. Is quasi-infinite divisibility useful?


I believe so.

Rewards are based on 2^64

As I’ve shown elsewhere, even if safecoin reaches 1 billion dollars each, we can still have nano-$$$ transactions, which really is more than needed for IoT devices. When I was deciding between 2^64 and 2^128 it became obvious that 2^128 was crazy, since it represents more than the atoms in the Earth (IIRC).

SAFE_V2 will exist long before we ever need (more than) 2^64


Yes, I understand what you are saying and what you have said. I fully understand your reasoning and felt similarly at first glance. I also see the point you have made on how uses for (1/2^64)-sized divs would likely satisfy all monetization requirements, and if they don’t, it could be fixed with an update at that point in time. This isn’t a criticism of your proposal, or an attempt to change your mind (except maybe that you at least consider uint32[2] rather than uint64), nor is it meant to serve as my hobby-horse, as you would say. It’s just a prodding at the opposite extreme of the cannibalism thread, where the use of only 2^32 network objects was my own self-imposed constraint in that thought experiment. I thought it might be fun to probe whether or not forum readers had any ideas as to applications or use cases that near-infinite divisibility would or may open up or solve. In no way would I ever be so haughty as to demand features that could require holding up launch; so if you think this type of thread/thought experiment is going to be counter-productive in the long run, I’ll leave it alone. That being said, I wanted to throw a few other thoughts out there for you before the thread is dead.


Very true, but which 1 billion dollars are you referring to? Australian? USD? Zimbabwean? The current system of measuring worth is always in flux. Consider 1908 dollars vs. 2008 dollars vs. 2108 dollars… can we predict the world’s economic future with any accuracy? (I know this is what you would call sidestepping… just trying to add some humor.) Regardless, infinitesimal yet finite amounts of safecoin may have other uses inside the network besides a direct off-network relative purchasing power. What might those uses be? I don’t necessarily have the answer for that right now, hence the OP question. (Hardware/software performance metrics? Orthogonal persistence? Votes/likes? A sentiment of moral support for an open source project? Guilt-free gambling?) Yes, I know that minimum PUT charge is a deciding factor.

I agree that on face value it might seem that 2^128 divs per safecoin is crazy for all practical purposes (I’m hoping someone reading this has a practical example…). But isn’t a statement like this contrary to your “proving a negative” assertion from past conversations? Also, you’ll need a bigger number to represent all the atoms on Earth, something closer to 2^167 at least if this reference can be trusted. And what do you say to the humans who bring SAFE with them to colonize Mars? They might like an Earth-backed currency out in space, literally… but I digress. :upside_down_face:

For just a brief moment, let us agree that the least number of bits required for essentially infinite divisibility is 128, corresponding to ((2^128) - 1) divs. Now, to get closer to your point of view, let us consider not the quasi-infinite divisibility of a single safecoin, but rather the sum total of SAFE network currency as a whole. Since the number of safecoins is, and forever will be, hard-capped at ~2^32, we’re left with 96 bits worth of divs per safecoin (ex. safebalance_96bit_divs = new uint32_t[3]) to meet the minimum requirement for infinity, if we are willing to agree to and accept the consensus-based IEEE 754 standard definitions.

  • uint96_max —> uint32[3]_max = 79228162514264337593543950335 divs/SC

This is a big number, but not quite as insane when compared to the other IEEE 754 extreme of 2^1024, and it would technically still meet a minimum standard for infinite divisibility of the network’s internal currency. Coupled with the network farming mechanisms of recycling coins when computing resources are purchased, consider the PR optics and other non-technical objectives. The common and annoying questions that often come up in the forums are, “When/How/Why will safecoin be divisible?” While it may still not be enticing to the engineering types and practically minded such as yourself, increasing to a minimum of 96 bits worth of divs per safecoin may be desirable for the marketing opportunities alone. Project SAFE could then advertise that “Safecoin, the currency inside the SAFE network, efficiently provides near-infinite divisibility.” Anyone who would scoff could be pointed to IEEE 754, and the layperson would find themselves even more intrigued.

The problem with agreeing to a “lesser infinity” is that one is always left with sour feelings of inadequacy. Considering that the Project Safe whitepaper described the possibility of eventually achieving 2^248 divisibility using “classic safecoin”, why not just allow for a balance system like you proposed with multiple precision (uint32[8] or uint64[4], for example) that more closely matches the div resolution hinted at in the whitepaper? Sets of uint32 are nice since this keeps the aesthetics of the original ICO stipulations of 2^32 safecoins with 2^32 divisibility. This also opens up the opportunity for a simple non-decimal denomination system for each 32 bits (2^32 troons/SC, 2^32 irvines/troon, 2^32 divs/irvine, etc.), since about 4 billion parts per denomination level is a lot easier for the layperson to manage, compared to about 18 million trillion via a uint64. Although, even in this case I’ll admit that more than four levels of denomination becomes unruly. Of course, due to the high degree of divisibility the customary decimal transfers are trivial; it’s just that trading in @happybeing’s troons may be just as fun as safecoin from time to time. :smile:

Yes, the reasoning for the minimalist KISS approach of a single uint64 is quite logical; but at this instant the major concern I would see against adding a few more bits is “the waste of space” for functionality that (subjectively) isn’t seen as being necessary in the short term. However, the problem with the “wasted space” perspective, as you’ve shown me previously, is that it becomes a question of relativity for a maximum-growth network regime. Maybe we can bound the magnitude of this wasted space? I know the following is also completely subjective, but let’s play with some numbers:

  • Average number of accounts controlled by a unique human including IOT management = 128
  • Storage required per unique human per 32bits of divisibility = 512B
  • Assumed network redundancy: 16
  • Network storage per 64bits of div/SC per unique human : ~16.4kB
  • Network storage per 96bits of div/SC per unique human : ~24.6kB
  • Network storage per 128bits of div/SC per unique human : ~32.8kB
  • Network storage per 256bits of div/SC per unique human : ~65.5kB
  • Network storage per 512bits of div/SC per unique human : ~131kB

Relative to all the other data stored by a single user, any one of these divisibility levels requires negligible storage resources. If I am not mistaken, your balance method incurs O(1) constant time/bandwidth computational complexity per fractional transaction, regardless of size. If this is the case, then why not just let the people have a gratuitous number of bits for quasi-infinite divisibility to experiment and play around with? It then becomes a non-issue, forever, since people could never complain if the amount of divisibility they have been given is essentially infinite. In SAFE, it’s about wants not needs, right? It’s about secure access and letting the user decide, right? You may want IoT, I may now want infinity, others may want the network bots to serve up a cup of tea, but none of these are a need or requisite for launch or otherwise. Humans don’t really need computers either, but they’re fun tools to have around… :wink:

I was basically referring to any dollar similar to the USD, e.g. AUD, USD, EUR, GBP, etc. They are all within a few hundred percent of each other. For IoT etc. it is not unreasonable to consider nano-USD (or similar), and other currencies that have thousands or millions of units per USD might not call it nano, but hey, its value is still the same.

We typically talk of micro-transactions more based on value rather than actual 1/1000000. So to say nano-transaction we would be pegging that to something worth approx 1 nano-dollar.

We also have to assume some things, like that we won’t see hyperinflation, and that we are not going to see safecoin rising by 10 times each year or two repeatedly once it’s released. If it does, then we’ll have to get to designing ways to handle this. A market cap of 4 billion billion dollars would be amazing.

For safecoin to reach 1 billion dollars due to fiat inflation rather than the coin value rising would suggest we are in for very hard times financially indeed :wink:

I certainly was not trying to shut down any conversation on this, but was summarising my findings.

Another thought, is that you would not need to divide safecoin but any “coloured” coin could do the job. Design a ledger system based on MDs (or chunks) and implement the concept.

People could spend a safecoin to the APP and get 1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.000000000 infinite coins.

They could return the same number and get a safecoin back.

I did consider 2^32 division, but simple thought experiments suggested it would not provide nano-$$$ transactions once safecoin got to $5.

I agree that having more than 64 bits for division does not radically affect any storage requirements and could easily be implemented. Also, you could implement a 2nd version of the balance value (a field in an MD) where the network recognises V1 as, say, 64 bits of fractional value and V2 as 96 or 128 or ??? bits of balance, and uses the appropriate interpretation of the balance. It could then upgrade the balance field to V2 whenever it is given a V1 field, by changing the field to the V2 size.

I forgot to mention also that another factor in choosing 64 bits for the initial size of the balance field was that u64 is a standard variable size whereas u128 is not and requires double additions/subtractions. I am a machine-code programmer from way back and these things still haunt me.

True. I wasn’t advocating u128 so much as a vector/array of uint32.
So 64bit = uint32[2], 96bit = uint32[3], 128bit = uint32[4], etc. Unless I am mistaken, you would never need to do any multiplication or division, so the math on these is simple carry add/subtract… and since they’re not signed (negative currency = debt?) it would be even easier.