RFC 0061 — Safe Network Token Distribution

As we would be increasing the size, we can use a straight and safe conversion (into()) from u64 to u128, so it would be simple in the DBC code when bulletproofs etc. are ready. I.e. no need for two types or codebases here, as we can convert on the fly.
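A minimal sketch of that on-the-fly widening in Rust. The TokenAmount/WideAmount aliases and the widen helper are hypothetical names for illustration, not the actual DBC code:

```rust
// Hypothetical aliases for illustration; the actual DBC code may name these differently.
type TokenAmount = u64;  // current representation
type WideAmount = u128;  // widened representation for bulletproofs etc.

fn widen(amount: TokenAmount) -> WideAmount {
    // u64 -> u128 is lossless, so From/Into is infallible:
    // no error handling and no second codebase, just a conversion at the boundary.
    amount.into()
}

fn main() {
    let balance: TokenAmount = u64::MAX;
    let wide: WideAmount = widen(balance);
    assert_eq!(wide, balance as u128);
    println!("{}", wide);
}
```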

10 Likes

Interesting. Thanks for linking and quoting this. When I joined the forum, and in the time since, the oft-repeated explanations by prominent member(s) of the initial distribution maintained that only the ICO and MaidSafe investors would be the entirety of the coin supply at genesis, and that the remainder would be farmed over many years (I can point to these posts but don't want to distract or blame the OPs). I wish that MaidSafe had corrected those erroneous details of the initial distribution that permeated this forum, because it was clear that the white papers were irrelevant from a technology standpoint and that the forum had the latest info, so it made sense to also rely on the forum for other details rather than the obsolete white papers. Nevertheless, the fault is fully mine for having come to believe the 'MaidSafe investors and ICO as only initial supply at genesis' story. So mea culpa: the distribution proposed by the RFC is in line with the original proposed distribution.

With that said, I'm still opposed to the RFC, but with the understanding that my position is actually not in line with the original proposal (similar to my discomfort with Pt*). Outside of the unavoidable situation of the ICO and MaidSafe investors, and maybe a small tranche for core devs in line with the ETH example if you must, I would be a lot more comfortable with a distribution that is fair per the precedents set by Bitcoin, Monero, etc., which overall would put it in line with ETH's initial distribution. Anything more (i.e., the current RFC) puts it in line with the VC-controlled networks' distributions, and we know how centralized those are. If the network is centralized in the hands of a few, then any public/decentralized utility built on that network is also, by extension, at the whims of those few. I am personally far less enthused about building anything substantial on such a network.

If a fair distribution can't be solved technically in time, I would rather the entire supply go to the aforementioned two groups, because nameless thousands controlling the network's fate is better than a few members of a committee doing the same. Even the whale-dumping concern I found to actually be a positive, because it would spread the distribution and network control further.

I very much sympathize, being one of those waiting to build on Safe; I actually hired staff and eventually had to let them go as the network was repeatedly delayed, meaning that I've lost money so far on the endeavor. Still, I strongly disagree, for several reasons, among which the following three:

  1. It's the devs' responsibility to create something of true value. Having committees decide what's valuable rarely works. Committees are nearly always against truly innovative ideas (e.g., the vast majority of Nobel-level science was derided and not funded at the proposal stage). True innovation is only accepted by the crowds and lauded by the committees after it has shocked the gatekeepers. Given that, I think it best to let motivation, innovation and the market decide which project sinks and which swims.

  2. There are other avenues for funding. For instance, the Decorum project did not have trouble getting funded by the community/market, and they have made shrewd choices that have kept them on a strong financial footing to build the project. Would a committee have handled the Decorum situation better than they did themselves, or would it just have burdened them, if not outright derailed them? Similarly, a bunch of other (then future Safe Network) projects had no problem getting funded, but their story was very different because they made different choices. What would a committee have done? Tried to achieve the same outcome as Decorum's for these now-defunct projects, or funneled additional resources into trying to save them?
    Externally, the ETH ecosystem and others offer proof that leaving funding of community projects in the hands of the community is better than trying to control it centrally.

  3. We now have the benefit of hindsight, precedents, and the scholarship that other networks and decentralized-ecosystem participants have produced. MaidSafe and some forum participants rightly point out that the environment has changed from a regulatory standpoint, potentially requiring adjustments (by the way, the evidence shows that only true decentralization overcomes these headwinds). Similarly, the environment has changed tremendously from a technological standpoint (the Safe Network's technology has changed substantially from what was envisaged as being needed in 2014). By the same token, the economics and systems of control of decentralized systems have evolved substantially since that time (meaning there are a lot of lessons learned). It makes little sense to accept lessons learned at the regulatory and technology layers but ignore them on the economics and governance side.

14 Likes

This is an area where I am in complete agreement. This whole discussion is so very important for this very reason.

The tension I have is really between the ability to code a solution (none of us can see how we can right now) and the formation of a trusted org (the current route). I feel the same on a personal level: true innovation is the antithesis of the spreadsheet-driven, box-ticking agreements of some group.

So while I am conflicted, I am also comfortable that this is being investigated, and that the creation of the group is itself pushing the boundaries of innovation, so as not to be "that group".

My current thought is to code what we can, and not be daft about it (magic numbers etc.), and govern what we cannot. The latter part is as much an issue as coding a solution. I have recently been alarmed at humanity's attempts at representative democracies, with populism getting control, almost as much as at the "code is law" teams. Both seem to fail in drastic and dramatic ways.

So while this may seem like a cop-out, I hope it's not, and is instead seen as a real look into the abyss to find a workable solution.

I hope folks don't see any fait accompli here, as there is not one, at least I don't think so. I see a discussion that is progressive and hopefully looks at the reality of what is possible and, more importantly, what just looks like it's possible. I hope in doing this exercise we find those eureka moments, and for this reason I have no side in this discussion, but I am really intrigued by it.

19 Likes

1.) Farmers get a bonus under this scheme.
2.) MORE farmers stay on the network.
3.) The network farming algo reduces the charge for data upload, as there is an excess of capacity.

It IS a subsidy. @neo, we had this discussion a couple of years ago. Why do we need to have it again?

Just gonna throw this into the ocean:

People don't like the idea of giving "the reserve" to Maid token holders, as they fear the tokens would be dumped onto the market and cause price fluctuations.

That reasoning is dubious at the start, but regardless, there may be a solution here …

Encrypt reserve DBCs with a weak encryption layer. Then, before they can be spent onto the network, their individual encryption wrappers have to be hacked open (proof of work).

This eliminates the concern about dumping, and the encryption method can even vary in difficulty in order to make some harder to crack than others, thus limiting production over time.

It also eliminates the honeypot threat of the network holding large amounts of tokens.
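A toy, std-only Rust sketch of that wrapper idea, just to make it concrete. Everything here is an illustrative assumption rather than anything from the RFC: a real scheme would use a proper cipher and a verifiable payload structure instead of this XOR keystream, DefaultHasher and known-prefix check. The number of unknown key bits is the difficulty knob:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Derive one keystream byte from a (small) key and a byte position.
/// Toy only: DefaultHasher is not cryptographic.
fn stream_byte(key: u64, i: usize) -> u8 {
    let mut h = DefaultHasher::new();
    (key, i).hash(&mut h);
    h.finish() as u8
}

/// "Weakly encrypt" a reserve DBC payload under a key of `bits` unknown bits.
/// `bits` is the difficulty knob: more unknown bits means more work to crack.
fn wrap(payload: &[u8], key: u64, bits: u32) -> Vec<u8> {
    assert!(key < (1u64 << bits));
    payload
        .iter()
        .enumerate()
        .map(|(i, b)| b ^ stream_byte(key, i))
        .collect()
}

/// Cracking is brute force over the key space: the "proof of work".
/// Success is recognised here by a known plaintext prefix; a real scheme
/// could embed a checksum or rely on the DBC's verifiable structure.
fn crack(wrapped: &[u8], bits: u32, known_prefix: &[u8]) -> Option<Vec<u8>> {
    for key in 0..(1u64 << bits) {
        // Cheap check first: decrypt only the known prefix.
        let hit = known_prefix
            .iter()
            .enumerate()
            .all(|(i, p)| wrapped[i] ^ stream_byte(key, i) == *p);
        if hit {
            return Some(
                wrapped
                    .iter()
                    .enumerate()
                    .map(|(i, b)| b ^ stream_byte(key, i))
                    .collect(),
            );
        }
    }
    None
}

fn main() {
    let dbc = b"DBC:reserve-tranche-payload";
    // 20 unknown bits: ~half a million tries on average; raise for later tranches.
    let wrapped = wrap(dbc, 123_456, 20);
    let recovered = crack(&wrapped, 20, b"DBC:").expect("key space exhausted");
    assert_eq!(recovered, dbc.to_vec());
}
```

Varying `bits` per wrapped tranche is what would make some tranches harder to crack than others, limiting production over time as described above.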

Ah, I see the error. The network only allows new farmers when needed, so this doesn't occur. All that happens is that the network will have enough farmers more often.

Because of changes, farmers no longer join whenever they decide. The upload cost only rises when there is a major lack of farmers.

Even if it were like before, it's simply a matter of adjusting the upload costs to prevent the artificial drop in upload costs. This was an implied situation if you pay a bonus.

Fair enough, but then you leave out market prices. Farmers will sell for a value equal to their costs plus some profit, so a bonus causes them to sell for less than they would otherwise need to.

Lower farmer selling price means lower price for SNT in the marketplace, which in turn makes it cheaper for uploaders to upload.

So it’s still a subsidy for the upload cost. You can’t get around the market.

Had a couple more thoughts on this, so adding:

As this is a proof of work AND as all encrypted tokens are in the wild, it will function to moderate the price of SNT automatically: as the price goes higher, cracking the encryption becomes relatively more profitable; as the price goes lower, cracking becomes less profitable.

This would stabilize the SNT price auto-magically, and is something Bitcoin can't do, as its rewards are fixed through its difficulty adjustment.
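To spell out the implied break-even condition, a hedged sketch; all names and numbers here are illustrative assumptions, not anything from the RFC:

```rust
/// Rough break-even test for cracking one wrapped tranche: expected revenue
/// (token price times tokens inside) versus expected work cost (cost per try
/// times the average number of tries over a uniform key space).
/// Illustrative only; real crackers would also discount for competition and risk.
fn worth_cracking(token_price: f64, tokens: u64, cost_per_try: f64, bits: u32) -> bool {
    let expected_tries = (1u64 << bits) as f64 / 2.0;
    token_price * tokens as f64 > cost_per_try * expected_tries
}

fn main() {
    // As the market price rises, previously unprofitable tranches become worth cracking.
    assert!(!worth_cracking(0.10, 1_000, 1e-6, 30)); // revenue 100 < cost ~536.9
    assert!(worth_cracking(1.00, 1_000, 1e-6, 30));  // revenue 1000 > cost ~536.9
}
```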

Yes, this is crucial.

I'd also just like to point out what the RFC points to as the end state, the Plan A: only Core Developer rewards (a tiny fraction of the overall Royalties), plus the costs associated with administering those funds, being administered by the Foundation. Everything else is handled by the Network:

  • Resource Rewards
  • App Developer Rewards
  • Public Data Rewards
  • The Remaining Supply

Core rewards, on the face of it anyway, are one of the few areas where automation is going to be tricky. But increasingly decentralised, community-driven approaches to the administration of Core Dev Rewards are the direction too.

11 Likes

That’s not the primary reason. The dumping just happens to be one potential side-effect.

The primary reason is the same as the squeamishness people have over the Foundation's custody: the centralising effect.

It's giving control of almost the entire Network economy to a small number of people, rather than allowing that control to be in the hands of users and contributors to the Network. Fair access that doesn't require you to have got in early, or to have had the foresight and financial means to invest, etc.

I'd also like folk to consider: if we put up the flag that 80%+ of the future Network economy is accessible via MAID right now, how might that open a vector for control or attacks in the future?

4 Likes

I suppose you’ve a poll then to back that up? … That’s what I thought. Sorry for the snark, but you cued yourself up for that one.

a small number of people who for the most part want to get the most for their tokens … meaning they, for the most part, have the best interest of the network in mind.

This borders on paranoia to my mind (assuming I am understanding what you are saying correctly). Maid hodlers are the best out there - you can see this because of the low volumes - we don’t want to trade it for other things. The evidence is that we do have the best interest of the network in mind.

It's those who come in later (post launch) and use large amounts of capital to buy mega stakes - often via OTC markets - who then start messing with things.

If the network lets out too many of the reserve too early, then it seems to me that SN could suffer more, not less.

Again, we're providing for those risks in the RFC.

To some extent. Other suggestions have been made to improve upon that.

EDIT: wondering if it's possible to get past just defending the RFC and instead attack it to find its weaknesses? And to then also debate and explore alternatives.

1 Like

Committees have a bad image in this context, but maybe we're not talking about "a committee"? That seems to be an assumption rather than a proposal, and maybe a) not all committees have to be bad at this, and b) there are ways of doing this that are not done justice by the term committee.

@bogard @dirvine I'm guessing that one reason committees are seen as risk-averse is that that's how they tend to be incentivised. It's actually designed that way, because those providing the funds want value and see that as the way to get it.

It's a bit of a cliché too though, isn't it? I don't know much about organisations which have been set up to fund innovation, but I suspect there are good examples. Aren't technically innovative businesses doing exactly this, and also some VCs? I have a couple that seem relevant from my own experience, both of which I now see were practising a form of decentralisation, so that's something to note: maybe decentralisation can be applied to the Foundation.

Example 1 - Contract Research and Development

I recall how this company I worked at had a flat structure of technically specialised groups (optics, digital electronics, analogue electronics, mechanical engineering, software, ground radar etc.). These groups were kept small; they often started with one or two people, grew to about ten, and would then shed or split, or, where a technology was no longer in demand for the projects that clients brought, a group might be disbanded and its engineers distributed among other groups. The groups provided a mix of skilled labour including specialists, generalists, problem solvers, creative thinkers and even the odd inventor. All were technically qualified, selected for potential, and most had a mix of scientific, engineering and technical management expertise.

The projects brought people from different specialist groups (optics, digital electronics, etc.) together to solve client problems, such as developing innovative products, prototypes or one-off solutions that required expertise the clients didn't have or that was beyond their in-house capability.

A group member might spend anything from an hour to man-years, depending on a project's needs and their interest. Each group had a leader whose job was to develop and sell the skills of their group. Each member could be asked to do whatever they were capable of providing, often choosing what to work on, and was supported in developing new skills and interests, whether in the area of the group or not.

Example 2 - Racal Group: multi business technical products and solutions

Before I landed at the first example, I interviewed at Racal Group, which took a similar approach but at the level of the business. They would spin off a new company to provide technical solutions, with a blend of technical and marketing expertise targeted at one kind of solution or another. When a company grew above a certain size it was forced to split.

I suspect the flat structures of these two examples (which used largely self-managing units) are common in innovative environments, and that the necessary features could be applied to the distribution of funds by a Foundation.

I also did some due diligence work for a VC so I know there’s also a role for outsourcing, for example when the Foundation doesn’t believe it can assess some aspects of a proposal.

Maybe VCs would be a good model here, perhaps tweaked for our particular purpose? Whatever the case I think having the right people design this aspect could be crucial.

Maybe there are other successful approaches that I don't know about too? I'm only talking anecdotally, so I don't claim expertise on that question.

Decentralisation isn’t just for code!

Both the examples I described used decentralisation of control to foster and reward innovation and creativity. The subunits in each case were accountable for agreeing business goals, their plans and of course progress, and had a lot of flexibility in how they went about it, though with oversight. But oversight by people who were of like mind, who had been there and done exactly the same thing, rather than people from outside who lacked that experience of delivering innovative technical solutions.

We might be able to learn from this, not just for the Foundation but also in any automated or semi-automated mechanisms for distributing funds.

This idea isn’t new. David has been talking about seeding ‘pods’ to decentralise development for years but now it looks like the ‘game is afoot’!

9 Likes

It’s funny, and also exciting, that you should bring these models up, because it’s these very types of ideas we’ve been pondering and discussing internally with regard to the functioning of the Foundation.

This is certainly a discussion that is going to be more specific to the governance proposals and white paper, but yes, we are looking at flattened and modular approaches to how the Network and protocol could be stewarded and guided along their journey.

And it's not just the Foundation either: decentralised approaches to how MaidSafe, or its future incarnation, might operate too. It's exciting times.

I couldn’t agree more.

9 Likes

Decentralization is key of course. That's why I've been a supporter for so long. I'm not clear on how this RFC is about that. Am I wrong that the likely result here is that the Foundation, run by people, will hold potentially billions in capital? Because, no matter how good the intention, that level of temptation is beyond most every human being, and having that much contained and controlled by a small group appears far more centralized than the alternatives.

Or have I missed something here again?

3 Likes

We're all working on the basis of assumptions about the Foundation, because we don't have detailed proposals on that. Our comments are, I think, useful, but I doubt many of the concerns are new to MaidSafe, given the long-term commitment they have to the values and goals that the community has formed around.

2 Likes

This is not clear, but it's also not the intent. So the initial distribution, via airdrop or whatever, will happen, potentially with human involvement, but under strict rules and done quite quickly.

Then the Foundation is looking at the 5% core dev rewards as they happen.

So it's a question of how we distribute these to as many devs as possible, from all fields and geographies. It will mean encouraging devs and so-called competitors to MaidSafe (I call them partners, whether they see it or not).

Then the app dev rewards …

So it's a dive into how that conduit is managed and how that conduit generates more users, wealth and, most importantly, value for humanity.

This is where there are no fixed rules in place, but we are looking for a path that lets this whole project grow for every person on the planet.

So we need to stay diligent, stay a bit wary and check under every stone, so that there are as few ways as possible to screw it up or have folk take control or steal etc.

8 Likes

And the greatest "subsidy" (flow-on effect), because all those tokens will drive the fiat price right down: 85% instead of 15% immediately available. The bonus's flow-on effect on the fiat cost to upload is tiny by comparison.

This is using the same logic. That is why a bonus paid out over time and tapering off is a better way than dumping the whole 70% at once into the available tokens out there.

If you, say, raise the SNT required to upload artificially, then you reduce the value of the 85% back to what the 15% would be with a bonus (or no-bonus) release, but with no more to be released. I.e. you kill off the price.

Regarding dev funding, I think it's not a good idea to offer money for promises. Better to just build a sound system and then allow those who want to build on it to earn enough to do so … otherwise we are inviting scammers, or simply people who do not have the motivation to follow through. This is partly why so many projects in the crypto space and, to a lesser extent, the VC space, have bad reputations. The phrase "easy come, easy go" is a reality for young motivated people who think they can change the world, then discover it's a lot harder than they believed.

Another way of looking at it is to say that there are no shortcuts. People who want to make a real difference by building something new need to prove themselves capable first, and that's generally done through bootstrapping, not getting a hand-out.

So I'm not too impressed with the dev fund. Happy for the MaidSafe team to have a chunk for their efforts; it's been a long road and you guys deserve it for sure. But I worry very much about all the bad scams that will certainly come if money is handed out, no matter how carefully any 'committee' thinks they've evaluated the recipient.

It's not the same to the market. Dumping it all at once early on will drop the price down and set the level from the beginning; it's not the same at all as having a bonus that tapers off.

My preference here would be for a subsidy though, so don't think I'm voting for the hodlers to get it and dump it at the start.

I want a system that trickles it out over a century at a flat rate. If it doesn’t all ever get used because in 50 years SafeNetQuantum comes out and takes it all over, that’s fine.

I suggested what I thought might be a great mechanism for this as well, but nobody responded to it either way:

2 Likes