Like I said @mav, way above my head, but maybe this helps a little.
Btw, should bidding not come at a cost, I mean staking SAFEcoins?
We’ve been thinking of developing some sort of game / test for the bidding idea but not sure yet about the direction. If anyone has ideas about how to gather data or test the bidding idea I’d love to hear.
Reminds me of game theory competitions in my days playing with genetic algorithms (late 80’s). I never implemented this but they were interesting to learn about. (Genetic Algorithms in Search, Optimization and Machine Learning, by Goldberg, has a very good section on it).
People would have different hand crafted algorithms compete in a software environment, which naturally leads to automatic optimisation and evolutionary algorithm strategies. In a bidding scenario I think this is a very suitable approach as I imagine that is exactly what would happen. Things like GAs are clearly an interesting option to try, amongst other ideas as David has already mentioned, in order to create winning strategies, or to find weaknesses in the environment itself.
I’ve long been looking for an ideal scenario to apply GAs to, and this is certainly one.
So, create an environment which can run competing algorithms within a simulation of the network environment of competing vaults. Automatically vary/evolve strategies and have them compete for rewards, or to stress the environment using a mix of different bidding strategies and ways of varying bid strategy, including self optimisation.
In my day computation was a limit. I nearly got to play with a Connection Machines parallel supercomputer to try and optimise a tricky oil industry problem, but today obviously computation has moved on and it may be feasible to simulate this on a PC.
It’s a big field, which is why I’m not sure bidding is a simplification - although as the network is an emergent system I think it is going to be complex regardless of the reward system.
gotta give a huge kudos to @oetyng and @mav for developing and thinking so deeply about this idea! Like @happybeing it really makes me think of things I studied in my youth when I was just “doing what I love” but later was like omg why did I pay so much money to study something so useless. Now I am inspired to go water that withering bush and see if it will still produce fruit!
The potentials here are exciting. Again, it only becomes possible on this unique network/vault structure, so we’re definitely in terra incognita.
Hopefully this helps. If the Network is broadly distributed in terms of farmers at launch, then this sort of collusion is less of a concern. However, it is more likely that farming will be relatively concentrated at launch. For example, what percentage of the populace would know about and be ready to farm from day one? Think about the type of person who would be primed for this and have the resources (e.g. informational, financial, etc.) to readily participate. Here, first-mover advantages would actually apply and carry weight.
Say farming is relatively concentrated at launch. If groups arose that provide a significant chunk of farming, they could enforce their own rules regarding bidding. They may, for example, seek to artificially keep the reward/price low so as to deter others from entering the market. This assumes that farming increases in efficiency at scale. Many individuals would either stay out of farming because of the price suppression or join these pools because it increases their likelihood of seeing some reward. This is what I mean by colluding to keep the size of the pie small in order to have a bigger slice.
Put another way, a group of HBS students were asked whether they’d rather live in a world where they earned $100K and everyone else earned $50K, or one in which they earned $250K and everyone else earned $200K. They chose to live in the first world because we (erroneously) perceive wealth in relative terms.
Since Safecoin can also be exchange traded, how do market forces impact this thinking? Could the market (i.e. the mechanism of exchange) determine Safecoin’s price while the Network need only determine the conversion rate for purchasing resources and receiving rewards? In which case, the conversion rate would be dependent on supply/demand for Network resources, which in turn can be agnostic of market price.
Otherwise put, why can’t the Network simply control the exchange rate such that the reward/price for Network resources fluctuates based on supply/demand of said resources? In such a model, human intervention (i.e. bidding) is not necessary. This of course would require setting an initial SAFE:PUT exchange rate and laying down rules for understanding Network supply and demand. You’ve done some interesting thought experiments around that, like Polls: How much will you spend? How much storage do you need? etc and Exploration of a live network economy.
The allocation of rewards for providing Network resources could either be fixed (x supply guarantees y reward) or probabilistic (x supply provides z probability of receiving y reward). Although this approach could still see concentration in farming supply due to sheer economies of scale, it at least would remove the ability of individual entities to directly manipulate reward value and allocation.
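To make the probabilistic variant concrete, here is a minimal Python sketch of one possible reading (the uniform draw and the x/total odds are my own assumptions, purely for illustration):

```python
import random

def probabilistic_reward(supplied, total_supply, reward, rng=random):
    """Sketch of the probabilistic allocation described above:
    supplying x units out of the section's total gives an x/total
    chance of winning the fixed reward y on a given reward event.
    """
    # A uniform draw decides whether this supplier wins this round
    return reward if rng.random() < supplied / total_supply else 0.0
```

Over many reward events the expected payout is proportional to supply, so the fixed and probabilistic variants pay out the same on average; the probabilistic one just removes the guarantee for any single event.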
I perceive there is some reticence about introducing a human political influence into the network, and even if I understand the merits of the bitcoin-like automated approach, I believe we still depend on MaidSafe’s judgement, as well as that of anyone building future updates. Honestly, I believe that allowing the network to interact with humans and merging both kinds of intelligence will create a system that is much more adaptable and future proof.
I find the idea of using GET events to express the opinion of the farmers on the network really interesting, be it about the rewards or other matters.
This voting system could even be useful for updating the network or creating some kind of governance layer.
I am in love with the Tezos upgrade by consensus system so I might be biased.
I suspect that there need to be network-determined bounds (upper and lower) if there is to be bidding. What happens when the network comes close to the coin production limit? The bidders may have some indication of this if they monitor the global supply, for sure, and may bid accordingly, but those that don’t choose to track this may be taken advantage of - especially if the supply is pushed hard toward the ceiling.
Hence I think some sort of hybrid approach is needed - for the sake of giving the network the most information possible but also to conservatively manage the network.
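A toy sketch of what such network-determined bid bounds could look like (the linear tightening of the upper bound toward the 2^32 coin cap is purely illustrative, not a proposal for the actual formula):

```python
MAX_COINS = 2 ** 32

def bounded_bid(bid, coins_issued, base_lower, base_upper):
    """Sketch of a hybrid approach: the network clamps farmer bids
    to bounds it controls, and tightens the upper bound as issued
    coins approach the cap, so bidders who don't track global
    supply can't be taken advantage of near the ceiling.
    """
    # Fraction of the coin supply still unissued, in [0, 1]
    remaining = 1 - coins_issued / MAX_COINS
    # Upper bound shrinks linearly toward the lower bound as supply runs out
    upper = base_lower + (base_upper - base_lower) * remaining
    return min(max(bid, base_lower), upper)
```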
Yeah, I think one interesting result of this is that your ability to participate in voting is increased the more popular data you hold.
So, the more data you hold, and the more popular it is, the more GETs you receive, and with every GET you are able to include your votes.
So, basically, the more valuable you are to the network, the faster vote updates you get to do. Quite cool IMO. [had to edit that, to be more precise, it can be a very different thing]
Now, it is not entirely clear at the moment how valuable it is to have higher rates of voting. But one thing at least, is that you will be able to follow market sentiment better (less delay), that way having a better chance of being close to an NB when it arrives, thus getting higher rewards.
The simulation allows you to loop through the size of a section (60-120 nodes) and watch their (somewhat) random bids and rewards plotted out as (x,y)-coordinates, with the x-axis being the bid and the y-axis being the reward.
Remember that the Neighbour Bid (NB) is what they want to get close to, and the NB is then split up according to the reward distribution (the sum of all rewards plotted, will be the NB).
There is a slider for the NB as well.
If you want to try a steeper or flatter distribution curve, go down to the Probability Density Function folder and adjust u with the slider.
There are a couple of other bid distributions that can be used as well, where the majority go above or below NB. The one that is used has a large part centered around NB, though still quite a few out at the edges. They all deviate at most +/- 10 % from NB.
Here are some notes from when I implemented it in code:
// Sorting bids into exponentially differentiated buckets:
// take diff between bid and NB
// pipe through tanh (a zero centered "sigmoidal" function)
// sort into buckets using PDF function
// the bucket represents a share of the reward
// every participant in the bucket splits the share between them
// The aim of using bid-NB diffs is to equally favor closeness, regardless of sign.
// The aim of piping through tanh is to map all possible bid-NB diffs into the PDF argument range.
// The first aim of PDF is to make reward proportional to closeness.
// The second aim of PDF is to establish an exponential and continuous distribution of reward.
// The aim of sharing in buckets is to keep bids from clustering.
// The collective result of the above aims:
// - promotes keeping close to the common sentiment (favors passive bidders)
// - promotes unique bids by decreasing the reward per bidder as bids cluster in buckets (favors active bidders)
// - promotes defectors when there is collusion
// -- (ie. a close participant is rewarded most, when all the others are far away)
// ***
// Higher rewards give more participants
// but skewing highest reward away from closeness promotes bid movement - which eventually affects NB and through that attracts or repels participants.
// So.. it seems skewing is just an indirect way of directly weighting reward?
// The difference is that skewing promotes those who at that time are helping the network,
// while directly adjusting rewards for all, relatively, rewards those who are less aligned with network needs.
// The skewing does not impact the NB as fast as the weighting does.
// So maybe the best result is achieved by combining reward weight with distribution skew,
// as to rapidly affect NB, as well as promote those who are aligned with network needs.
// (Could the combination of the two reinforce the effects too much?)
// The bucketing is more attenuated when NB is lower.
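Those notes could be sketched roughly as follows in Python. The bucket count, the exp(-x²/u) PDF, and the default u are placeholders of mine, not the actual simulation code:

```python
import math

def reward_shares(bids, nb, num_buckets=10, u=1.0):
    """Sketch of the bucketed reward distribution: each bid's
    distance from the Neighbour Bid (NB) is squashed with tanh,
    then mapped to a bucket weighted by a zero-centered PDF.
    Closer bids land in heavier buckets, and everyone in a bucket
    splits its share equally, so clustered bids earn less each.
    The shares of all bidders sum to NB.
    """
    # tanh maps any relative bid-NB diff into (-1, 1), treating +/- equally
    squashed = [math.tanh((b - nb) / nb) for b in bids]

    # PDF weight for a point x in (-1, 1); smaller u gives a steeper curve
    def pdf(x):
        return math.exp(-x * x / u)

    # Sort bidders into buckets by squashed distance
    buckets = {}
    for i, x in enumerate(squashed):
        k = min(int((x + 1) / 2 * num_buckets), num_buckets - 1)
        buckets.setdefault(k, []).append(i)

    # Weight each occupied bucket by the PDF at its center
    weights = {k: pdf((k + 0.5) / num_buckets * 2 - 1) for k in buckets}
    total = sum(weights.values())

    # Every member splits the bucket's share of NB equally
    shares = [0.0] * len(bids)
    for k, members in buckets.items():
        per_member = nb * weights[k] / total / len(members)
        for i in members:
            shares[i] = per_member
    return shares
```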
Wow, you’ve been busy @oetyng. A lot to go through here since my last post. A few comments/thoughts:
The network doesn’t need to know “why?”, it only needs to know whether the farmer resources (storage, bandwidth, latency, compute, elder counts, etc.) are increasing, decreasing, or constant/steady and what the current quantity is relative to system load or other targeted setpoints.
More is not necessarily better if it is just noise from farmers playing games. A “hard-coded” farming rate algorithm can be adaptive and flexible.
It might be fine to start with. In my view all major resource categories required to run the network should have their own reward rate. These include storage, bandwidth, latency, memory, and compute. In other words, if there is a resource proof for some farmer/vault performance trait, then the network should be offering a price for it.
True. Specifying a target growth rate from the beginning is the naive approach, but it offers a facade of predictability that is attractive to those in the crypto space, and offers a simple way to motivate the network pricing algorithms. The optimal way is to have a means for objectively computing the current network growth rate, and then vary all inputs to the pricing function in real time in order to maximize growth at this instant. In the first scenario the best you will ever achieve is what you’ve selected as your setpoint, but you’ll likely fall short of it. You may not care if your goals were high enough, “shoot for the moon, at least you’ll hit the stars… etc”. In the second case, you’re adaptively determining what the absolute best is, so “hakuna matata”. Regardless, having a bidding process driven by the farmers is not the way to make any of this work. Instead, you would want to give the bidding power to the network. The network could have a range of “ask” prices for resources, and farmers would reactively bid to accept those prices for a certain amount of network time, or leave. In a sense this is a fuzzy “take it or leave it” approach.
Not true. It is biomimetic and mathematical. Consider Fibonacci’s rabbits; they are a perfect analogy for section splits. It’s just what happens when you have successive binary divisions with no loss. That’s why it’s considered optimal growth in living systems. A few billion years of evolution has shown Fibonacci growth to be favored for the survival of living things. No need to reinvent the wheel here for synthetic life, just include it as part of the design. From my perspective a target growth rate is how SAFE establishes its own environment. We know that network growth and size is critically important to the success of the network. Some security issues that would require a lot of effort to mitigate in a small network become insignificant for a large network. Specifying a targeted network growth rate from the beginning is a simple way to give purpose to all the pricing algorithms that determine PUT and GET costs. A crude analogy is the cruise control in an automobile. You set the desired speed, and the throttle is increased or decreased to match the wind load or hills you encounter.
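For what it’s worth, the rabbits analogy can be sketched in a few lines of Python (the maturing/splitting rules below are one reading of the analogy, not a SAFE spec):

```python
def fibonacci_section_growth(periods):
    """Sketch of the rabbits analogy applied to sections: a freshly
    split ("young") section takes one period to fill before it can
    split, while each "mature" section splits every period into one
    mature and one young section. The total section count per
    period then follows the Fibonacci sequence.
    """
    mature, young = 1, 0
    totals = []
    for _ in range(periods):
        totals.append(mature + young)
        # young sections mature; each mature section spawns a young one
        mature, young = mature + young, mature
    return totals
```

Note that Fibonacci growth is still exponential asymptotically (the ratio between successive counts tends to the golden ratio, about 1.618), just slower than straight doubling.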
I think that as a general rule we need to pick the right battles and always give SAFE the high ground. For example, consider two options for a perpetual auction. A) The farmers bid to determine what the farming reward should be and SAFE needs to give them what they ask for, or B) SAFE decides a range of prices at different volumes and the farmers bid to accept one of those or leave. For option A, no solid constraints will protect you from edge cases. In contrast, option B keeps SAFE in control while also maximizing farmer participation beyond the non-fuzzy take-it-or-leave-it scenario.
Yes, see above. Non-linear controls optimization, multi-objective optimization to maximize the current growth rate or other objectives, subject to the constraint that it cannot exceed a target growth rate etc. Possible to eliminate the constraint and let the network growth rate be unbounded, but might not be prudent…
No. I just think a framework where the farmers have direct control over the pricing is not as beneficial to the network as one where the network directly controls the price.
None of those things matter with regard to the farming reward. The network can’t offer to go to one’s home and fix the computers or restore power (yet). All it can do is raise the price it offers higher and higher to incentivize as much participation as possible. If those scenarios happen, farmers aren’t going to be sitting at their computers demanding more safecoin from the network before they come back online. They won’t be online, period. The network always has to be operating, waiting, keeping all the data safe and secure. Which is why it needs to be in direct control of pricing in coordination with all its other tasks, and the only farmer-provided information it can really count on is resource availability - right now.
I think there’s some value to knowing why a node has departed. If the network is going to look after itself it could do that best with high quality communications from the participants. How that exact messaging is done, I dunno yet. Lots of options.
Should the network only value things it can measure?
This touches on a very important point - promises. Bitcoin promises digital scarcity (in this case 21M coins max but that’s just an implementation detail). Basically everything else in the design of bitcoin stems from the promise of digital scarcity. That’s their core unique offering. The implementation of difficulty adjustment periods, mining, block times, fee market etc all exist only because of the scarcity promise.
What promises should SAFE be making? To my thinking the key promise is Perpetual Data. That’s unique to SAFE. Nothing else offers that. So the economy should be designed to give confidence to that feature. This matters because a fixed growth rate of resources is probably a stronger promise for the goal of perpetual data than a variable growth rate. I think fixed growth rate probably gives sub-optimal growth, but it does increase confidence in the promise.
Digital scarcity is another promise being made by SAFE. Is there a potential conflict between these two promises? How can we address that? Who decides?
On the topic of PAC, the promises become … weaker? stronger? It’s a really hard question to answer.
I don’t use storj or IPFS because the promise of data retention is too weak. The growth of SAFE is going to be very strongly tied to the promises it chooses to make.
I think it’s a good idea for us (both sides of the debate) to establish
The simple argument I would start with is: data is growing exponentially, not Fibonacci. So why use Fibonacci growth for the network?
Just testing the waters here, should people decide the growth rate or the network? Maybe another way to ask the same question is what’s more important, cheap abundant storage or a predictable growth rate?
What are the edge cases? Genuine question.
I feel a dystopia meme is needed here…
I don’t think having the network in control is necessarily better. If the world wants to migrate to SAFE asap the network should not be able to say ‘wait a sec’.
A fixed algorithm is necessarily exclusive rather than inclusive. I lean toward inclusive every time. Yeah we’ll have to include the malicious people but I accept that (kinda the point of SAFE isn’t it).
Which framework is more beneficial to the end users? A fixed algorithm or bidding? Really tough question I know, because it’s about security as well as growth, so maybe we should also explore how fast the network can grow before that growth becomes insecure. Is slow growth more secure than fast growth? Is growth correlated to security at all? Why is fixed growth desired? This is a big zoom-out on the topic but I think it’s needed. Maybe I’ll expand on this later.
I don’t want to benefit the network, I want to benefit users. They feed into each other but in the end I have confidence that users are always in a better position to address their problems than the network is. Why do users start using the network in the first place? As a way to address their problems. The network is for the users, not the other way around.
Hopefully this is a coherent response but I’ll have a deeper think about it and come back to you with some more strongly distilled ideas
Just a comment on this.
Ultimately we would still be in control because we update the s/w and can change the algo through updates.
Actually the network needs to be able to say “wait a sec” because too fast an influx could make the network unstable or attacked easier. I understand you were meaning something slightly different but yes the network needs a measure of control over rates of increase etc.
Growth curves are particularly well understood in biology population stats … and probably a lot of related fields. I wonder if it is possible for the network to use some sort of curve fitting algorithm to determine approximately where on a standard growth curve the network is, and then use that information as part of this farming reward determination mechanism?
As opposed to just using a fixed curve assumption.
E.g. taking the data up to “now” and then fitting it to a standard growth curve.
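E.g. a crude stdlib-only sketch of that fitting step, assuming an exponential phase (a real implementation would try several standard curves and compare residuals):

```python
import math

def fit_exponential(times, sizes):
    """Sketch of the curve-fitting idea: fit sizes ~ a * exp(b * t)
    by ordinary least squares on log(size). The fitted rate b is a
    rough estimate of where on a growth curve the network sits; b
    shrinking over successive fits would suggest the S-curve is
    flattening, which could feed into the reward mechanism.
    """
    n = len(times)
    logs = [math.log(s) for s in sizes]
    mean_t = sum(times) / n
    mean_l = sum(logs) / n
    # Slope and intercept of the log-linear least-squares fit
    b = sum((t - mean_t) * (l - mean_l) for t, l in zip(times, logs)) / \
        sum((t - mean_t) ** 2 for t in times)
    a = math.exp(mean_l - b * mean_t)
    return a, b
```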
That information is being piped through the chosen economic model. So if you want the network to know the details of exactly why a node leaves, then you need to have a detailed pricing structure for each characteristic (storage, bandwidth, compute, etc.)
Two types of information are available to the network, explicit and implicit. The explicit is that which is directly measurable or given a metric, or dictated in code. This is easy to work into a pricing structure. The second is implicit relationships or metadata inferred from the explicitly measured information. These are harder to quantify. I think it’s ok to say that if it can’t be numerically measured, then it doesn’t exist as far as the network is concerned. At least for starters…
Your analogy to Bitcoin was spot on here. The digital scarcity, the halvening, these are driving functions where Bitcoin has set the stage and created the environment within which market participants interact.
This is exactly what I was getting at, but you said it better. Picking a target growth rate is sub optimal, but it is transparent and easy for the layman to understand, like “the halvening”.
If you recall I said “pick your poison”. Fibonacci seems reasonable, but I’m not married to it. Continuous optimization for max growth is nice, but it needs to be properly constrained. Harder to explain and not as transparent as a promise. Could be with the right marketing. The problem is that it is completely unpredictable. Setting a target growth rate that the control algorithms continuously push and pull towards is more conservative and builds trust.
Again, pick your poison. (Linear, hyperbolic, geometric, Fibonacci, exponential etc.) An exponential growth rate has interesting properties as well, but could be a hard master to satisfy. I suspect (pure conjecture) the unconstrained optimization scenario I mentioned would probably end up giving exponential or double exponential growth… at least over short periods. The point of the argument is not what the target growth curve is, but that it might be good to design the economic model around one.
An analogy is like asking the question, “Should we decide what speed to set the cruise control at when driving from point A to point B down the highway?” If your answer is yes, then you have a very good idea how long it will take to reach your destination. My hypothesis is that a predictable growth rate will ensure cheap abundant storage.
The first might be overblown in this topic, the second not so much.
I disagree. The network has a responsibility to the data it contains first, and the whims of the world second. The network needs to be able to say ‘wait a sec’ if that is what it needs to do in order to keep all the data safe and secure.
If by users you mean clients, then a ‘fixed’ algorithm will offer the lowest PUT price.
That’s why you specify a target growth rate from the beginning to constrain the optimization.
No. I think you need a middle ground that is “more than fast enough”.
Yes, absolutely. Grow, fast enough, or die.
Not really. It is a symbiotic relationship. Or more accurately described as a cybernetic symbiosis. Users and farmers are at different ends of the supply chain and the network is the middle man to end all middle men. The farmers feed the network with a flow of resources, the devs feed it capabilities, the producers feed it content. The network feeds those resources, capabilities, and content to the clients. Safecoin flows in reverse. The network is the market maker and taker and the only entity that observes both sides of all transactions, in addition to all other network conditions. For these reasons it needs to be the ultimate authority on PUT and GET prices to ensure its own survival in the market.
Sadly, I’m a bit busy these days and I don’t have time to read everything so I’m not sure if this was addressed yet, but my impression is that “bidding” as a category is a viable method when one side is buying something from another side.
In our case however, we’d have to trust a possibly incomparably stronger side with deciding how much “charity” to hand out, not even in exchange of something but as a nominal “thank you” for a GET that’s already fulfilled.
I’m a bit busy as well, have a few things to respond to in this topic
But, I wanted to just say shortly, that “bidding” is not a term I consider 100% accurate for what we’re doing here. This is a new type of interaction, in a new type of environment. For that reason, I don’t think you can say it is more or less viable based on what it has been used for previously.
When repurposing something, or inventing, you just find the way that it could work in the new setting, and it becomes a new thing.
I would call this phenomenon more of an “estimation”. What is it that is being estimated? We are estimating what we believe to be everyone else’s belief about what everyone else believes. It is very similar to a Keynesian beauty contest.
And so, really, this is more of a contest, than an auction, and as you might well know, there are no limits to what games we can create.
I feel that kind of mindset is more powerful when we try to find new ways in a new system and concept.
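A toy Python sketch of that Keynesian-beauty-contest dynamic (the “move halfway toward p times the average” update rule is just an illustrative best-response model, not how NB bidding would actually work):

```python
def beauty_contest(guesses, p, rounds):
    """Toy Keynesian beauty contest: each round, every player moves
    halfway toward p times the previous round's average guess.
    With p < 1 the guesses collapse toward zero; with p = 1 (closer
    to the NB situation discussed here) they instead converge on
    the group's shared sentiment.
    """
    for _ in range(rounds):
        avg = sum(guesses) / len(guesses)
        # Each player partially best-responds to the last average
        guesses = [(g + p * avg) / 2 for g in guesses]
    return guesses
```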
With bitcoin there’s a property of it being ‘only money’. So people who buy and sell on exchanges and never touch the blockchain are still ‘doing bitcoin’ (for their purposes). Bitcoin would be turning away unbelievable amounts of people if it wasn’t for the possibility of offchain activity.
But SAFE is not ‘only money’, it’s data, and we can’t really expect that to move offchain like with bitcoin. That’s the whole point of the network, to put everything onchain, and to really suck it all in, not leave any reason to stay on clearnet.
If growth is predefined (even flexibly predefined) there’s a chance SAFE will not become a storage layer but just a coordination layer. It will be too time consuming or expensive to get data on the network so people will use it mainly for coordinating direct data transfer between each other.
Like bitcoin hash rate? There’s no problem there, so why for us?
This is the bitcoin mining growth curve (source):
I think this curve is a) incredibly difficult to predict if you’re in 2009, both the shape and the magnitude and b) indicative of possible growth in SAFE (ie uploads, downloads, storage, bandwidth).
Right… we can’t know or agree on the growth beforehand, so let’s design around predetermined growth. Sounds a bit paradoxical.
My hypothesis is that a floating growth rate will ensure cheap abundant storage. We are at an impasse…
A controversial way of framing the network-as-an-actor is, what if MaidSafe prefarmed all 2^32 coins and handed them out in some specific way to network participants. Why is replacing ‘MaidSafe’ with ‘The Network’ a better result?
Predictable may be a poor choice of wording, but it is somewhat accurate. A better term is “target” growth rate. Much like how bitcoin determines the difficulty, there is flexibility in looking at current conditions over a certain period (e.g. a number of PUTs to represent a duration of “network time”) and adjusting the target growth over the next period based on these or longer term observations. In the next period (which could span weeks, months, or many years) you use these targets to drive the control algorithms that adjust prices/rewards. The growth becomes predictable to the extent these controls are effective. I agree with you that it would be extremely difficult, if not impossible, to predict that curve in 2009. However, this unpredictability is a feature of the BTC mining algorithm, and so the same unpredictability is not necessarily SAFE’s destiny.
There is a big difference between SAFE and Bitcoin with regard to mining/farming control algorithms. Bitcoin is analogous to a rudimentary “open loop” control system in that it adjusts the mining difficulty according to the hashrate to maintain a predictable/target rate of coin discovery, but ignores all other factors. SAFE could be designed to operate the same way with a predictable rate of coin transfer to the farmers. That option is pretty boring and low performing. With SAFE we have the opportunity to form a far more powerful “closed loop” self-exciting/self-inhibiting control system. There are a lot more levers to pull with regard to PtF, PtD, PtP, PtC, GET and PUT rates. And these offer serious potential for maximizing growth in a way BTC or any other project never could. When designing these systems, decisions need to be made as to what the objectives are, and what the constraints are. Otherwise you end up needing 10,000 monkeys with typewriters and a lot of time on your hands.
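A toy example of the closed-loop idea in Python (the proportional update rule and the gain constant are made up for illustration; a real controller would be multi-variable, pulling all the levers mentioned above):

```python
def adjust_reward(reward, growth, target, gain=0.5):
    """Sketch of a closed-loop farming reward: a proportional
    controller nudges the reward up when measured growth falls
    short of the target and down when it overshoots, instead of
    paying a fixed, open-loop rate. `gain` is a hypothetical
    tuning constant.
    """
    error = target - growth  # positive when the network grows too slowly
    # Scale the reward by the error; never let it go negative
    return max(reward * (1 + gain * error), 0.0)
```

Run each "network time" period, this self-excites when growth falters and self-inhibits when things run too hot, which is the closed-loop behavior described above.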
BTC was given a prime directive of predictable coin release rate far into the future. All other BTC network properties such as fiat price, hash rate, difficulty etc. were either intentionally designed to enforce this predictability or emerge as a result of it. In my opinion, maximum network growth of SAFE is the objective we need to be looking at, with control algorithms that adjust reward or cost rates accordingly. In this scenario SAFE becomes an intelligent agent that is capable of self-regulation.
Below is a modified version of the image you posted above. I used your description of BTC hashrate as an example growth metric. Rather than use wall-clock time, the x axis is BTC transaction count as a proxy for “network time”. An exponential curve (green) and a Fibonacci curve (orange) have been included as example target growth rates.
The BTC growth curve in blue shows a network controller having difficulty maintaining its growth target. Under this scenario, a hypothetical control system for SAFE would be pushing the economics to follow the target curve. Below 3e8 transactions, the network would have benefited greatly had the faulty controller offered steady stimulus. From 3e8 to 3.5e8 transactions things are getting too hot and the network controller should have been limiting growth to build up its reserves. From about 3.5e8 transactions onward it should be pedal to the metal since growth is faltering.
In the interest of time this is a rather simple example. As @TylerAbeoJordan intuited above, it’s far better to fit targets and adjust controls incrementally over shorter periods of “network time” to improve adaptability. The chosen duration of the period can also be adaptive and determined by the network. “Optimal Control” is a well studied field in academia.
Or could an Elliott wave prediction scheme be used?
Bitcoin’s looks like this: (full screen picture)
At the risk of sounding arrogant while simply stating the obvious: in this domain, if we include the need for prediction or forecast, even if in a loose sense, we are doomed to failure.
As @mav already noted, bitcoin’s adoption curve was completely unforeseeable. Nobody can predict if its price will double next week and nobody can reliably ascertain how much more mining that would attract and how quickly. It’s plain impossible because even if we can predict 99.9% of the price moves, the 0.1% biggest ones (that we can’t predict) will be more consequential than the rest.
So, we may as well not waste time on something that’s impossible and instead go into a direction that can at least theoretically work: reacting to changes in supply and demand in a way that would constrain the network to stay within a healthy range of parameters. I already mentioned something like this here:
Let’s also divine Saturn’s influence for good measure
Have you seen proof (a record of trades) from anybody that they made money using that method, reliably, time and time again, year in and year out? If not, you have no reason to believe there’s any merit to Elliott’s idea. Basically, he was just another guy who thought he could find more information in the signal than was really there.