SAFE Storage economics - one-time fee, forever service

Thank you happybeing!

then, my initial question and concern are still valid. How is the network going to pay for the ongoing cost of maintaining all this data? The economics of this look a lot like a pyramid scheme, or a Detroit car manufacturer’s pension plan…

Don’t get me wrong, I love the MaidSafe idea, and I am an investor. I do want this thing to succeed. I just see this one point as a potential fatal flaw, that seems relatively easy to fix.


These costs fall exponentially, so the cumulative cost curve approaches an asymptote. In other words, there’s a maximum cost to storing a chunk “infinitely”.
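To make the asymptote point concrete, here is a toy sketch: if the cost of keeping a chunk for one period halves each period (a loose stand-in for long-run storage-price trends, not an actual network parameter), the total cost of storing it forever is a convergent geometric series.

```python
# Sketch: per-period storage cost halves each period, so the sum over
# all future periods converges to a finite "forever" cost.
# All numbers are illustrative assumptions, not network parameters.

def lifetime_cost(first_period_cost: float, ratio: float = 0.5, periods: int = 100) -> float:
    """Sum the per-period costs of a geometric cost series."""
    return sum(first_period_cost * ratio ** n for n in range(periods))

# With a halving ratio, the infinite sum is first_period_cost / (1 - 0.5) = 2x
# the first period's cost:
total = lifetime_cost(1.0)
print(round(total, 6))  # converges to 2.0
```

So under a halving assumption, storing a chunk “forever” costs only about twice what the first period costs, which is why a one-time fee is not obviously absurd.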

I understand your concern and I admit that I can’t provide you with proof that it isn’t a real problem. My confidence has been gleaned over 18 months of discussion on the forum and a good deal of reading before that, but I could certainly be wrong. This is stuff that has never been done before, so it is a gamble - but on the other hand, everything is changing right now, so even the status quo is IMO a gamble. Anyway…

I can summarise the main reasons, though, and they might point you to further research, as these topics have been discussed a lot on the forum:

  • storage technology has continued to advance at an exponential (or similar) rate for decades and we can expect that to continue over the longer term, though perhaps with pauses and bursts of course.
  • de-duplication means that we expect the network to end up with a surplus of payments from people who are ultimately storing the same data. For example, imagine backing up your whole system: all those operating system files from Windows and Linux and Mac users - many terabytes from millions of users being stored just once (well, four to six times per file in practice). Now add all those CDs and DVDs we purchased separately, or films people downloaded multiple times. Everyone pays for their storage, but the costs to the network flatten out once the first copy is stored.
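The de-duplication point above can be illustrated with a toy content-addressed store: a chunk’s address is the hash of its contents, so identical chunks from different uploaders occupy one slot while every uploader still pays. This is only an illustration of the principle, not SAFE’s actual self-encryption scheme, and all names here are invented.

```python
import hashlib

class ChunkStore:
    """Toy content-addressed store: identical chunks are stored once."""

    def __init__(self):
        self.chunks = {}        # address -> chunk bytes
        self.paid_uploads = 0   # every uploader pays, duplicate or not

    def put(self, chunk: bytes) -> str:
        self.paid_uploads += 1
        address = hashlib.sha256(chunk).hexdigest()
        self.chunks.setdefault(address, chunk)  # stored at most once
        return address

store = ChunkStore()
a = store.put(b"popular operating system file")
b = store.put(b"popular operating system file")  # second uploader, same data
print(store.paid_uploads, len(store.chunks))  # 2 payments, 1 stored chunk
```

Two payments, one stored chunk: that gap between payments collected and bytes actually stored is the surplus the post describes.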

Hope that helps. :slightly_smiling:


wasn’t the idea, just to let the first uploader pay for data, that gets deduplicated?

That was floated and I think @dirvine liked the idea, but it isn’t my understanding that this is how it will be implemented. I’m not certain though.

This thread might also be helpful.

No, that was rejected because it then allowed people to gain knowledge of what others have uploaded. While that knowledge is small it can still be damaging. Every bit of knowledge you allow to be gained from “meta” information can build profiles etc.

Thank you, dyamanaka,

I read through the thread, and it did not change my mind one bit. In fact, many posters who share my opinion on this topic came up with additional arguments that I had not yet fully considered. Anyway, if the cost to the user does not reflect the true cost to the network, then the network is doomed. It’s like price controls in the old USSR: you can pretend that a stick of butter only costs 10 cents, but if it really costs $1 to produce, then there simply won’t be any butter for sale, and people will spend their lives standing in line in front of empty stores.

It’s a real shame that such a good concept should be doomed in this way. It will take a fork, or a rewrite (maybe using something like Ethereum or Tendermint as a management and payment platform), to get it right…


This is not entirely true, and that is the beauty of the system. Some do not consider all the dynamics at once. I did some preliminary simulations many months ago that showed that, unlike fiat, the system balances itself.

For instance, one dynamic is that as coins are given for farming, they will find their way into two piles. One is “keep it for later” (hoarding) and the other is spending it on puts in the near future (even if it changes hands at an exchange).

Another is that not all puts ever result in more than one get. It could be shown that a good proportion of current immutable storage is backups that are never accessed, or private files rarely accessed. Movies that are accessed often will end up having most accesses satisfied by caching (no vault gets). And who watches old movies, like Leslie Nielsen’s “Forbidden Planet”, or any other old movie? Very few, really.

Another is dedup. That new popular video may get uploaded 100s or 1000s of times, but only one store actually occurs, so that popular vid really made the network 100s or 1000s of times the coins it would normally take to store that amount of data.

Another is that storage cost has been halving roughly every 18 months (a tenfold drop every 5 years) for the last 3 decades. And solid state storage may shorten that to a halving every year or so. And typically data becomes less used over time. So we have a reducing cost to pay out of each chunk stored.

Another is the buffer the coin supply provides, allowing time for farmed coins to be spent and for hoarded coins to eventually be used.

Another is that as the number of issued coins increases, the success rate of coin issuance on farming attempts decreases.

As the farming success rate drops, the fiat value of the coin should rise, and so does the value of the farmed coins. It is expected that this will at least make up for the reduced coin issuance, but experience shows that typically the fiat value rises faster than the farming scarcity does.
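One way to read the issuance throttle described above: if a farming attempt names a random coin address and succeeds only when that coin is not yet issued, the success rate falls as the issued fraction of the supply rises. This is a made-up toy model of that single dynamic, with illustrative numbers, not the network’s actual algorithm.

```python
import random

# Toy model: a farming attempt picks a random address in the coin space
# and succeeds only if that coin is still unissued, so the success rate
# declines as the issued fraction grows. All parameters are invented.

def attempt_success_rate(issued: int, total_supply: int,
                         attempts: int = 100_000, seed: int = 42) -> float:
    rng = random.Random(seed)
    # Addresses [0, issued) are taken; picking one of those fails.
    wins = sum(1 for _ in range(attempts) if rng.randrange(total_supply) >= issued)
    return wins / attempts

# With 30% of the supply issued, roughly 70% of attempts succeed:
rate = attempt_success_rate(issued=300, total_supply=1000)
print(round(rate, 2))
```

The linear decline is the point: the closer the supply is to fully issued, the scarcer new coins become, which is the scarcity pressure the post says should lift the coin’s fiat value.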

And a few more dynamics are there too.

As you can see, it is a very dynamic economic system, and a number of balancing effects occur that cannot/do not exist in that USSR example. Which, BTW, was not as simple as that: production still cost the same as the sale price, since everything else was kept fixed, and price controls were not the real reason their economics failed.

Obviously the algorithms for put cost and farming rate have to be reasonable; they don’t have to be perfect.


do you by any chance know, what is going to happen if users send data to each other (as shown in the very early “lifestuff”-demo-video)?

Also lets not forget SAFE coin value :slight_smile:

If you want to send a video to a friend then you send the datamap (or a link to the datamap) of the video. You cannot copy a file into another person’s account; what is copied is the datamap, and the chunks are not put again.

Messages use SD objects for small messages and datamaps for large data/messages. SDs cost nothing to rewrite.
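A toy sketch of why sharing is effectively free: a datamap is just metadata (chunk addresses plus whatever is needed to decrypt them), so sending it to a friend copies a few bytes and triggers no new chunk puts. The field and function names here are invented for illustration and do not mirror SAFE’s real structures.

```python
from dataclasses import dataclass, field

@dataclass
class DataMap:
    """Invented stand-in for a datamap: metadata only, no chunk bytes."""
    file_name: str
    chunk_addresses: list  # where the (already stored) chunks live

@dataclass
class Account:
    inbox: list = field(default_factory=list)

def share(sender_map: DataMap, recipient: Account, network_puts: list) -> None:
    # Only the small datamap travels; nothing is appended to network_puts
    # because the chunks themselves are untouched.
    recipient.inbox.append(sender_map)

puts = []
friend = Account()
video = DataMap("holiday.mp4", ["addr1", "addr2", "addr3"])
share(video, friend, puts)
print(len(friend.inbox), len(puts))  # 1 datamap delivered, 0 new chunk puts
```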

Also I believe that archive nodes need to be figured out before we can claim long-term viability of the Network…


so, just to clarify this, practically giving someone access to a file won’t cost additional safecoins? neither the sender, nor the receiver.


Only the very first time you send messages. And that is to create the messaging SD for your inbox/outbox.

It will cost 1 put value. You only need a coin if your account balance of puts drops to zero.

So if you regularly send datamaps then typically it will cost no puts or coins (only the first message ever).

Now if you bundle up a number of datamaps into a file and store it, you pay the put cost of the chunks needed to do that, and you can send the datamap for that file for free.
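The cost rules described in the last few posts can be tallied in a small sketch: the first message ever pays one put (to create the messaging SD), later datamap sends are free, and storing a bundle of datamaps pays one put per chunk. The class and its costs are illustrative assumptions, not real API or pricing.

```python
# Toy accounting of the messaging costs described above. Invented names;
# real put pricing on the network is determined by its own algorithms.

class MessagingAccount:
    def __init__(self, put_balance: int):
        self.put_balance = put_balance
        self.has_message_sd = False  # created on the very first send

    def send_datamap(self) -> int:
        """Return the puts charged for this send."""
        if not self.has_message_sd:
            self.has_message_sd = True
            self.put_balance -= 1  # one-time cost: create the messaging SD
            return 1
        return 0  # rewriting the SD with a new datamap is free

    def store_bundle(self, chunk_count: int) -> int:
        """Storing a file of bundled datamaps pays per chunk."""
        self.put_balance -= chunk_count
        return chunk_count

acct = MessagingAccount(put_balance=10)
costs = [acct.send_datamap() for _ in range(3)]
print(costs, acct.put_balance)  # first send costs 1 put, the rest are free
```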

Have I confused you further?

tl;dr send datamaps via messaging is free


How effective do you think that would end up being? (just looking for you to expand on what you’re thinking about here)

I am thinking of a library of, say, old movies I love. Or books, or cat vids.

And I didn’t want to spend the time to send 100 messages. It might cost 0.0001 dollars to make that library. Maybe only a 1000th of the cost, too.


Convenience fee. Got it.


But thinking about it, it’s really just a sub-directory rather than a file with a datamap of the sub-directory. I gather it costs something to make directory structures too - about the same as a file.

The advantage of the library of datamaps stored in a file is that I can share just the library without worrying if I added a file I don’t want to share with a friend.

But a disadvantage is that if it’s ImmutableData then you can’t remove one if it’s outdated or you otherwise don’t wish to share it in the future.

Also, I remember that @dirvine said that it would not cost anything to rearrange files in or out of a directory structure as long as they are on the Network already. He didn’t say how in technical terms, but that was the goal.
