An alternative economic system for the SAFE network

I’ve been thinking about the incentive structure and economics behind the SAFE network, and how it is powered by continuous NEW data being added to pay for the upkeep of existing data. It all seems a little like a pyramid to me, and a little unsettling. My friend agreed and came up with a suggestion for how the SAFE network COULD be structured, economics-wise.

The following is my friend’s proposal for how the SAFE Network’s economic model should function. The network purposefully charges for GET requests to avoid perverse incentives, i.e., the cost of network bandwidth will not be spread across all data equally or paid through upload costs. It also allows unwanted data to be removed from the network when there is both an absence of demand for the resource and no-one willing to subsidise continued availability. This looks to be a much more efficient and sustainable economic model, which allows SAFE to retain many of its technical and design advantages while creating a more direct and sensible market for resources between users, uploaders and farmers.

Farmers set two fee rates. The first is the KEEP Fee; it pays for availability, uptime and storage costs. KEEP Fees are paid directly to the network and managed by it; they are spent as the farmers’ “wage” to keep farmers reliably online and keep data “alive” (available, stored). The second is the GET Fee; it is paid by users to farmers (in a semi-direct manner) in exchange for bandwidth and so on. GET Fees are initially paid to the network and only withheld if farmers refuse to allow access to data, do not provide the requested bandwidth, or provide unacceptable bandwidth quality. There can be a GET Fee market where users pay more for higher bandwidth (i.e., download speeds).
When fetching data, users pay their GET Fee to farmers, plus an attached KEEP Fee. The KEEP Fees paid to the network accumulate over time and are released slowly by the network to farmers according to the fee rate they have set. As a result of the gradual release of the built-up KEEP Fees to the farmers, they subsidise both current and future demand for data availability.
When data is no longer requested from the network, the built up KEEP Fees that were paid to the network will be slowly expended, thus there will come a point when farmers are no longer paid to keep data available. When this occurs, there will be data “death” (deletion by farmers).
Uploaders will pay a modified GET Fee for the bandwidth required to upload data, plus a larger KEEP Fee to cover initial write costs and immediate data availability. They can choose to pay a KEEP Fee as large as they want; a large KEEP Fee will keep the data available for a prolonged period in the absence of any GET requests, and is thus useful for long-term personal file storage. Finally, note that anyone can donate extra KEEP Fees to the network for any given resource.
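The proposed KEEP-fee lifecycle can be sketched as a toy simulation. Everything here (function name, fee amounts, the daily drain model) is an illustrative assumption, not part of the proposal: each GET tops up a per-resource KEEP balance, the network drains it at the farmers’ wage rate, and the data “dies” when the balance is exhausted.

```python
# Toy model of the proposed KEEP-fee mechanics (all numbers are assumptions).

def keep_fee_lifetime(initial_keep, keep_per_get, gets_per_day, wage_per_day, days):
    """Return the day the KEEP balance runs out, or None if the data survives."""
    balance = initial_keep
    for day in range(1, days + 1):
        balance += gets_per_day[day - 1] * keep_per_get  # KEEP attached to GETs
        balance -= wage_per_day                          # farmers' wage drain
        if balance <= 0:
            return day  # data "death": farmers are no longer paid to store it
    return None

# Popular data: steady demand keeps topping up the balance, so it never dies.
popular = keep_fee_lifetime(100, keep_per_get=1, gets_per_day=[10] * 365,
                            wage_per_day=5, days=365)

# Abandoned data: no GETs, so only the uploader's initial KEEP fee sustains it.
abandoned = keep_fee_lifetime(100, keep_per_get=1, gets_per_day=[0] * 365,
                              wage_per_day=5, days=365)
print(popular, abandoned)  # None 20
```

Note how a larger initial KEEP fee directly extends the lifetime of unrequested data, which is the proposal’s mechanism for long-term personal storage.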

A lot of specifics are glossed over here (for example, we may want to keep the total sum of KEEP Fees paid to the network for any given resource unknown to farmers), but most of the considerations not covered here should work no differently from the way the SAFE Network is currently meant to do things.

This would allow truly useless data to not be stored on the network, making it more efficient, since it doesn’t have to be perpetually up. It also allows data that people want to access to stay up, meaning it remains censorship resistant.

I don’t know if there are specific reasons to have the SAFE network the way it is, so these are just a couple of thoughts.


There is something to be said about data being worth keeping, even if no one accesses it for X number of years or decades. Take a look at the Internet Archive project. Is everything it has archived necessary, and has all of it been accessed at some point or another? I highly doubt it. However, to some extent, this is our recorded history; this is an account of who we are at this point in civilization, for good or bad. Our data tells a story which goes far beyond the costs we may pay to keep it. I personally think it is a necessity to maintain this data if at all possible.

As far as the economics goes, I don’t see a problem with the current model, however nebulous it is at the moment. Storage costs will continue to shrink, more devices will continue to come online, and people still have to pay to upload data. If a lack of storage becomes an issue, prices will go up, more farmers will come online, and it will return to equilibrium.

It seems you may have a concern for farmers staying online. They will continue to get paid as the section takes on more data, and if the section gets too full, it will split, and more section storage space will become available. There should be little time where farmers aren’t capable of receiving any kickback.


My concern is that the cost of upkeeping existing data is continuous, while the cost of uploading it is a one-time deal. That means that for the network to run smoothly, you’d need an ever-growing amount of new data being added constantly, forever. Which is a fair enough assumption based on current trends, but it feels very unsettling, because it isn’t a one-to-one relationship and feels very pyramid-y.

Also, let’s say you upload something and then decide it’s dumb and don’t want it there. In my friend’s described model, it will soon go away and stop existing, because only I pay to upkeep this data. It’s good for me, because dumb stuff about me isn’t out there, and it’s good for the network, because it isn’t congested with data no one will use.

And if someone out there wants to pay archiving costs, they still can.

It isn’t necessarily the lack of storage for NEW stuff, more so the lack of storage for OLD stuff.


It’s great to hear of alternative economic ideas for the Network, so thanks for taking the time to think this all through.

What I would say, though, is that there are fundamentals of the Network that make it what it is. The economy is designed around supporting these fundamentals, not the other way around.

There are a couple of aspects of this proposal that fail those fundamentals and therefore make it something else; maybe something interesting, but not the Safe Network. That is:

  1. Perpetual public data. How we make sure public data is always accessible, regardless of how valuable it seems at the time it’s uploaded, or how often it is requested.
  2. Data being accessible to anyone anywhere. So that means regardless of their financial means, or their ability to access Safe Network Tokens.

BTW, I’m not saying here that the fundamentals can’t/shouldn’t be debated—that’s all good—it’s just that I don’t think it’s good to start from an economic standpoint to argue that. They should be debated as a separate objective, good or bad, and then the economic design should be made around supporting them.


I mean, I would think that these are the things that make SAFE what it is:

  1. Censorship resistance
  2. Enhanced privacy
  3. A new paradigm of web applications: DAPPS

These aren’t goals but rather things I enjoy about the network and what makes it sound good to me.

And I actually do have some problems with perpetual data. Namely, data that literally no one accesses would needlessly congest the network, and data that I uploaded but no longer wish to exist would still exist.

Perhaps what I’m describing isn’t the SAFE network.


We do have a handy list of the fundamentals and objectives here :grinning:


It is currently proposed that private data can be deleted. Public/published data cannot be deleted though, as once it is in the wild, it is no longer yours. It becomes data shared by all network users.


The other thing to remember here is that even for private data, you yourself might not know what is valuable until years down the line.

Sure, give people the option to archive, or to delete if they must, but building it into the model by necessity—via storage quotas, ongoing costs—effectively forcing deletes, seems a shame.

Also, when it comes to public data, any archaeologist will tell you the merits of preserving, for future generations, the throwaway data of today… and also that the items that are never accessed are often the most prized.


There are also plenty of pay per this or that options out there. Personally, I think the pay once model and free reads are fundamental to the core objectives.

I feel that too much emphasis is placed on the need for incentives too. Many p2p systems survive on little or no incentives to varying degrees. Some folks just like to help out. If they get a few quid a month out of it too, it’s a bonus, but may not be their driver.

For big farming setups, clearly it has to be profitable. I think it will do them no harm to compete with spare resources of hobbyists though!


There’s a dual misconception at the root of our intuitive idea that storing data forever isn’t sustainable IMO.

Jim has mentioned that we undervalue data, and would delete things that later would be found valuable - there are lots of examples of this. We cannot know what to keep and what to delete over a timescale of a few years, let alone forever.

The second point is that we overestimate the cost of storing this. The reason is that storage costs decline rapidly over time, meaning that what seems costly to store today rapidly reduces in cost the longer we keep it.
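A rough back-of-the-envelope calculation shows why this matters: if the annual cost of keeping a fixed amount of data falls by a constant fraction each year (the decline rate here is an assumed illustration, not a forecast), the total cost of storing it forever is a finite geometric series.

```python
# Why "forever" has a finite price tag when storage costs keep falling.

def perpetual_cost(first_year_cost, annual_decline):
    """Total cost of storing data forever, as a geometric series:
    c + c*(1-d) + c*(1-d)^2 + ... = c / d  (for 0 < d < 1)."""
    return first_year_cost / annual_decline

# If storing some data costs $20 in year one and per-byte storage costs
# fall ~25% per year, storing it *forever* costs about 4x the first year.
print(perpetual_cost(20, 0.25))  # 80.0
```

So under this (assumed) decline rate, an upload fee of a few multiples of the first year’s storage cost can fund the data indefinitely.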


It’s also true that the size of storage required for a particular file can reduce over time too.

And then there’s the notion that providing perpetual storage, with a specified amount of redundancy, might reduce the requirement for multiple duplicates offline, or in other services/locations too.

For example, I have my photos backed up to an online service I pay monthly for (it’s not cheap), plus I have several backups on hard drives stored at home (and should really have them at another location too). If I could have reliable perpetual storage, it would be a no-brainer for me to switch from the monthly payment, and I’d likely free up several hard drives at home too.


Can you expand on that, please Jim?
Do you mean that there will be fewer copies stored or that new more efficient compression algorithms can be expected to be available?
Or am I missing something obvious?

Yeah I was thinking this. Plus more efficient redundancy schemes etc.
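To illustrate what “more efficient redundancy schemes” can buy, here is a small comparison (the parameters are assumptions for illustration) of plain replication versus a Reed-Solomon-style (k, m) erasure code, where k data pieces plus m parity pieces tolerate the loss of any m pieces.

```python
# Storage overhead of two redundancy schemes with equal fault tolerance.

def replication_overhead(copies):
    """n full copies cost n times the data size."""
    return float(copies)

def erasure_overhead(k, m):
    """(k, m) erasure code: k data pieces + m parity pieces, any m may be lost."""
    return (k + m) / k

# Tolerating 4 lost hosts: 5x replication vs an (8, 4) erasure code.
print(replication_overhead(5))  # 5.0
print(erasure_overhead(8, 4))   # 1.5 -- same losses tolerated, far less storage
```

Replication still wins on read simplicity (any single copy suffices), which is one reason a network might start with replication and migrate cold data to erasure coding later.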


There are things like temp files used in word processors, which are not needed once the file is saved.

There are work-in-progress files that are typically deleted once the work reaches a certain point, since all of their data is contained in the “final” file. It’s up to the person to decide, and that is one of the main reasons we need to allow private data to be capable of being deleted. It is truly wasted space.

Public data of course can become valuable years later, like a song written 10 years ago gets discovered and becomes a hit. Not valuable for 10 years and then suddenly a million views per day.


From what people have been saying, perpetual data is a philosophical commitment, not a technical necessity. If that’s true, and if SAFE gets popular, it would probably be forked to a different economic model. I share your scepticism about the planned economic model, not because it won’t work, but because it reduces the potential size of the network and potentially leaves it open to a fork.

Any network has two forces pulling network participants in opposite directions:

  1. The size of the network effect pulling in.
  2. Forks or other alternatives pulling away.

The danger SAFE faces, I think, is that it may be forked to a less opinionated vision. This would mean that application developers, users and farmers are able to make decisions about perpetual data and free access themselves. If network participants can make more decisions themselves, then they are more likely to join. On the other hand, perhaps first-mover advantage means that SAFE can successfully defend against these fork attacks through a superior network effect.

We can argue about perpetual data and free access all day long, but we don’t decide what happens to the (open source) code and how it’s used. If most people want perpetual data and free access then fair enough, but if most people don’t, then SAFE may be forked into a new network without it.


Just to note, forking a storage network is not the same as or as easy as forking a blockchain (or an app etc). The larger Safe becomes as a source of valuable data the harder it becomes to fork because of the amount of data needed to be replicated.


I raised (and I am sure many others raised) similar concerns 1-2 years ago. Although they weren’t received warmly, the good news is that once we have the “main” network working (elders + payments), we can build a new layer on top of it with different/additional functionality, like the “KEEP” fees that you describe.


It isn’t only a philosophical commitment. Technically, it would be challenging to change this behaviour too.

To elucidate, the Safe Network doesn’t really store ‘files’, it stores ‘chunks’. Different files may be composed of the same chunks. These files may be uploaded by different people at different times. This is the basis of de-duplication, which is a natural benefit of chunking. As there can be many files sharing the same chunk, it also removes the direct link between chunk hosting and file hosting. In short, you can’t resolve which file is being stored or hosted based on the chunk alone.

Therefore, chunks must be immutable. The design requires it. If you start restricting access to chunks, then various files will potentially become inaccessible, which may be owned by different users. As the client retrieves the chunks directly and re-assembles them into the file it desires, the hosts can’t restrict on a file basis either. This is by design, for security and anonymity.

As soon as you start trying to rent data storage, chunking must go out the window. Hosts would also then know what files are being stored, removing plausible deniability for the host and opening the doorway to censorship. Likewise, the client file (not just chunk) requests could be tracked, removing their plausible deniability.
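The chunking and de-duplication argument above can be made concrete with a toy sketch of content-addressed storage (this is an illustrative simplification, not the Network’s actual self-encryption scheme): files split into chunks keyed by their hash, so identical chunks from different files are stored only once, and a stored chunk alone cannot be traced back to any one file.

```python
# Toy content-addressed chunk store (illustrative; not the real self-encryption).
import hashlib

CHUNK_SIZE = 4  # tiny for illustration; real chunks are much larger
store = {}      # chunk hash -> chunk bytes (the "network's" view of the data)

def put(data: bytes):
    """Chunk a file, store each chunk under its hash, return the hash list."""
    hashes = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        h = hashlib.sha256(chunk).hexdigest()
        store[h] = chunk  # duplicate chunks are stored only once
        hashes.append(h)
    return hashes

def get(hashes):
    """Reassemble a file client-side from its chunk hashes."""
    return b"".join(store[h] for h in hashes)

# Two different files that happen to share the chunk b"BBBB".
file_a = put(b"AAAABBBBCCCC")
file_b = put(b"XXXXBBBBYYYY")
assert get(file_a) == b"AAAABBBBCCCC"
print(len(store))  # 5 unique chunks stored, not 6
```

Deleting the shared chunk to “remove” file_a would silently break file_b as well, which is exactly why per-file deletion or rental doesn’t map cleanly onto a de-duplicated chunk store.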

So yes, someone could create a fork, but it would be completely different from the Safe Network if they want to start charging rental fees for data storage.


It is, and projects need to have such deep commitments and make them clear. The technology builds around those commitments in hopefully the most efficient and simple way. It’s like building a car and having folk fork it into a boat: it won’t be easy, but with enough effort everything is possible.

Nobody needs to fork Safe to have pay-as-you-go; there are Storj, Filecoin and others who do that. I am not sure of all their fundamentals, but on this point Safe differs for sure.


Perpetual data is a technical necessity.

The only way to promise ‘you will own your data’ is for it to be perpetual. If the network is deciding whether the data lives or dies then it’s not yours any more.

If farmers are able to make decisions about data, do uploaders still own the data, or do farmers?

This is a great document and I know many will skip the click so I’m going to paste some relevant quotes from it here.

The very first objective is

Allow anyone to have unrestricted access to public data: all of humanity’s information, available to all of humanity.

It’s going to be hard to have pay-for-GET with such a clear opening statement like this.

Number 8 says

Store data in perpetuity

All public/published data on the Network will be immutable and available on the Network in perpetuity. In exactly the same way as the Internet Archive stores versions of websites that were published with mistakes, it will be impossible to delete any data from the Network after it has been uploaded. That does not mean that you won’t be able to change data - you will be able to make append-only changes, i.e. historic, earlier versions of data will always remain stored on the Network (whether they are accessible or not).

Number 15 says

Not have servers

The Safe Network will never rely on servers (as the term is traditionally understood) as to do so introduces a third party weakness that undermines the entire Network.

I’m cautious not to become a fundamentalist (!) but if we look too much at the ‘transacting’ part of the economy we’ll be missing a huge part of what makes the network valuable.