Safenetwork sustainability concerns - Bandwidth has an ongoing cost, however Safenetwork is a pay-once, benefit-forever model

Fair enough.

I still feel that the only thing that could be *needed* is a temp storage option when uploading. That way the person decides which private data is temporary. Personally, based on history, industry indicators, and research projects, I believe that pay once, store forever is sustainable for all the reasons I gave above, and that deletion is unneeded.

I also feel that rewards for attracting various groups to the network, and for encouraging them to store data/files/comments/etc, are very appropriate. Paying app developers when people use their apps, paying content providers for uploading good content that people want to view/download, and paying core code developers are all possible and sustainable. These rewards are a fraction of the amounts farmers get and depend upon people using each individual contribution, so the better the contributions they make, the more rewards they earn.

And of course their rewards will become scarcer as the available coin becomes scarcer, and as we’ve seen, when a desirable thing becomes scarce its price goes up. So even though at times the rewards become scarcer, their actual monetary value doesn’t reduce, and hopefully the same will happen with safecoin.

EDIT: I also see the payment for uploads not as paying for the initial storage but as paying for the lifetime storage. The actual average incremental cost for farmers to store a chunk will reduce over time, so the coin you paid for uploading is consumed, say, 1/2 in the first year, 1/4 in the second, 1/8 in the third, and so on. In effect there is always a little left to pay for the next year. It’s the old maths problem of the frog that jumps half the remaining distance to the wall each time and so never reaches it; likewise, the coin used for uploading is never fully consumed by actual incremental costs. The reduction might not be exactly one half each year for storage/bandwidth costs, but it will be close to that, and on average it could even be better.
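Here is a minimal sketch of that halving arithmetic (the 50% yearly reduction is my assumption from above, not a network rule):

```rust
// Sketch of the halving idea: an upload fee of 1.0 coin is notionally
// consumed at half the remaining balance each year, so a little always
// remains to cover the next year's incremental cost.
fn main() {
    let mut remaining = 1.0_f64; // the upload fee
    for year in 1..=10 {
        let consumed = remaining / 2.0; // half of what is left
        remaining -= consumed;
        println!("year {:2}: consumed {:.6}, remaining {:.6}", year, consumed, remaining);
    }
    // Total consumed after n years is 1 - (1/2)^n: it approaches the
    // full fee but never reaches it, just like the frog and the wall.
}
```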

3 Likes

Ok, maybe I’ve overstated a bit in my last post, I do admit, but MOST of the time you’ve just been defending the current state of the network. And I honestly think it definitely can be improved if we tried and got out of the mindset that the current state is perfect and need not change.

1 Like

OHHHH I just thought of another way. What if you made it such that when you pay to store things, the safecoin enters a smart contract, and it gets distributed like this: let’s say one safecoin is used. The first year it gives the farmers 0.5 coins, the second year 0.25 coins, the third year 0.125 coins, and so on. So every single year there will be some coins. It’s almost like the Bitcoin halving.
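A hypothetical sketch of that escrow idea (the `UploadEscrow` type and its payout rule are invented purely for illustration; nothing like this exists in the current design):

```rust
// Hypothetical escrow: the upload fee sits in a balance, and each year
// half of whatever remains is released to farmers, Bitcoin-halving style.
struct UploadEscrow {
    balance: f64,
}

impl UploadEscrow {
    fn new(fee: f64) -> Self {
        UploadEscrow { balance: fee }
    }

    /// Release this year's farmer payout: half of the remaining balance.
    fn yearly_payout(&mut self) -> f64 {
        let payout = self.balance / 2.0;
        self.balance -= payout;
        payout
    }
}

fn main() {
    let mut escrow = UploadEscrow::new(1.0); // one safecoin paid to store
    for year in 1..=5 {
        println!("year {}: farmers receive {:.4}", year, escrow.yearly_payout());
    }
    println!("still held in escrow: {:.4}", escrow.balance);
}
```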

(I made an edit to my previous post)

Also, we’ve had three years in this community talking about these issues, and many ideas along these lines have been thrown back and forth; rental was not favoured as a good option. This is one reason for the strong negativity from a few people towards the idea. It’s an old one that has been discussed previously and found wanting.

But yes, we need to consider new ideas, but remember the KISS principle. Keeping it simple has many benefits. So in general (yes, not always, but generally):

  • people understand it better
  • thus people are quicker to adopt it and remain
  • the code is much simpler
  • thus a smaller attack surface, and this is so important
  • less opportunity to game the system, or said another way, it is easier to make it less gameable. More complexity allows more edge cases to appear and multiplies the complexity of plugging the holes.

Even your desire to reduce the initial upload costs for rental recognises that the cost of storing for the initial period is not the cost that is charged. The cost of an upload includes the cost of forever storage, and the incremental cost of storage reduces over time, so charging the correct amount up front covers the lot. A pay-once, store-forever model also takes away the worry of having to maintain the data you stored: it becomes like storing data on my own hard drive, where I don’t pay rent on the data either. But it’s even better than owning the hard drive, because the data survives longer than a hard drive does.

And of course the cost to store that image you upload is going to be very, very small, in the realm of micropayments. Some estimate it will be around the cost of storing an image on a many-TB drive: roughly 1/4,000,000 of the cost of a $200 drive, which works out to $200 / 4,000,000 = $0.00005, or about 0.005 cents. And this cost will reduce year by year.

This assumes the farmers who farmed back then are still around to collect the 1/4s and 1/8s.

The farming reward algorithm in effect does this already; it just does it in an averaged way and pays up front. If farmers continue, they get more rewards as they serve up data.

This also saves on complexity, “smart contracts”, etc.

8 Likes

Ok, thanks for explaining a bit. I was quite bewildered to see some of the responses from the community so far. From my perspective I was simply bringing up a potential problem. If someone thinks it’s not a problem, simply defeat my arguments; there is no need to start calling me a troll etc…

Well, another way is to ask people to pay 10% upfront for initial costs and then 1% every month. I’m sure some people would still prefer that over being given only one option: pay 100% and store forever.
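Rough breakeven arithmetic for that split, assuming both percentages are fractions of the one-off pay-once price (my assumption; the exact basis isn’t pinned down above):

```rust
// Breakeven for "10% upfront, then 1% per month" versus paying 100% once.
// Both percentages are assumed to be fractions of the pay-once price.
fn main() {
    let pay_once = 100.0; // normalise the pay-once price to 100 units
    let upfront = 0.10 * pay_once;
    let monthly = 0.01 * pay_once;

    // Cumulative rental exceeds pay-once when 10 + m > 100, i.e. m > 90.
    let mut paid = upfront;
    let mut months = 0;
    while paid <= pay_once {
        paid += monthly;
        months += 1;
    }
    println!(
        "rental overtakes pay-once after {} months (~{:.1} years)",
        months,
        months as f64 / 12.0
    );
}
```

So under these assumed percentages, a renter only pays more than a pay-once uploader after roughly seven and a half years of storage.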

Another thing, and this is basically my argument anyway: what about all the ongoing costs of bandwidth? So the ongoing cost of bandwidth is not a huge concern, but the initial cost is? You can also say the initial cost is then outweighed by all the recurring payments that are going to come in later. So farmers may initially suffer a little, but then benefit more as time goes on.

With the current model, farmers initially benefit a lot. But as the network is free to browse and farmers have to pay to provide bandwidth to people over time, they may not benefit later. So actually, thinking about it, these two models complement each other very, very well!

No, I meant it as:

  • today it might cost (all-inclusive) x to incrementally handle one chunk
  • next year it is likely (averaging over every farmer, obviously) to be 1/2 x
  • the year after, 1/4 x

Now, the 1/2 might actually be 40% of x or 60% of x and vary slightly from year to year. And the next year or two might see a 10-times reduction in the cost of SSD storage, so sometimes the reduction could be even greater.

Now, I didn’t include the statistic that an average file is accessed much less year by year as it gets older. This isn’t about OS files but the files people store. Think of the DVD movies you or someone you know has bought: in the first 3 months you might watch one a few times, over the next 9 months you watch it less than in the first 3, and in the second year you watch it a lot less, if at all.

The point being that, averaged across all chunks, the older a chunk is, the much less it is read. This impacts bandwidth in a way that sees new data consuming most of the (usually free) bandwidth. So even for those who pay incrementally for bandwidth, the cost for a particular chunk reduces dramatically over time.

So if most farmers, due to economics, are on unlimited (or extremely high) quotas and their bandwidth cost is insignificant or zero, and only a few are paying incrementally, then that dramatic reduction over time in the already very small bandwidth cost of a chunk matters a great deal. It means a rental scheme would need to charge something like 80% of the normal cost up front just to cover the statistically expected bandwidth portion of an upload’s costs. In other words, it’s just not worth the effort and complexity of a rental system.

Also, if you look at industry and the consumer rental system for electrical/electronic goods, you see that the total rental charged ends up much more than paying up front. And the complexity of introducing rental into the code would cause a lot more processing, bandwidth, and disk activity, so the actual cost to store rented data could even be twice the cost of pay-once data. Pay-once sees disk activity just to store a chunk and whenever it is retrieved; rental sees daily checks on whether data is due to expire (daily disk activity for each chunk), plus bandwidth usage for every node in the section to agree on whether to delete it or not.
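To make that overhead concrete, here is a sketch of the kind of recurring expiry sweep a rental scheme would force on every node (all types and names here are hypothetical; pay-once storage has no equivalent recurring pass):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

// Hypothetical rental bookkeeping: every chunk carries an expiry time,
// and a recurring sweep must read that metadata for every chunk.
struct RentedChunk {
    id: [u8; 32],
    expires_at: u64, // unix seconds
}

fn daily_expiry_sweep(chunks: &[RentedChunk]) -> Vec<[u8; 32]> {
    let now = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap()
        .as_secs();
    chunks
        .iter()
        .filter(|c| c.expires_at <= now)
        // In a real section, nodes would also have to exchange messages
        // and reach agreement before any chunk is actually deleted.
        .map(|c| c.id)
        .collect()
}

fn main() {
    let chunks = vec![
        RentedChunk { id: [1; 32], expires_at: 0 },        // rent lapsed
        RentedChunk { id: [2; 32], expires_at: u64::MAX }, // paid up
    ];
    let expired = daily_expiry_sweep(&chunks);
    println!("{} of {} chunks due for deletion", expired.len(), chunks.len());
}
```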

The point is that rental has overheads that must be charged to the renter. In the computer industry it is something like 40% just to break even, and then the rental firm charges a profit on top. My guesstimate is that the network would have to charge something similar for the first couple of years.

Rental is a can of worms (coding, user experience, costs) that really makes it a cure worse than the perceived problem.

I honestly think you way overestimate this cost. By a lot.

Anyhow, farmers are paid to retrieve data, so that covers those costs even if they are small (or zero for most).

3 Likes

Well, it depends on how you code it. It could check monthly too. I don’t know how long it would take to program, but I think it’s good to have an option where you pay less upfront and then pay ongoing. It’s similar to phone plans: a lot of people choose them over paying 100% upfront. And remember, there will be people paying upfront too. So the resources and initial cost needed by the people who choose rental will be complemented by the people who pay upfront for forever data… that’s what I mean by the two complementing each other very nicely. Then, over time, the recurring fees will help the network expand faster than relying on new data being stored alone.

So basically, you would now have BOTH the existing stored data AND newly stored data returning coins to the network, and hence paying the farmers, as opposed to only newly stored data.

That means some people would get 59 days’ rental instead of 30. Remember, you are making every single chunk stored on the network subject to this rental, every MD subject to this rental. What of the xyz tokens worth hundreds of dollars that I sent you, which you were unaware of until I reminded you? But by the time I reminded you, those tokens had been deleted because you never paid the rent.

It is such a can of worms; the examples of problems caused by rental are endless. Every APP that stores MDs for you has rent that must be paid, and what of all the business cases where data is lost because the rental was missed (due to some misconception or glitch)?

We would then need companies to explain it all to people and businesses, and to sort out where they are liable to pay the rent and where the customer is.

What of the last will and testament that isn’t discovered until the personal papers are found detailing that the will is now stored on the SAFE network? But oh wait, 31 days have passed and it was deleted, because no one knew rental was required.

2 Likes

You’re assuming it’s a rental model now… It’s not. It’s still pay once for forever data. The rental model is simply an add-on to give people an additional option. When they store, they have to clearly choose it, and if they choose it then they should be able to handle the consequences.

Also, the time frame could be one year. So 10% for one year isn’t bad, plus an extra 10% for the initial first-time storage cost.

For the user that is already possible: just remove the datamap link from your list (directory) of files. That is an effective way of deleting. If the file is private, then no one can ever access it.
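A sketch of that “delete by forgetting” idea (the directory structure below is a hypothetical simplification, not the real SAFE types):

```rust
use std::collections::HashMap;

// A private file is only reachable through its data map, so dropping
// the directory entry that holds the map makes the still-stored chunks
// unrecoverable. Hypothetical simplification of the real structures.
type DataMap = Vec<[u8; 32]>; // chunk addresses needed to rebuild a file

fn main() {
    let mut directory: HashMap<String, DataMap> = HashMap::new();
    directory.insert("holiday.jpg".to_string(), vec![[7; 32], [8; 32]]);

    // "Deleting" the private file: drop the only reference to its map.
    directory.remove("holiday.jpg");

    // Without the data map there is no way to know which chunks to
    // fetch or how to decrypt them, so the file is effectively gone.
    assert!(directory.get("holiday.jpg").is_none());
    println!("entries left: {}", directory.len());
}
```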

3 Likes

Not continually (emphasis is mine). In my proposal, this is done only when a section is about to run out of space, which should never happen according to your reasoning about exponential growth.

2 Likes

Without the data map, how do you know which chunks to decrypt?

RSA/ECC etc. are probably not secure against quantum computing, but our chunks (AES internally + XOR of the previous chunk hash) are quantum resistant. There are a few other things in the mix, but you get the idea.

[edit: I should add that the session packets etc., where you keep the data maps, are quantum resistant when encrypted with AES-type ciphers. I am looking to use a method of private encryption similar to an EDDH capability but using quantum-resistant protocols all the way, so quantum proofing is well within our reach for all data]
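To illustrate the chunk-level idea above, here is a toy sketch of XOR obfuscation keyed on the previous chunk’s hash. This is NOT the actual self_encryption algorithm: the real scheme also AES-encrypts each chunk with keys derived from sibling chunk hashes, which is omitted here for brevity (requires the `sha2` crate):

```rust
use sha2::{Digest, Sha256}; // [dependencies] sha2 = "0.10"

// Toy illustration only: each chunk is XORed with a pad cycled from the
// hash of the previous chunk, so no chunk can be read in isolation.
fn obfuscate(chunks: &[Vec<u8>]) -> Vec<Vec<u8>> {
    let mut prev_hash = [0u8; 32]; // the first chunk has no predecessor
    chunks
        .iter()
        .map(|chunk| {
            let out: Vec<u8> = chunk
                .iter()
                .enumerate()
                .map(|(i, b)| *b ^ prev_hash[i % 32])
                .collect();
            prev_hash.copy_from_slice(&Sha256::digest(chunk));
            out
        })
        .collect()
}

fn main() {
    let chunks = vec![b"first chunk".to_vec(), b"second chunk".to_vec()];
    let obfuscated = obfuscate(&chunks);
    println!("{} chunks obfuscated", obfuscated.len());
}
```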

16 Likes

I’m assuming you still need a data map even if the data is small; otherwise multiple people would not be able to reference the same data. I’m sure someone who knows the details better can confirm, though.

3 Likes

The SAFE Network’s inspiration from nature is a significant aspect that initially drew my attention to this project years ago. However, the ever-growing nature of the data in the network does seem to be an exception to that, which (to my understanding) runs counter to other patterns we observe in nature. I think this merits consideration, and that we should welcome critical eyes—from both supporters and detractors. (I think even detractors can add positive value to this community and the SAFE project.) As a daily reader and member of this forum for over three years, I ask these questions as a supporter:

Regarding the principles and sustainability of not deleting data: Physicists still debate whether ‘information’ is ever really lost on a fundamental cosmic scale (i.e. are all past states of the universe theoretically derivable from its present state). However, I think the information stored in an autonomous network like SAFE might be more analogous to the information stored in DNA. And, certainly, DNA adds and loses information over time and over generations. One of my main draws to SAFE (and one of my main interests in general) is digital preservation—so I understand the desire for permanently preserved information. Yet, if a central component of biology is its ability to adapt and trim unneeded information over time, shouldn’t we ponder if SAFE (which takes much of its inspiration from biology) is contrary to one of biology’s fundamental characteristics?

(tl;dr: Does the SAFE Network’s unidirectional growth run contrary to biological evolutionary principles?)

Regarding the economic dependence on ever-increasing storage: Many in this community seem to share concerns about the current sustainability of the global economy. Indeed, it seems that many people in general are becoming interested in cryptocurrencies and decentralized projects largely because of their worries about the global economy. Although data storage capacity is increasing quickly now, and has been for decades, we have also had a relatively stable global economy during that time. In the event of a major economic downturn, there would seem to be significant risk that the pace of technological development might also be adversely affected (especially technology like storage, which relies on physical resources and growing economies). If the SAFE Network is seen, at least partly, as a defense against global economic risks, doesn’t it seem risky to make it so dependent on future technological/economic growth (growth that so many in communities like ours seem to doubt)?

(tl;dr: Does the SAFE Network’s economic viability depend on the stability and growth of the global economy?)

The success of the SAFE Network is very important to me, as I believe it is to most members here. Yet I do have to admit that these are two concerns I’ve had for a while. Naturally, I hope that my worries are either based on misunderstandings, or that these issues will be solvable.

7 Likes

Really hope SAFE bypasses ISPs. There is no reason we should have to pay a toll road to communicate in the future.

Especially one that lobbies against free speech, spies on us, censors (no net neutrality), steals our attention by promoting interruptions, loses our data, and makes us pay, through arbitrary caps, to see ads it is already profiting from. One that insists it must have arbitrary profits for its local monopolies, yet whose arbitrary ‘premiums’ give it incentives not to reach certain adequate thresholds. One that tries to suppress competitors who must use the public internet to reach customers in its region, and that holds the internet hostage by trying to force on people higher-priced bundles composed of redundant, obsolete products.

SAFE should enable cord cutting. Interference mesh, line-of-sight optical, and LiFi all enable this. Maybe someday the main net will consist of just handsets.

Step by step.

SAFE will be the network protocol. Others will have to use it and provide bandwidth outside of ISPs, with a model to sustain that. One day in the future we will have interfaces that are a set of entangled endpoints (maybe 100, maybe 1000), with the other ends being points across the globe. Now that is a distributed network.

3 Likes

Entangled points. Recent Advances in Post-Quantum Physics | Cosmos and History: The Journal of Natural and Social Philosophy

1 Like

This doesn’t exactly solve the problem… because why would people bother deleting anything? They may simply forget… UNLESS, of course, you attach a refund when they delete their data, proportional to the amount of time the data has been on the network. Then it would kinda solve part of the concern and could make the network grow a bit faster.
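One hypothetical reading of that refund, tied to the halving maths earlier in the thread: if half the remaining upload fee is notionally consumed each year, the unconsumed remainder could be returned on deletion, so the refund shrinks the longer the data has been stored (entirely illustrative; no such rule exists in the network design):

```rust
// Hypothetical refund rule: return the part of the upload fee not yet
// notionally consumed, assuming half the remainder is consumed per year.
fn refund_on_delete(fee_paid: f64, years_stored: u32) -> f64 {
    fee_paid * 0.5_f64.powi(years_stored as i32)
}

fn main() {
    for years in 0..=5 {
        println!(
            "deleted after {} years: refund {:.4} of a 1.0 fee",
            years,
            refund_on_delete(1.0, years)
        );
    }
}
```

The earlier the deletion, the bigger the refund, which is the incentive being suggested.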