Can an imbalance between PUTs and GETs cause trouble?

Fergish, thanks for the link to the wiki; I will see what I can find there. Of course, the subject is so important that one would think it deserves a coherent write-up in the form of an article.

Re the feasibility of paying for “keeping 1 GB of data for 1 year” and its compatibility with privacy and anonymity, my question still stands:

do you expect completely anonymous, untraceable payments to be possible on SAFE?

1 Like

If this were to happen, then yes, it’s a problem for the computing world. So many projects rely on future storage costs per GB greatly reducing as their data requirements grow. Many businesses will have budgeted for this, and for it to stabilise will spell trouble for their profit margins. It may not be a collapse, but it’s not a collapse for SAFE either. There are threads that go into depth on why not.

But I can guarantee that, unless there is some economic collapse, that is not happening for many years.

But yes, a stabilisation of storage costs per GB would cause some noticeable changes in PUT pricing to account for GET costs needing to rise.

Do not view it as a Ponzi scheme, but as averaging. The economics say that for every amount of data stored, the costs can be averaged over its so-called lifetime. So some data is well under-priced and some way over-priced. The dynamic nature of the pricing allows for adjustments (major & minor) to account for changes to that averaging. It’s like a club working out that for every life membership sold it will cost the club $X to service it. Some of those members get many times their money’s worth and others rarely use it.
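To make the averaging idea concrete, here is a minimal sketch (the function name and all the dollar figures are made up for illustration; they are not actual network parameters):

```python
# Toy illustration of "pay once, average over lifetime" pricing.
# All figures are made-up assumptions, not actual SAFE parameters.

def break_even_put_price(avg_lifetime_cost_per_gb: float, buffer: float = 0.0) -> float:
    """PUT price per GB that covers the *average* lifetime storage/GET cost."""
    return avg_lifetime_cost_per_gb * (1.0 + buffer)

# Suppose holding and serving one GB over its whole lifetime costs the farmers,
# averaged over all data, $0.02 (some data costs far more, most far less).
print(break_even_put_price(0.02))              # 0.02
print(break_even_put_price(0.02, buffer=0.1))  # 0.022, with a small safety margin
```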

As for SAFE storing data FOREVER, my guess (really a certainty) is that a new, better SAFE will be built long before then. I am no spring chicken now, but before I die I am sure that SAFE will either be rebuilt or evolve to suit the conditions present then.

I will try, but I can only do the same searches you can using the search button.

At this time there is not a way to do this and keep the current model for storage. The network does not time storage or know who owns the chunks. The datamaps that a user has tell the client s/w which chunks to collect to reconstruct the file. The network does not do the reconstruction or connect chunks together, so timed storage according to a user’s paid amount is not there. If it were built in, then it would link data to a user and reduce anonymity considerably.

That is the planned model. The weak points are exchanging to fiat, and the info exchanged when purchasing.

2 Likes

BTW, here’s the equation:

let V(t) be the amount of data stored at time t
let C*V(t) be the cost to all the farmers per day (C is the cost per GB per day)
let V'(t) be the first derivative of V(t) - this is how much data you store per day
let P*V'(t) be how much money external users pay per day to the network (no internal users in this model)

Then we have

P*V'(t) = C*V(t)

Okay shame on me, I’ve forgotten how to solve this :slight_smile:

1 Like

There are all sorts of scenarios, and even the simple ones aren’t that well understood (e.g. whether something would cause a drop or a rise in the price of a PUT).

If the number of PUTs drops, the cost of a PUT should drop too.
But you also assumed that demand for reading the data would go down (because of the crisis), and we can’t know whether that’s the case.

People (farmers or users?) leave: farmers leave, the cost of PUTs goes up, more farmers join in. That’s the theory. But maybe also: farmers leave, the cost of PUTs goes up more than people care to pay, so the use of SAFE drops. (That should make the cost of PUTs drop, but we don’t know which would drop faster: the cost of PUTs, the number of farmers, or the number of users. So I think we won’t know until the network is live with real coins and real files.)

1 Like

The cost of storage plummets by more than half every year. Maybe this helps.

EDIT: sorry if people already mentioned this, I didn’t read all the posts here yet -disclaimer-

Here are two topics that I found quickly; I am sure there was another, but I cannot find it at the moment.

Can I ask where you got the equation from?

BTW you integrate (over a day) to go from a slice in time to a daily figure

That equation assumes a lot (even when we fix it up). Firstly, it assumes constant rates, because that is what those equations equate. The 1st modification has to account for flux in PUTs due to cost changes. The 2nd modification has to account for the reduction in costs because drive costs will keep falling for the foreseeable future. The 3rd modification has to account for rising electricity costs (above whatever is used anyhow == often $0.00). The 4th modification is the cost to buy coins to use, because often PUTters are not (large enough) farmers. The 4th is important because humans are placing a worth on costs that the network translates into the cost of PUTting. [5th] … [6th] …
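One illustrative way to picture the first three modifications (just a sketch of the direction of the adjustments, not the network’s actual pricing algorithm) is to let the coefficients in the earlier equation vary with time:

P(t) * V'(t) = C(t) * V(t)
C(t) = c_drive(t) + c_elec(t)

where c_drive(t) keeps falling (cheaper drives, e.g. c_drive(t) = c0 * exp(-k * t)) and c_elec(t) may rise, though for spare-capacity farmers it is often effectively zero.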

SAFE economics is very dynamic and not reliant on constant-rate maths.

Let’s look at the case where people are PUTting lots of data day after day after day:
  • initially the PUT prices are reasonable
  • after a little while the price starts to increase a little
  • then more and more
  • farmers get sick of so few GETs being done while their vaults fill up
  • farmers start leaving
So what happens???
  • GET rewards increase until there are enough farmers again
  • and then the price of PUTs evens out or reduces

Remember that farmers earn the most when they are using spare capacity (== zero costs), so price increases do not affect a lot of the farmers.
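A toy simulation of that feedback loop, purely to show the direction of the adjustments (every constant and update rule below is an invented assumption, not the network’s real pricing algorithm):

```python
# Toy feedback loop: PUT price and farmer count adjusting to the fill level.
# The update rules and constants are illustrative assumptions only.

farmers = 1000             # vaults online
capacity_per_farmer = 100  # GB each
stored = 60_000            # GB currently stored
put_price = 1.0            # arbitrary units per GB
daily_puts = 500           # GB uploaded per day at price 1.0

for day in range(30):
    fill = stored / (farmers * capacity_per_farmer)
    # The network raises the PUT price as vaults fill and lowers it as space frees up.
    put_price *= 1.0 + 0.5 * (fill - 0.5)
    # Some farmers leave when vaults are nearly full; higher rewards
    # (proxied here by the price) attract new farmers.
    farmers += int(20 * (put_price - 1.0)) - int(30 * max(0.0, fill - 0.8))
    # Uploaders respond to price: fewer PUTs when PUTs are expensive.
    stored += daily_puts / put_price
    print(f"day {day:2d}: fill={fill:.2f} price={put_price:.2f} farmers={farmers}")
```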

2 Likes

Happy for you to propose a fix for it. :wink:

Well, the equation is something I just made up from my model: the cost to store 1 GB of data per day is constant - this is C; the price paid per 1 GB stored by external users is constant - this is P; the network accumulates no money inside it, nor do farmers spend their own money - all money paid by external clients immediately goes to system upkeep. There are no internal clients in the model.

Yes, that’s a lot of assumptions, yet I find the model useful for assessing long-term stability.

I believe the solution has the form of

V(t) = a * exp( b * t )

where a and b are constants. To me that says: in a world where the cost to farmers per GB per year is constant, the system can maintain a constant price per GB stored for external clients only under the condition of exponential growth of the data stored and the revenue collected per day.
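For completeness, substituting that form back into the earlier equation shows that b is just the cost-to-price ratio:

P * V'(t) = P * a * b * exp( b * t ) = C * a * exp( b * t )
=> b = C / P

so V(t) = a * exp( (C/P) * t ), i.e. the stored data (and with it the daily revenue P * V'(t)) has to grow exponentially for the books to balance.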

This is not the case. It is very much variable.

Variable too. Electricity costs are rising, for instance.
But often the cost is zero, because a lot of farmers will be storing their vaults on spare capacity that is powered on anyhow, and bandwidth is usually paid for anyhow (as long as they stay below limits, i.e. stop the vault at times when needed).

Then whose money do they spend when they buy their drives? Or are you saying that C is zero because they spend no money to store those GBs of data, since they use spare capacity?

It seems that you have not factored in the farmers who use spare capacity, for whom it costs nothing (no incremental cost over not running a vault) to run vaults. The network is designed for this usage and really favours it, because GET rewards are not expected to be profitable for those who need to spend (significantly) extra $$$ to run vaults.

1 Like

It’s a very big, complicated subject. There is a lot written on it. The wiki is not complete by any means, but you can also wade into the whitepapers, or the educational videos at safenetwork.org.

Yes, safecoin transactions can be anonymous. But the point is irrelevant to what you are proposing.

2 Likes
  • indeed SAFE can grow to a certain size on free capacity
  • beyond that people will have to buy extra hardware

Besides, there may be an opportunity cost here: people could choose to join FreeNET instead, or use that space for BitTorrent files.

…and the question which I wanted to answer is this: under the assumption of a constant cost per GB per year to farmers, can the system maintain a constant cost for consumers?

Likewise we can ask: again under the assumption of a constant cost per GB per year to farmers, how would the cost to clients behave should the amount of data put into the system daily remain constant?

V'(t) = K                   amount of data added per day
P(t) * V'(t) = C * V(t)     money paid by external clients at this moment in time

so solving this (starting from V(0) = 0, so V(t) = K * t):

P(t) * K = C * K * t
P(t) = C * t

ok - so in this case the price grows in a linear manner
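Both cases can also be checked mechanically; here is a small sympy sketch (the variable names are mine) that solves the two balance equations symbolically:

```python
# Symbolic check of both balance equations with sympy; names are illustrative.
import sympy as sp

t = sp.symbols("t", positive=True)
P, C, K = sp.symbols("P C K", positive=True)
V = sp.Function("V")

# Case 1: constant price P  ->  exponential growth of stored data.
case1 = sp.dsolve(sp.Eq(P * V(t).diff(t), C * V(t)), V(t))
print(case1)            # Eq(V(t), C1*exp(C*t/P))

# Case 2: constant inflow, V(t) = K*t  ->  the price must grow linearly.
Pt = sp.Function("Pt")
case2 = sp.solve(sp.Eq(Pt(t) * K, C * K * t), Pt(t))
print(case2)            # [C*t]
```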

What I mean is that in my model the average costs collectively incurred by the farmers are exactly compensated by the revenue in dollars that the network collectively earns.

[quote=“neo, post:29, topic:5544, full:true”]

If this were to happen, then yes it’s a problem for the computing world … But I can guarantee that unless there is some economic collapse that is not happening for many years[/quote]

Indeed. But who can guarantee that said economic collapse is not waiting around the corner? In my humble opinion, the goal of building an alternative internet which can only thrive in days of boom is not noble enough. A more noble goal would be to build an alternative internet able to survive the days of bust as well.

In fact, in days of bust there may be even more need for it. Workers will need to collaborate to protect their rights. Small-time earners will need every chance to pick up an extra penny. Some of the services available for free today may become unavailable because the big companies providing them run out of cash.

I would like to thank you again, Fergish, for the useful links. However, I cannot help being puzzled by your latest statement. How can anonymous transactions be irrelevant to the “pay per year per GB” topic? Suppose we upload a big file like a movie and associate a wallet with that movie. Then I anonymously make a payment into that wallet. Does that not mean that this movie can now be kept on SAFE for another year without it being linked to my identity? And next year I pay into that wallet again, and the movie stays online for another year?

All of your formulas assume that economics is the driving force for farming.

I don’t think it will be…

People will want to use SAFE because they want the security from it. They want their files to be backed up. They want their files to be in a place where hackers cannot get to them. They want confidence that they can reach their files from anywhere.

The cost of doing this is farming. You can either buy a RAID array for your PC at home, set it up, encrypt it, install firewalls, get a VPN to several remote sites to create additional copies, and pay somebody to monitor, repair and maintain those remote computers; OR you can use SAFE and accomplish the same thing without any of the messy setup and maintenance. Farming is the price you pay for all of that benefit.

And SAFEcoin is what balances it out if you need more than you provide or if you provide more than you need.

SAFEcoin is there to prevent abuse, not to make massive profit centers possible. It’s a cooperative; it should break even in the end.

1 Like

[quote=“jreighley, post:40, topic:5544, full:true”]People will want to use SAFE because they want the security from it. They want their files to be backed up. They want their files to be in a place where hackers cannot get to them. They want confidence that they can reach their files from anywhere … It’s a cooperative; it should break even in the end.
[/quote]
Hey mate, you’re reading my mind. That is exactly how I want it to be. But for it to become reality, I feel it necessary to consider all possible cases of system abuse first.

Indeed, my models above are for the relationship with external clients. If I manage to build a model for internal clients, I’ll post it here as well.

Right now I can think of one scenario of internal abuse:

  • I’m a mad photographer
  • I farm, I abide by the rules, and do anything it takes to earn an amount of safecoins
  • once I have them, I immediately use them to upload lots of rubbish into the network
  • then destroy my vault and never farm again

I think there may be a problem here. Whatever good I did for the network, it was for a limited time. However, the obligation the network has taken on, to keep my files, is forever… There is an imbalance here.

My gut feeling is that it needs to be apples for apples: you keep my files for a year, I keep yours. Making obligations across time (I keep a file this year, you keep mine next year) already looks a bit dangerous to me (intuitively). And an eternal obligation feels like a recipe for trouble.

Today I’ve started putting into formulas what exactly feels wrong to me about such an imbalance.

You turn off your computer and everyone else gets paid just a bit more to make up for it. It isn’t like storing 3 or 4 more chunks imposes a massive burden on them; their machines are already on, they are already connected. They just get paid more for what would have been idle time anyway.

Then the next rubbish dropper has to pay higher rates.

The cost of storage ought to be what it will cost the network to store your data forever. I don’t think that needs to be too cheap. I am sure there is a formula that will work out just fine and dandy… The price needs to be high enough to prevent abuse, but what you are suggesting isn’t really abuse. If you earned enough to store it, it’s stored.

Choosing to farm is like entering a lottery pool. If you play, you will win; if you don’t, you won’t…

2 Likes

I read somewhere that there will be a delete feature, and/or limited time exposure.

In private storage, it would make sense to have a delete feature.

Would it be feasible to delete your data and collect a portion of the coin back? Like in Valve’s Dota 2, where you can delete your cosmetic item. In return, you may get nothing, a treasure, an item, or rares. A lot of duplicates oversaturate the market, which causes users to delete them and get a return item of lesser, equal or greater value. This would be an awesome idea for safenet. But again, this might lead to abuse.

As I understand it, No.

The network doesn’t know which files are yours. It doesn’t know if only one person stored a file or 100,000 people stored it. So if you want it deleted, the network cannot delete it, lest it delete somebody else’s copy as well.

The same file will have the same chunks and the same hashes and therefore they would be routed to the same vaults.
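A tiny sketch of why that is (this only shows the content-addressing idea in general; it leaves out the actual self-encryption step): identical content always hashes to the same chunk names, so a second upload of the same file maps to chunks the network already holds.

```python
# Content-addressed chunking in miniature; illustrative only, not SAFE's actual code
# (real self-encryption also encrypts each chunk, which this sketch skips).
import hashlib

def chunk_names(data: bytes, chunk_size: int = 1024) -> list[str]:
    """Split data into fixed-size chunks and name each chunk by its hash."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    return [hashlib.sha256(c).hexdigest() for c in chunks]

movie = b"the same movie bytes" * 500
# Two independent uploads of the same file produce identical chunk names,
# so the network ends up holding each unique chunk only once (de-duplication).
assert chunk_names(movie) == chunk_names(movie)
```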

I think some of the structured data RFCs discuss deletion of structured data, but not typical files.

I do believe there has been some discussion of archive vaults that will hold the data that is rarely used… But it’s all discussion, I think.

2 Likes

Working with constants is not SAFE economics, so any answer calculated is not representative. And when you set the conditions for an outcome, then it’s no surprise that said outcome is what gets calculated.

I think you already know the answer, as you’ve said it already.

There will always be external conditions that would make SAFE fail. Pick an obvious one, like an economic collapse, which can in and of itself destroy the ability of the masses to use all higher tech and limit it to the (new?) elite. In this case SAFE fails because it relies on the masses, who now cannot supply the resources. It is very difficult to design a system for the masses using tech that will survive in its optimal state. If internet connectivity is still usable by the masses after such events, then SAFE will likely survive and work satisfactorily. PUT cost would be cheap if enough people can be farmers, and expensive if not. The rich would then be storing data and the farmers would gain some much-needed income. Who knows though; it’s guesswork, because there is no definition of how bad the collapse you suggest would be.

Unfortunately, if you make your own equations for how your model works, then it is going to be contrary to the SAFE model, and up to now it was unclear that you were proposing a completely separate model and asking what-if.

1 Like

No.

It works like this: think of your data, once encrypted and uploaded, as not really being data anymore (when is data not data…). All it is is chunks of encrypted something.

Those chunks can be de-duplicated so one chunk in your whole data file might be a chunk in someone else’s different data file. The network does not and cannot tell which pieces were de-duped. Permanently removing one of your files from the network is therefore impossible to implement securely.

In this case, without being able to tell the network to “drop” a chunk or chunks, there’s no way to “help” the network by “freeing up space” and therefore no way to earn any safecoins by doing so.

However there is the ability to delete a file. But not in the sense that you’re thinking of.

Again, when you upload your data to the network, it’s no longer data, it’s randomness. It’s noise. It’s chunks. Your data is no longer data.

For private data, the only thing that can pull all of those random chunks from the network and piece them together directly is the info in your datamap. If you delete the entry from your datamap, poof! Your data is gone… in the sense that it’s irretrievable.

Notice that the data is still being stored on the network, but it cannot be accessed as the whole file. This is what it means to “delete” your data. Nothing more, nothing less. (IIRC)

Note1: Public data is a bit more complicated as someone could potentially store the chunk/file info in their datamap for access later. In that sense, you may have deleted the data so that you cannot access it, but they still can. EDIT: Private Shares have this same attribute.

Note2: The only way to re-obtain that data once deleted is to re-submit it to the network. It may be possible to do this offline by computing the individual data chunks and reconstructing an entry to go in your datamap; in that case you would already have the data locally.
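To make the datamap idea concrete, here is a minimal toy model under my own simplified assumptions (a real datamap holds self-encryption keys and network addresses; this version just keeps plain chunk hashes, and “the network” is a dict):

```python
# Toy model of "delete = forget the datamap entry"; not SAFE's real data structures.
import hashlib

network = {}   # pretend DHT: chunk hash -> chunk bytes
datamap = {}   # my private map: file name -> ordered list of chunk hashes

def put_file(name: str, data: bytes, chunk_size: int = 1024) -> None:
    hashes = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        h = hashlib.sha256(chunk).hexdigest()
        network[h] = chunk          # chunks live on, unlabelled, in the network
        hashes.append(h)
    datamap[name] = hashes          # only I hold the recipe to reassemble the file

def get_file(name: str) -> bytes:
    return b"".join(network[h] for h in datamap[name])

put_file("diary.txt", b"very private text " * 200)
assert get_file("diary.txt").startswith(b"very private")

del datamap["diary.txt"]   # "delete": the chunks still sit in the network,
                           # but without the datamap entry I can no longer
                           # find or reassemble them.
```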

3 Likes

It’s not the “pay” part that is the problem. It’s the “per year per GB” part. As I said, it requires a better understanding of how data is stored and retrieved on the network to see why what you are proposing is unworkable.

When you store a file to the network, it is self-encrypted and sent out in “meaningless” (without the data map) chunks. Monitoring account managers check that you have paid for resources to store the file and authorize storage. Then those chunks go out and are stored with NO association or link to you or your account. Maintaining any such link would probably add several layers of complexity and burden to the system, and expose a bunch of security problems.

Anyone can retrieve any data stored on request. The point is that without the data map, no one knows the address of the data or the means to put it together with the other file chunks, or decrypt the file.

The cost to store data will not be super high, but it will be enough to discourage spam.

Believe me, I have a sense that there are other questions that could be asked about how this all balances and works out. But a time limit on storage isn’t practical as a means of averting the problem you’re looking at. It would, in itself, add a potentially bigger burden and lose the security-simplicity of the encryption/storage model.

Hope this helps you see what I’m talking about.

3 Likes