20 years is a completely inadequate design lifetime for something that’s trying to be the basis of “new Internet”. It’s “senseless”, to use your word.
Out here in the real Internet, we are still using IPv4 (in active use for over 35 years). We may just now be managing to really move to the “new” IPv6 (whose specification reached essentially final form over 20 years ago). We’re able to do that mostly because IPv6 doesn’t make any breaking changes in the services offered by the network layer… which would not be true for relaxing SAFE’s “eternity” assumption.
At layers above, we have TCP (35 years again, with some backward-compatible tweaks up to maybe 25 years ago), and HTTP (in use in its present form for over 20 years; almost 30 years if you count HTTP 1.0). DNS has been in operation for about 30 years as well. None of them are going away soon. If they do go away, it will again be because other protocols offer compatible drop-in replacements that don’t violate application assumptions. Immutability is an example of such an assumption.
Fundamental protocols live a long time… if they succeed. It’s amazingly shortsighted to think in terms of 20 years or even 40 years. Everything does not just magically get “migrated to a new system”.
For example, the existing Internet applications are not all going to migrate to SAFE in the next 10 years, if they ever do. IP, probably even IPv4, will absolutely have a 60 year widespread operational life span. Maybe longer. If SAFE gets where it supposedly wants to be, it will have a similar life span.
Also, MaidSafe was apparently founded in 2006. 14 years is a very long time to spend designing a protocol suite if you only expect it to have a 20 year working life.
Anyway, the whole point of the “a bit on every atom” reductio ad absurdum is that it is not achievable, ever, period, with any technology, magnetic, SSD, DNA, nanorods, fairy dust, or otherwise. The point is to show that even if you’re allowed absolutely any new technology that could ever be built, and even if you’re allowed technology that could never be built, you will still stall out in the reasonably foreseeable future. The particular number of years doesn’t matter much.
60 years happens to be what I came up with when I started with “a bit on every atom”, decided to apply that to every atom in a solid volume, and calculated how long it would take to get there at your stated rate. But I chose those assumptions because I didn’t want to leave any room at all for argument.
I could equally well have shared your reliance on the last 50 years or so, and taken the view that since every single storage technology so far has been fundamentally planar, you shouldn’t expect to stack layers more than say 1000 deep.
And the figure I was using (3x10^24 atoms in a 3ml volume) would allow for a “USB stick” made out of metallic hydrogen or something exotic like that. That’s why it’s higher than your number, and, yes, thanks, I do know how many atoms are in a USB stick. I could just as reasonably have assumed a less dense silicon lattice spacing.
Silicon spacing on 1000 planes of 3 square centimeters would have given you roughly 5.5x10^18 atoms to play with. Which would mean that you’d have run out of space in that USB stick in 30 years. Except, of course, that you still wouldn’t actually be able to achieve anything close to that density, because it would leave no space for the stuff that manipulates the atoms and moves the data.
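Since the assumptions are easy to lose track of, here’s the back-of-envelope arithmetic in runnable form. The doubling interval is back-derived from the ~2x10^6-in-32-years growth rate quoted in this thread, the spacing is the Si–Si bond length, and the 16TB starting drive is my own assumption:

```python
import math

# Assumed inputs for the back-of-envelope (not a physical model):
SI_SPACING = 0.235e-9   # Si-Si bond length, metres
AREA = 3e-4             # 3 square centimetres, in m^2
LAYERS = 1000           # generous stack depth for a planar technology

# Atoms available at silicon spacing on 1000 planes of 3 cm^2:
si_atoms = (AREA / SI_SPACING**2) * LAYERS
print(f"atoms at silicon spacing: {si_atoms:.1e}")  # ~5.4e18

# Doubling interval implied by "~2x10^6 increase over 32 years":
doubling_years = 32 / math.log2(2e6)  # ~1.5 years per doubling

def years_to_exhaust(atoms: float, bits_now: float = 16e12 * 8) -> float:
    """Years until a bits_now-bit drive reaches one bit per atom,
    doubling every doubling_years. The 16TB baseline is an assumption."""
    return doubling_years * math.log2(atoms / bits_now)

print(f"silicon-lattice wall: {years_to_exhaust(si_atoms):.0f} years")
print(f"bit-on-every-atom wall: {years_to_exhaust(3e24):.0f} years")
```

With these particular assumptions the two walls land in the low 20s and low 50s of years rather than exactly 30 and 60; moving the baseline drive size or the spacing shifts them by a decade either way, which is exactly why the precise number of years doesn’t matter much.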
These hypotheticals depend on a whole bunch of ass-pulled numbers, and I got to 60 years by choosing to ass-pull numbers that gave you every possible benefit of the doubt. The only thing I refused to do was to assume that some kind of completely unexplained (and frankly implausible) technology will show up and solve everything.
In reality you will not have 60 years, because you will not be storing a bit on every atom in a solid volume. You will not have 40 years. You probably will not even have 30 years. You may not have 20 years. And you will definitely see costs go way up well before you hit the final wall.
As for “1987: a large disk drive for a personal-type computer was 5MB; 32 years later, in 2019, a ~2x10^6 predicted increase means we should have 10TB drives. I bought a 16TB drive last year”:
I looked at it again, and the curve has been smoother than I thought. But the rate of improvement is still slowing down, especially on cost as opposed to pure drive size.
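Taking the quoted figures at face value (and assuming the 16TB drive dates to roughly 2019), the arithmetic does hold up, slightly better than the prediction, in fact:

```python
import math

# Quoted figures: a 5 MB drive in 1987, a 16 TB drive ~32 years later.
growth = 16e12 / 5e6        # actual capacity ratio: 3.2e6
predicted = 2e6             # the ratio quoted above
print(growth / predicted)   # 1.6 - the real curve ran slightly ahead

# Doubling time implied by the actual figures:
print(32 / math.log2(growth))  # ~1.5 years per doubling
```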
As for new technologies, since you seem to reject the technology-independent argument…
SSD is not at the beginning of its curve. SSD is just plain old VLSI integrated circuits (with die stacking). VLSI has most of its growth behind it. Flash is not going to get much denser before charges start tunneling out of cells. You’re not going to be able to stack chips much higher before they fry, or the interconnects become impossible, or they just plain fail mechanically. And there is no obvious better replacement.
DNA storage is, frankly, a gimmick. If you could get that to work, you could get other, better things to work. But those better things still wouldn’t get you to a bit on every atom, and neither, obviously, would DNA.
As for “controlling nuclei”, I can guess what paper you’re talking about: the “Nuclear Electric Resonance” one, right? If you think that’s going to lead to storing different bits on every atom, you really have no clue what it’s about. You might be able to put multiple bits on an atom… with millions of non-storing atoms around it to manipulate it and maintain its state. You won’t be able to put different bits on the next atom beside it, and the one beside that, and the one beside that, and read them back separately.
“Control a nucleus” in that headline means “line up the spin in the desired direction, while also lining up the spins of every atom for an enormous distance around it in the exact same direction”. It does not mean “separately set, maintain and read back separate distinguishable per-atom states, reliably, over any number of atoms”.
When you say “In 60 years we will be storing data inside of atoms, maybe 64 bits even sometime in the future, who knows in 60 years”, I can only read that to say that storage is going to keep getting denser and cheaper because you really, really want it to… not because you know any real way for that to happen. It’s magical thinking.
You have no idea how to actually do it, and no reason to believe it can be done, beyond a completely invalid extrapolation of an exponential curve. Exponentials always break down in physical systems, usually sooner rather than later.
The two concrete ideas you mentioned (DNA storage and the weird NMR business) show that you don’t begin to understand the limitations of the work you’re relying on.
As for “at least with SAFE there is only one copy of each unique dataset rather than potentially up to a billion copies of win7 and win10 being backed up by millions to billions of people on a regular basis”:
People have their own copies of data because they need to process those data. Your local disk has copies of stuff that you need to access quickly. You can’t just keep stuff you use constantly out on a widely distributed network. Not if you want to actually get anything done.
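For what it’s worth, the “one copy of each unique dataset” mechanism itself is real enough — content-addressed deduplication — even though it doesn’t answer the local-copy point above. Here’s a generic sketch (not SAFE’s actual self-encryption/chunking scheme; the class and method names are made up for illustration):

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical data is stored once,
    no matter how many users 'upload' it."""
    def __init__(self):
        self.chunks = {}  # content hash -> data

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self.chunks.setdefault(key, data)  # store only if not seen before
        return key

store = DedupStore()
iso = b"win10.iso contents..."
keys = {store.put(iso) for _ in range(1_000)}  # a thousand uploaders
print(len(keys), len(store.chunks))  # 1 1 - one copy stored, total
```

The same trick is what makes backup tools and object stores cheap at scale; it just doesn’t help with the copies people keep locally because they need to process the data.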