Storing blockchains on SafeNetwork

The bitcoin blockchain is now nearing ~320 GB of data. Sometimes there are just not enough peers seeding fast enough, so downloading a snapshot of the block data can make the synchronization process a lot quicker. Torrents on p2p networks get their speed from many individuals seeding chunks of files. In the same way, we could store blockchain data on the SafeNetwork and leverage the download speed of the many nodes sharing these files. Once you download the blocks for a given snapshot, your node can validate every block.
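As a minimal sketch of that local validation step (far from the full consensus rules), a node can check that each 80-byte block header's double-SHA-256 hash meets the difficulty target encoded in its nBits field. Here it is checked against Bitcoin's well-known genesis header:

```python
import hashlib

def block_hash(header: bytes) -> bytes:
    """Double SHA-256 of the 80-byte serialized block header."""
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

def target_from_bits(bits: int) -> int:
    """Expand the compact nBits encoding into the full 256-bit target."""
    exponent = bits >> 24
    mantissa = bits & 0x007FFFFF
    return mantissa << (8 * (exponent - 3))

def header_meets_target(header: bytes) -> bool:
    """True if the header's proof-of-work is valid (hash <= target)."""
    bits = int.from_bytes(header[72:76], "little")  # nBits field
    hash_int = int.from_bytes(block_hash(header), "little")
    return hash_int <= target_from_bits(bits)

# Bitcoin's genesis block header (80 bytes, a well-known constant):
# version, zero prev-hash, merkle root, time, nBits, nonce.
GENESIS = bytes.fromhex(
    "01000000" + "00" * 32
    + "3ba3edfd7a7b12b27ac72c3e67768f617fc81bc3888a51323a9fb8aa4b1e5e4a"
    + "29ab5f49" + "ffff001d" + "1dac2b7c"
)

print(header_meets_target(GENESIS))    # True
print(block_hash(GENESIS)[::-1].hex()) # the familiar genesis block hash
```

A real node checks far more (transactions, merkle roots, chain work), but the point stands: the data itself carries its proof, so where it was downloaded from does not matter.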

I also wonder if we could take it even further and allow a full node (except personal data like wallet.dat) to be stored as ImmutableData, and then mount this safe:// directory as a network storage device (read-only), so we no longer need to download the whole chain to our hard drive.
The node would only need to fetch bits of this data to query transactions and find a given balance (UTXOs).

Mount safe://blockchain/btc with a symbolic link to ~/safe_btc_blocks, where ~/private_bitcoin_files is a folder that contains your private files like wallet.dat (to separate private and public files).

My goal would be to be able to do something like this (on linux):
$ ./bitcoin-qt -blocksdir=~/safe_btc_blocks -datadir=~/private_bitcoin_files

Does anybody know if this would be technically possible? This could bring full nodes even to mobile devices, as they would only need to query certain blocks rather than store the whole chain locally.

And since these blocks would be public ImmutableData, many people can also verify them, and maybe sign (or ‘like’) them to show they are authentic and valid. And because the data is immutable, people can also trust that it cannot change anymore and is therefore SAFE to use :wink:

17 Likes

I suggested plugins 2 weeks ago. In short, Elders can run full bitcoin nodes on their local network, embedded as a plugin. The network can ask Elders to call the bitcoin plugin, which can run on a different machine owned by the Elder. Elders may or may not support such a plugin; it is voluntary, and each plugin call is paid with Safecoin.

A client can ask the network to run the bitcoin plugin directly via the native network API. The network picks 8 Elders at random and asks them to run that call. All 8 Elders return a result, the network signs those results with a network-owned private key, and the Elders are rewarded with coins paid by the client. The client then stores the result as public data on the SafeNetwork. Everyone can check that that public data was calculated by a bitcoin plugin call on 8 Elder machines, thanks to the network signature, so the data can be trusted. If all 8 results are the same, we can trust them; if there is a conflict, we can trust the result the majority returned.

This way we can have a realtime copy of the bitcoin blockchain, and we can also use these plugins for publishing transactions. We can implement a copy of any blockchain, and any logic on top of all the blockchains in the world: cross-chain atomic swaps, etc. Basically anything that any other blockchain software does. We could clone ETH and make it faster…
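The trust rule described above (accept if all 8 agree, otherwise accept the majority) can be sketched roughly like this; the plugin call itself and the network signature are assumed, not shown:

```python
from collections import Counter

def accept_result(results: list) -> tuple:
    """Majority vote over the results returned by the elders.
    Returns (result, unanimous); raises if there is no strict majority."""
    counts = Counter(results)
    winner, votes = counts.most_common(1)[0]
    if votes * 2 <= len(results):
        raise ValueError("no majority: results cannot be trusted")
    return winner, votes == len(results)

# 8 hypothetical elder responses for the same plugin call:
responses = ["blockhash_abc"] * 7 + ["blockhash_xyz"]
result, unanimous = accept_result(responses)
print(result, unanimous)  # blockhash_abc False
```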

6 Likes

I’ve been thinking about this myself. I think the tricky part will be adding blocks/transactions, which I haven’t given any thought to yet, but it should be soluble.

I’m not sure what the best way will turn out to be. There are already Blockchain wallets that operate a bit like this. I think Electrum is one, Edge wallet might be another. Or there might be a way of using the particular features of Safe data types.

3 Likes

This would be awesome because with this network consensus you can now easily host many blockchains on SafeNetwork and cross-chain atomic swaps become quite easy.

3 Likes

It will be interesting to see how the regular bitcoin app works over the SNFS mount. I don’t know anything about the characteristics of what the app does at start up or how it scans the blockchain data for specific entries, etc. It may be fine or it may be dog slow.

I suspect that a client tweaked to work directly with the safe network may resolve some issues that may crop up though.

It will be interesting to do some tests when we’re up and running.

There aren’t just the blocks; there are also the block indexes… I’m no expert, but my guess is the indexes are specific to the wallet transactions… but if the indexes can be stored on SN for the specific user, and if the BTC client itself can be WASM’d, then the whole shebang can be run from SN and secured by SN: a portable BTC wallet with nothing to carry around.

Not sure how WASM’d clients can communicate internally to SN though; it would require some client adaptation to SN.

I’m probably just dreamin’ here though. :wink:

It just occurred to me that one nice thing about storing blocks as public blobs is that everyone who tries it will get the same content hash, and thus the same address. So it will be very efficient to crowd-source the actual data. Then there can be multiple independent indexes for however one wants to interact with it.
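A minimal illustration of why crowd-sourcing works here, using SHA-256 as a stand-in for the network's actual content addressing (on SAFE the XOR address is likewise derived purely from the chunk contents):

```python
import hashlib

def content_address(data: bytes) -> str:
    # Stand-in for SAFE's content addressing: the address is a pure
    # function of the bytes, so identical uploads collide by design.
    return hashlib.sha256(data).hexdigest()

block = b"\x01" * 1_000  # pretend this is a serialized block

alice = content_address(block)
bob = content_address(block)  # independent upload of the same bytes
assert alice == bob           # same address: the network stores it once
```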

7 Likes

First off, I disagree with the premise that blocks cannot be served quickly by the Bitcoin network. Syncing the blockchain is also dependent on the hardware your node is running on, and in my experience that is the bottleneck.

I think using block data stored on SAFE, however, would be a good way of pruning your node. You would run your full node as usual. Then, once blocks have reached a depth you are comfortable with, you could check that the same data exists on SAFE and replace your local data with a symlink to the data on your mounted SAFE drive.
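A rough sketch of that pruning step, assuming the SAFE drive is already mounted at a local path; the file names are hypothetical, and SHA-256 stands in for whatever comparison you trust:

```python
import hashlib
import os

def prune_block(local_path: str, safe_path: str) -> bool:
    """Replace a local block file with a symlink to the SAFE-mounted
    copy, but only if the two files are byte-identical."""
    def digest(path: str) -> str:
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    if digest(local_path) != digest(safe_path):
        return False  # mismatch: keep the local copy, trust nothing
    os.remove(local_path)
    os.symlink(safe_path, local_path)
    return True
```

Nothing is deleted unless the SAFE copy matched the data your own full node already validated, which keeps this consistent with the eclipse-attack concern below.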

If you just blindly followed a chain stored on SAFE, I don’t think you would get all of the benefits of a full node. You need to be connected to the Bitcoin network and receiving blocks from different peers to ensure that you are on the longest chain. Blindly following a chain on SAFE would be like eclipse-attacking yourself (just trusting what one peer tells you the chain is).

1 Like

This sounds unbelievably exciting and could be an interesting marketing ‘hook’ for the roll-out comms to generate interest about possibilities and future potential.

5 Likes

In the plugin model, you have access to all the results returned by the plugins. So if there are multiple histories, all of them can be stored, and it is up to the client code to pick the one it wants. If calling 8 parallel plugins is not enough, it could be configured to run on 100s of nodes in parallel, and the results of all of them can be stored. A plugin call is a paid service, so the client can specify the number of plugins to run at once and pay each of them.

3 Likes

Regarding plugins, do you think this is similar to oracles? Like nodes on the blockchain that provide smart contracts with certain data?

I think oracles could be the first step for the SafeNetwork to slowly move to a fully decentralized computational framework, and I’d say writing AWS Lambda-like code would be the best option: small, simple functions that have a predefined computational timeout, where the user pays upfront and, depending on how long the function takes, is returned the remaining unused credit.

By serving many of these small functions to different nodes you obscure the larger process, so nodes are less likely to figure out sensitive data (the goal anyway is to allow nodes to do all these calculations in an encrypted manner, so this would eventually not be an issue).
Small code also has the benefit that many machines can easily take part in the job, and microservices scale incredibly well horizontally.
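The pay-upfront, refund-the-rest model could look roughly like this; the pricing unit and the timeout handling here are purely hypothetical:

```python
import time

PRICE_PER_SECOND = 10.0  # hypothetical Safecoin-denominated rate

def run_metered(func, args: tuple, credit: float, timeout: float):
    """Run a small function against a prepaid credit and a hard time
    budget; return the result plus the unused credit."""
    start = time.monotonic()
    result = func(*args)  # a real system would enforce the timeout itself
    elapsed = time.monotonic() - start
    if elapsed > timeout:
        raise TimeoutError("time budget exceeded; credit forfeited")
    cost = elapsed * PRICE_PER_SECOND
    return result, max(credit - cost, 0.0)

result, refund = run_metered(sum, ([1, 2, 3],), credit=5.0, timeout=1.0)
print(result)  # 6, with nearly all of the 5.0 credit refunded
```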

Really something to look into, decentralized microservices/computation for SAFE.

3 Likes

I suggested cardano stake pools run on SAFE.

They are 90% “decentralized” apparently but what they don’t tell you is many of the stake pool operators use AWS…

Just my two cents on what else can be run on SAFE.

6 Likes

If one ran, say, a bitcoin full node and wanted to prune, it would be trivial to choose to trust SAFE… imagine walking through the database, using a SAFE FFI function to hash and get the public blob xorurl for each block. You could easily build your own index directly into the SAFE network. You could then compare it to a specific version of an appendable index on SAFE and permanently trust that index/version, and if you wanted, you could store/verify a UTXO set at that height as a public immutable blob you could always trust. SAFE is an unbelievably efficient method of archiving blockchains.
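That index walk might look like the sketch below; `safe_xorurl_for` is a hypothetical stand-in for the SAFE FFI function, modeled here with SHA-256:

```python
import hashlib

def safe_xorurl_for(data: bytes) -> str:
    # Hypothetical stand-in: SAFE's FFI would derive the real xorurl,
    # but it is likewise a pure function of the content.
    return "safe://" + hashlib.sha256(data).hexdigest()

def build_index(blocks: list) -> dict:
    """Map block height -> content address, walked once over the local
    database; the resulting index is small enough to publish itself."""
    return {height: safe_xorurl_for(raw) for height, raw in enumerate(blocks)}

index = build_index([b"genesis-bytes", b"block-1-bytes", b"block-2-bytes"])
print(index[0])  # safe://<hash of the genesis block bytes>
```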

9 Likes

I think there is a big difference in how it is run. My proposal of plugins is technology agnostic. This means they can run any code, on any platform, on any computer, not necessarily on the same machine where a vault is running.

A vault owner (I proposed only Elders, since they can easily be punished for bad behavior) can configure their vault with a pair (Plugin_ID, host:port). Any client can then ask the network to run a plugin_id function, and the vault, when picked by the network, just redirects that request to the configured host:port. On that host:port runs a plugin that listens for such calls, does the calculation, and returns the result. This way there is zero code running on the vault itself. The vault owner can run plugins on other machines at home, or even somewhere remote, and is responsible for the quality of that calculation.

This makes it super simple to bring external data into the SafeNetwork. It is easy to write a plugin that returns blockchain data and publishes transactions, and easy to modify it or add plugins for every blockchain project. Such plugins can also be used to grab realtime stock data, weather data, etc. It is perfect for downloading public data. And since this code runs on multiple vaults at once, and the network signs all the results with a network-owned key, the client that requested the data can store and publish it as public data on the SafeNetwork. Others can then trust that data to be valid, thanks to the network signatures proving it was calculated via a parallel plugin call.

So my plugin proposal is not about computation; it is about a very simple but robust way to bring external data and worldwide services in as a native part of the network. This is not about closed computation or programs running on the vault machine. It is about bringing the whole world's data and services into the SafeNetwork in a decentralized and trusted way.

And the cool part is that plugins are far easier to implement than other computation solutions. Plugins do not run on vaults and do not endanger the vault software. The Safe Network does not need to care about the content; it just needs to store the list of active vaults supporting a plugin_id, call a random X of them, and, when all the results return, sign the calculation with the network-owned key. That is super easy to implement. Users will call those plugins, pay for their computation, and publish the data to the network themselves.

For example, say you need a working BTC chain. Someone paid for the first X blocks and stored them publicly. It is out of sync now, so you can make the remaining calls for the remaining blocks, pay for them, and upload the results as public data. If you do not do that, someone else will. If nobody does, then that data is not required and nobody needs it. The first person who needs it will start to grab it and upload it to the network. So it is even optimized by usage: only the data that people need, and are willing to pay for storage and calculation, will be grabbed.
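The bookkeeping the network itself would need here is indeed small: a registry of which vaults advertise a plugin_id, and a random pick of X of them. A toy sketch (the redirect to host:port and the signing step are stubbed out, and all names are hypothetical):

```python
import random

# plugin_id -> list of (vault_id, "host:port"), as configured by vault owners
REGISTRY = {
    "btc": [
        ("vault1", "10.0.0.1:9000"),
        ("vault2", "10.0.0.2:9000"),
        ("vault3", "10.0.0.3:9000"),
        ("vault4", "10.0.0.4:9000"),
    ],
}

def pick_vaults(plugin_id: str, x: int, rng=random):
    """Pick X random vaults that advertise support for plugin_id."""
    supporters = REGISTRY.get(plugin_id, [])
    if len(supporters) < x:
        raise LookupError(f"only {len(supporters)} vaults support {plugin_id!r}")
    return rng.sample(supporters, x)

chosen = pick_vaults("btc", 3)
# each chosen vault would now forward the call to its configured host:port,
# and the network would sign the collected results with its own key
```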

3 Likes

Presumably, the data oracles would be hosting could just be stored natively on safe network by an external process. This data could be of any size, large or small. Clearly, blockchains can’t store large data on chain due to scaling issues, but that isn’t an issue for safe network.

E.g. The Met Office could write weather data directly to safe network, signing the data for authenticity. Same for stock market prices by Reuters, etc. Any other app can then read this data natively from the network, including smart contracts.
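A sketch of such a signed feed, using an HMAC as a stand-in for the asymmetric signature a real publisher would use (sign with a private key, verify with a public one); the record fields are invented for illustration:

```python
import hashlib
import hmac
import json

SECRET = b"met-office-demo-key"  # stand-in for a real signing key

def publish(record: dict) -> dict:
    """Serialize a record deterministically and attach a signature tag."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "sig": tag}

def verify(blob: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(SECRET, blob["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, blob["sig"])

blob = publish({"station": "EGLL", "temp_c": 14.2})
assert verify(blob)  # any reader can confirm the data is untampered
```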

Ofc, in the early days, the source of data may be a proxy user until the big boys get involved. Maybe use multi sig to give more confidence. Maybe even have a plugin which pulls the data and asks other nodes to confirm and counter sign it too.

Having immutable data integrated natively from the lowest level on the network gives lots of options here.

Edit: To add, what would be really cool is not having to depend on centralised sources of truth at all. Maybe a network of folks taking weather readings or observing price data, then publishing them, etc., would be feasible.

3 Likes

If it can be stored on the Safe network, can it be stored on the storj network?

Storj is operating right now, so it might function as a temporary solution until the Safe network comes online.

The blockchain is a database with indexing.

Yes, it could be stored on Safe, and I’ve suggested that as blocks are confirmed they could be added to the copy on Safe, with the signatures of the confirming nodes added as well, so that any invalid block is identified and ignored by any apps.

Now the fundamental issue is speed. While Safe is fast, a node with the blockchain stored locally on its disk will be able to access the blockchain and do the lookups needed to verify balances much, much faster than a node using the copy on Safe. Plugin or no plugin, the issue is the lag across the wire for each lookup: 100 ms instead of sub-ms for each read/lookup.
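A back-of-the-envelope illustration of that gap, using the 100 ms round-trip and sub-ms local read from above (the lookup count is an arbitrary example):

```python
LOOKUPS = 100_000        # arbitrary number of index lookups during a scan
NETWORK_RTT_MS = 100.0   # ~100 ms per remote read over the wire
LOCAL_READ_MS = 0.5      # sub-ms per local disk read

remote_total_ms = LOOKUPS * NETWORK_RTT_MS  # 10,000,000 ms, nearly 3 hours
local_total_ms = LOOKUPS * LOCAL_READ_MS    # 50,000 ms, under a minute

print(remote_total_ms / local_total_ms)  # 200.0x slower over the wire
```

Per-lookup latency, not bandwidth, dominates; which is exactly why the bulk-download case in the next paragraph fares so much better.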

But where the copy on Safe shines is for people wanting to build or rebuild their node. Downloading the whole blockchain, or a portion of it, would be able to max out most consumer internet connections.

5 Likes

@Antifragile ,

While I usually like what you write, I stopped reading the single gigantic para above because it is too hard on my eyes and too hard to follow… could you edit the post and add some para breaks?

4 Likes

Maybe the enter key was broken

1 Like

I write from my smartphone most of the time, so my texts here are full of misspellings and ugly formatting. I refuse to come back to my older posts and format them when I am on a PC :)

1 Like