How transactions (could) take place using Decorum

Let’s estimate what the average size of a transaction will be:

name: 32 bytes
signature: 64 bytes
previous_name: 32 bytes
output pubkeys: 32 bytes each
owner: 32 bytes

Let’s assume the average number of outputs in a transaction is 3. Then the above adds up to 32 + 64 + 32 + 3 × 32 + 32 = 256 bytes. There are some smaller fields that I left out, but they won’t add up to much, so let’s assume for convenience that the average transaction is 300 bytes.
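As a rough sketch of that arithmetic (the constants mirror the field sizes listed above; this is only an illustration, not Decorum’s actual wire format):

```rust
// Field sizes from the list above, in bytes. Hypothetical layout, used only
// to reproduce the size estimate.
const NAME: usize = 32;          // name
const SIGNATURE: usize = 64;     // signature
const PREVIOUS_NAME: usize = 32; // previous_name
const OUTPUT_PUBKEY: usize = 32; // per output pubkey
const OWNER: usize = 32;         // owner

fn estimated_tx_size(num_outputs: usize) -> usize {
    NAME + SIGNATURE + PREVIOUS_NAME + num_outputs * OUTPUT_PUBKEY + OWNER
}

fn main() {
    let core = estimated_tx_size(3); // 256 bytes with 3 outputs
    let assumed_avg = 300;           // round up for the smaller fields left out
    let history_len = 10_000;
    println!("core fields: {} bytes", core);
    println!("history of {} txs: ~{} MB",
             history_len, history_len * assumed_avg / 1_000_000);
    println!("address list: {} KB", history_len * NAME / 1_000);
}
```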

If there’s a history of 10K transactions to validate, that’s 3 MB of data. To avoid downloading them one by one, which would be incredibly slow, the sender of the transaction could include a list of all the addresses in the transaction history rather than just the address of the last one. That list would in this case be 10K × 32 bytes = 320 KB. The recipient can then download the transactions in parallel and verify them.
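A minimal sketch of that idea using plain threads; `fetch_transaction` is a hypothetical stand-in for whatever network call the real client would issue, and a real implementation would cap concurrency with a worker pool rather than spawning one thread per address:

```rust
use std::thread;

/// Hypothetical network fetch; placeholder for the actual SAFE/Decorum request.
fn fetch_transaction(_address: [u8; 32]) -> Vec<u8> {
    // Pretend we downloaded ~300 bytes for this address.
    vec![0u8; 300]
}

/// Download the whole history concurrently from the sender-provided address
/// list, instead of walking the chain one previous_name at a time.
fn fetch_history(addresses: Vec<[u8; 32]>) -> Vec<Vec<u8>> {
    let handles: Vec<_> = addresses
        .into_iter()
        .map(|addr| thread::spawn(move || fetch_transaction(addr)))
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    let addresses = vec![[0u8; 32]; 100]; // the address list sent along with the transaction
    let history = fetch_history(addresses);
    println!("downloaded {} transactions", history.len());
}
```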

There’d be 10K signatures to verify. I read in a paper that a 2010 Intel Xeon quad-core CPU, which now costs a little over a hundred dollars, can verify 71K Ed25519 signatures per second. At that rate, 10K signatures take roughly 0.14 seconds, so even on an outdated desktop CPU, verifying them likely won’t take longer than a second, assuming optimised software.
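For a ballpark check, a sketch like the following times a straight verification loop using the ed25519-dalek crate (assuming its 1.x API with rand 0.7); the paper’s numbers use batch verification, which is faster still:

```rust
use ed25519_dalek::{Keypair, Signature, Signer, Verifier};
use rand::rngs::OsRng;
use std::time::Instant;

fn main() {
    // Generate 10K (keypair, message, signature) triples to stand in for a history.
    let mut csprng = OsRng;
    let n = 10_000;
    let triples: Vec<(Keypair, Vec<u8>, Signature)> = (0..n)
        .map(|i| {
            let kp = Keypair::generate(&mut csprng);
            let msg = format!("transaction {}", i).into_bytes();
            let sig = kp.sign(&msg);
            (kp, msg, sig)
        })
        .collect();

    // Verify them one by one and time it.
    let start = Instant::now();
    for (kp, msg, sig) in &triples {
        kp.public.verify(msg, sig).expect("signature should verify");
    }
    println!("verified {} signatures in {:?}", n, start.elapsed());
}
```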

So I’m not worried about this. Even with hundreds of thousands or even millions of transactions to verify in the very distant future (when computing will be even faster), I don’t think verification will need to take longer than 5-10 seconds. Increases in computing speed may very well outpace the growth of the transaction history.
