ClientError: Did not receive sufficient ACK messages from Elders to be sure this cmd passed

From a vdash PoV, all looks well this morning; nothing new.

However, I did notice a few general errors:

|---+-------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------------|
| E | /fgfs/Scenery/Orthophotos/w020n20/w017n28/2678144.dds | <ClientError: Did not receive sufficient ACK messages from Elders to be sure this cmd (MsgId(9e2f..4936)) passed, expected: 7, received 6.> |
|---+-------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------------|
| E | /fgfs/Scenery/Orthophotos/w020n20/w017n28/2678145.dds | <ClientError: Did not receive sufficient ACK messages from Elders to be sure this cmd (MsgId(3ea4..0abd)) passed, expected: 7, received 6.> |
|---+-------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------------|
| E | /fgfs/Scenery/Orthophotos/w020n20/w017n28/2678146.dds | <ClientError: Did not receive sufficient ACK messages from Elders to be sure this cmd (MsgId(b153..41fa)) passed, expected: 7, received 6.> |
|---+-------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------------|
| + | /fgfs/Scenery/Orthophotos/w020n20/w017n28/2678147.dds | safe://hyryyyypwpgbunsizkktasinoh9phkfjkx5y7nahxpkm14ig793ysa4nkyc                                                                          |
|---+-------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------------|
| + | /fgfs/Scenery/Orthophotos/w020n20/w017n28/2678152.dds | safe://hyryyyyxeitf6dr99ymss5gafu15nqf6p9knhj9c6ph397q9eus6dw94qhr                                                                          |
|---+-------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------------|
| E | /fgfs/Scenery/Orthophotos/w020n20/w017n28/2678153.dds | <ClientError: Did not receive sufficient ACK messages from Elders to be sure this cmd (MsgId(ac4c..9f95)) passed, expected: 7, received 6.> |
|---+-------------------------------------------------------+---------------------------------------------------------------------------------------------------------------------------------------------|

With 40 nodes, 14 of them elders, I was not expecting to see these errors.
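
For context, here is a minimal sketch of what that check appears to boil down to, assuming the client wants an ACK from every one of the section's 7 elders before it calls the cmd passed. All names and timings below are made up for illustration; this is not the real sn_client code:

    // Hypothetical sketch: count elder ACKs for a cmd; if any elder
    // fails to ACK before the channel drains, report the shortfall.
    use std::sync::mpsc;
    use std::time::Duration;

    const ELDER_COUNT: usize = 7; // elders per section, as shown below

    fn main() {
        let (tx, rx) = mpsc::channel();

        // Simulate 7 elders; one never ACKs (dropped, or just too slow).
        for elder in 0..ELDER_COUNT {
            let tx = tx.clone();
            std::thread::spawn(move || {
                if elder != 3 {
                    let _ = tx.send(elder); // this elder ACKs the cmd
                }
            });
        }
        drop(tx); // close our own sender so the recv loop can finish

        // Count ACKs until the channel closes or the deadline passes.
        let mut received = 0;
        while rx.recv_timeout(Duration::from_secs(1)).is_ok() {
            received += 1;
        }

        if received < ELDER_COUNT {
            // Same shape as the errors in the table above.
            eprintln!(
                "ClientError: Did not receive sufficient ACK messages from \
                 Elders to be sure this cmd passed, expected: {}, received {}.",
                ELDER_COUNT, received
            );
        } else {
            println!("cmd passed: {}/{} ACKs", received, ELDER_COUNT);
        }
    }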

willie@gagarin:~/projects/maidsafe/DBC-testing$ safe networks sections
Network sections information for default network:
Read from: /home/willie/.safe/network_contacts/default

Genesis Key: PublicKey(10dc..8228)

Sections:

Prefix '0'
----------------------------------
Section key: PublicKey(0302..a5f2)
Section keys chain: [(PublicKey(10dc..8228), 18446744073709551615), (PublicKey(057f..e8b0), 4), (PublicKey(073b..066d), 3), (PublicKey(05d7..ab05), 7), (PublicKey(092c..355a), 2), (PublicKey(036e..f83e), 0), (PublicKey(0302..a5f2), 1), (PublicKey(15cc..90d5), 5)]

Elders:
| XorName  | Age | Address         |
| 0c1e0e.. |   5 | 127.0.0.1:55003 |
| 0c2894.. |   5 | 127.0.0.1:48143 |
| 20b3c0.. |   5 | 127.0.0.1:54053 |
| 20d6e4.. |   5 | 127.0.0.1:44822 |
| 3bd7c3.. |   5 | 127.0.0.1:52529 |
| 6c3bf3.. |   5 | 127.0.0.1:48474 |
| 7e84e9.. |   5 | 127.0.0.1:59375 |

Prefix '1'
----------------------------------
Section key: PublicKey(1128..ab2c)
Section keys chain: [(PublicKey(10dc..8228), 18446744073709551615), (PublicKey(057f..e8b0), 6), (PublicKey(09a8..9293), 1), (PublicKey(073b..066d), 4), (PublicKey(05d7..ab05), 8), (PublicKey(1128..ab2c), 2), (PublicKey(092c..355a), 3), (PublicKey(036e..f83e), 0), (PublicKey(15cc..90d5), 7)]

Elders:
| XorName  | Age | Address         |
| 8182ab.. |   5 | 127.0.0.1:46038 |
| 82d738.. |   5 | 127.0.0.1:37871 |
| 848e35.. |   5 | 127.0.0.1:33732 |
| 9bb92a.. |   5 | 127.0.0.1:43723 |
| e3609d.. |   5 | 127.0.0.1:57635 |
| b2eb7c.. |   6 | 127.0.0.1:60490 |
| 9666b1.. | 255 | 127.0.0.1:50163 |

I have full trace logs; do you want these zipped up and sent? @joshuef @chriso

I’d wager, if we look at the shortfalling msg IDs, that we probably respond, but perhaps a bit late (i.e., the client has already bailed by that point).

That timing issue is something we’re still digging into.
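
A minimal sketch of that race, with made-up timings: the elder does send its ACK, but the client's per-cmd deadline has already expired, so the late ACK is never counted:

    use std::sync::mpsc;
    use std::thread;
    use std::time::Duration;

    fn main() {
        let (tx, rx) = mpsc::channel();

        // The 7th elder ACKs, but slower than the client will wait.
        thread::spawn(move || {
            thread::sleep(Duration::from_millis(500)); // slow elder
            let _ = tx.send("ACK from elder 7");
        });

        // Client-side deadline shorter than the elder's response time.
        match rx.recv_timeout(Duration::from_millis(100)) {
            Ok(ack) => println!("counted: {ack}"),
            Err(_) => println!("client bailed: the ACK arrives later, but is never counted"),
        }
    }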

But shoot ’em over, aye, I’ll give it a :duck: :duck: :+1:

(I assume this was from main, @Southside?)

Yes, I pulled and built from main at about 20:00 UTC last night.
I’ll zip up the logs; DM me where you want them sent.

Is this the equivalent of copying safe and sn_node into the correct dir and running safe node run-baby-fleming --nodes 40?

I think so. Because I was dubious about whether safe node run-baby-fleming was working properly on the latest main, I pulled the source, built it, and ran it via cargo run.
It appears to do exactly the same thing as safe node run-baby-fleming --nodes 40, but you’d need one of the devs to confirm that explicitly. There may be subtleties…

I can’t say I ever use safe node run-baby-fleming myself these days, so I’m not sure what differences there may be, although I believe it maps to the testnet bin more or less…

This is essentially how I start my testnets :+1:

    NODE_COUNT=40 RUST_LOG=sn_node=trace cargo run --release --bin testnet

Had a scan of your logs. You were hitting this, @Southside: fix(node): prevent endlessly forwarding client msg if we're data holder by joshuef · Pull Request #2086 · maidsafe/safe_network · GitHub

Thankfully, it’s already sorted and in main.
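
Going by the PR title alone, the fix is presumably a guard of roughly this shape: check whether we are ourselves a holder of the data before forwarding the client msg, and handle it locally if so, rather than forwarding it round in circles. All names below are hypothetical, not the actual sn_node code:

    #[derive(Clone, Copy, PartialEq, Eq, Debug)]
    struct NodeId(u8);

    // Hypothetical: the nodes closest to the data's xor address.
    fn data_holders_for(_data_addr: &str) -> Vec<NodeId> {
        vec![NodeId(1), NodeId(2), NodeId(3), NodeId(4)]
    }

    fn handle_client_msg(us: NodeId, data_addr: &str) {
        let holders = data_holders_for(data_addr);
        if holders.contains(&us) {
            // We hold this data ourselves: process locally, don't forward.
            println!("{us:?}: storing/serving {data_addr} locally");
        } else {
            // Not a holder: forward on towards the holders.
            println!("{us:?}: forwarding {data_addr} to {holders:?}");
        }
    }

    fn main() {
        handle_client_msg(NodeId(2), "/fgfs/.../2678144.dds"); // holder: no forward
        handle_client_msg(NodeId(9), "/fgfs/.../2678144.dds"); // not a holder: forward
    }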

@neo, please would you clean up by moving the non-vdash replies (from here down) to a new topic? Thanks :pray:t3:

Thanks @JPL! Shhh don’t wake Rob :zzz:
