[Status: offline] SAFE Network Testnet - Vaults from home with IGD - Iteration 2

Good to know! I think in the previous releases it worked without opening a new console, so I guess that’s why I was confused. I use Win 10 Pro v1903, OS build 18362.959, PowerShell version 5.1.18362.752.

Hope that helps.

3 Likes

Congratulations on so much hard work done - with success!!

4 Likes

I think we should perhaps indicate this on the CLI screen, or with every operation. Great feedback from the community as ever, though.

7 Likes

And possibly kill your vault, not yet though :smiley: :smiley:

2 Likes

Yes, we can probably catch any AccessDenied error from the CLI and give the hint about authorising the app.

9 Likes

I didn’t try running a vault since my router didn’t work on the last test. But I did try some file upload and download tests.

Time to upload various file sizes (in seconds)

Test    1K      900K     5M
1       3.344   11.404   36.414
2       3.437   72.426   44.263
3       3.593   11.846   42.984
4       3.558   11.573   43.234
5       3.304   11.781   69.693

Time to download (only 1 test of each size)

1K: 2.575s
900K: 12.887s
5M: 57.364s
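For anyone wanting to reproduce numbers like these, here is a minimal Rust timing harness, assuming only that the `safe` binary is on the PATH and that a test file exists at /tmp/1k.dat (the same path used later in this thread). It measures wall-clock time for the whole CLI invocation, including network I/O:

```rust
use std::process::Command;
use std::time::Instant;

fn main() {
    let start = Instant::now();
    // Run one upload; swap the path for 900K/5M test files as needed.
    let status = Command::new("safe")
        .args(&["files", "put", "/tmp/1k.dat"])
        .status()
        .expect("failed to launch `safe`");
    println!(
        "exit: {}, elapsed: {:.3}s",
        status,
        start.elapsed().as_secs_f64()
    );
}
```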


The AccessDenied error would benefit a lot from a hint. This is the error:

$  safe files put /tmp/1k.dat
[2020-07-16T23:47:28Z ERROR safe] safe-cli error: [Error] NetDataError - Failed to store Public Sequence data: Data error -> Access denied - CoreError::DataError -> AccessDenied

A hint such as "Try logging in with safe auth login --self-auth" would be really helpful, as it suggests how to solve the problem.
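As a hedged sketch of what that hint could look like in the CLI's error handling; the error type and messages here are made up for the example, not the real safe-cli types:

```rust
// Illustrative only: a hypothetical CLI error type with the hint
// proposed in this thread attached to the AccessDenied case.
#[derive(Debug)]
enum CliError {
    AccessDenied,
    Other(String),
}

fn report(err: CliError) {
    match err {
        CliError::AccessDenied => {
            eprintln!("[Error] NetDataError - Access denied");
            // The suggested hint:
            eprintln!("Hint: try logging in first with `safe auth login --self-auth`");
        }
        CliError::Other(msg) => eprintln!("[Error] {}", msg),
    }
}

fn main() {
    report(CliError::AccessDenied);
}
```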


Help could do with some extra info; I really struggled to work out how to upload files.

This command would benefit from listing the --self-auth flag.

$  safe auth login help
error: Found argument 'help' which wasn't expected, or isn't valid in this context

USAGE:
    safe auth login [FLAGS] [OPTIONS]

For more information try --help

This line in the help would benefit from mentioning the --self-auth flag:

$  safe auth help
...
login          Send request to a remote Authenticator daemon to log in to a SAFE account
...
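Note that the error output above already points at the accepted form, `safe auth login --help`, which is where the flag should be listed. As a sketch (illustrative names, not the actual safe-cli source), a structopt-style command definition surfaces a flag and its description in that help text via doc comments:

```rust
// Illustrative only; requires the structopt crate. Doc comments on the
// struct and its fields become the descriptions printed by `--help`,
// so documenting --self-auth here would make it show up in the help.
use structopt::StructOpt;

/// Send request to a remote Authenticator daemon to log in to a SAFE
/// account, or run a local Authenticator when --self-auth is passed
#[derive(StructOpt, Debug)]
struct LoginCmd {
    /// Log in with a locally-run Authenticator instead of a remote daemon
    #[structopt(long = "self-auth")]
    self_auth: bool,
}

fn main() {
    let cmd = LoginCmd::from_args();
    println!("self_auth = {}", cmd.self_auth);
}
```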
8 Likes

I am guessing I just did not notice before, but my chunks are showing up as types: image, html/text or binary. Has this always been the case?

2 Likes

Your script worked perfectly. This is my first participation in a test net! :slight_smile: Thanks! So cool!

5 Likes

Not yet. We got the js libs built last night, so I’m updating various browser bits and trying to fix some broken CI this morning, then hopefully we’re good there :+1:

5 Likes

Honestly not sure. I’d guess e.g. small images (<1 MB) are one chunk, so you have the whole file, and your computer can infer the type even without the extension there.

I know chunk obfuscation is not in place yet, so this isn’t entirely unexpected. But it will not be so for release!

2 Likes

UPDATE:

I’d like to thank everyone for their active participation in this testnet. Around 06:00 GMT the vault processes ran out of memory and were killed by the operating system. We will look into this and start another testnet in the near future.

19 Likes

This did the trick. Maidsafe just spoiled us with Snapp; I used to log into the CLI without even thinking about it. :sweat_smile:

5 Likes

This gives us a target to confirm now: any growable arrays etc. in the code that must not grow indefinitely. As we get parsec out this should be much simpler to confirm, but we can start on the vaults right now. Nice one, folks.

13 Likes

That’s strange. I have the feeling that I have already experienced this situation. This happens to me from time to time.

5 Likes

Is this a parsec problem?

1 Like

No, I don’t think so, but parsec is a mem leak for sure. We have it contained, but it is still a lump of mem we could use. It is much more likely a small issue we have in parts of the code that cache or hold data, either for a period of time or as a list of values. Both are wrong, but you can make them more right by bounding the duration and the list length; where possible we should not have unbounded mem containers at all. There is a design pattern that stops this (see the sketch below), and we are currently looking to cement it. Basically, nodes should use very little RAM, and data to be held has to be on disk where possible.

So a little housekeeping for us, but the great news is we got pretty much all functions and features working here. All in all this is great, as the functional design seems right on track; it is just some practical data-handling tidy-up, and that is not hard at all. Generally it will mean even less code, and much less of us humans imagining we know how long, or how many, things we can keep in memory. The key is that as close to zero as possible works; anything further than that fails. Much more message passing, rather than do-everything nodes, is the key.

It’s just decentralised networking: getting us away from code that works on servers with unlimited RAM and CPU (clusters). Nodes must be small, resource-light and extremely efficient. All in all, a great place to be in for us all.
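Not from the vault codebase, just a minimal Rust sketch of the bounded-container pattern described above: the cache is capped both by entry count and by entry age, so its memory use has a hard ceiling. Names and limits are illustrative.

```rust
use std::collections::VecDeque;
use std::time::{Duration, Instant};

struct BoundedCache<T> {
    entries: VecDeque<(Instant, T)>,
    max_len: usize,
    max_age: Duration,
}

impl<T> BoundedCache<T> {
    fn new(max_len: usize, max_age: Duration) -> Self {
        Self { entries: VecDeque::new(), max_len, max_age }
    }

    fn push(&mut self, value: T) {
        self.evict_expired();
        // Length bound: drop the oldest entry instead of growing without limit.
        if self.entries.len() >= self.max_len {
            self.entries.pop_front();
        }
        self.entries.push_back((Instant::now(), value));
    }

    // Duration bound: anything older than max_age is removed.
    fn evict_expired(&mut self) {
        let now = Instant::now();
        while let Some((t, _)) = self.entries.front() {
            if now.duration_since(*t) > self.max_age {
                self.entries.pop_front();
            } else {
                break;
            }
        }
    }
}

fn main() {
    let mut cache = BoundedCache::new(3, Duration::from_secs(60));
    for i in 0..10 {
        cache.push(i);
    }
    assert!(cache.entries.len() <= 3); // never exceeds the ceiling
}
```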

15 Likes

Déjà vu for sure! :smile:

5 Likes

Can anyone confirm @Southside’s whereabouts at the time?

17 Likes

:rofl::rofl::rofl::rofl::rofl::rofl:

2 Likes

UPDATE:

We have been looking into the issue and trying to identify the cause. From our observations, it doesn’t look like there was a sudden spike in memory that caused the OOM error. So it is most likely that certain components of the testnet are using more memory than they should. To help us identify this we will be doing memory profiling on the Vault processes, and we would like you all to join the network once again. But the goal this time is to take it down. :smile:

Instructions to join are exactly the same (see OP). Do reach out if you have any questions.
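As a hedged illustration of what in-process memory profiling can look like in Rust (not necessarily the tooling MaidSafe will use), a counting wrapper around the system allocator tracks live heap bytes:

```rust
use std::alloc::{GlobalAlloc, Layout, System};
use std::sync::atomic::{AtomicUsize, Ordering};

// Bytes currently allocated on the heap.
static LIVE_BYTES: AtomicUsize = AtomicUsize::new(0);

struct CountingAlloc;

unsafe impl GlobalAlloc for CountingAlloc {
    unsafe fn alloc(&self, layout: Layout) -> *mut u8 {
        LIVE_BYTES.fetch_add(layout.size(), Ordering::Relaxed);
        System.alloc(layout)
    }

    unsafe fn dealloc(&self, ptr: *mut u8, layout: Layout) {
        LIVE_BYTES.fetch_sub(layout.size(), Ordering::Relaxed);
        System.dealloc(ptr, layout)
    }
}

#[global_allocator]
static ALLOC: CountingAlloc = CountingAlloc;

fn main() {
    let buf = vec![0u8; 1 << 20]; // allocate 1 MiB
    println!("live heap: {} bytes", LIVE_BYTES.load(Ordering::Relaxed));
    drop(buf);
    println!("after drop: {} bytes", LIVE_BYTES.load(Ordering::Relaxed));
}
```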

Thanks in advance!

13 Likes