SAFE Network Dev Update - March 26, 2020

@bochaco @lionel.faber @StephenC

What would be of most use to you today?

  • Further stress testing - large runs (>200 iterations) of modest test size using random data
  • Pushing the edge with 3GB+ files
  • Refining the test script to make it more user-friendly

or something else?

4 Likes

Hi @Southside

I just looked at @davidpbrown's log and it seems like there's a self-encryption issue there, so pushing it past 3 GB might always return the same issue, and that can be skipped (for now :wink:)

If you have the time, it would be nice to see how the vault responds to larger runs with more iterations :slight_smile:
Thanks for this!

4 Likes

Here are 1000 runs of 50kB from last night, in CSV format.
Still need to convert the last column from text to numbers but I bet your sed/python/whatever skills are better than mine.

https://pastebin.com/LkbsM5e9
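For a quick look at the numbers before charting, here is a rough sketch, assuming the CSV has the Run,Size,Duration layout described further down (Duration as mins:seconds) and no header line; the filename and sample rows here are made up:

```shell
# Two sample rows in the same shape as the pastebin data
# (the real file has 1000 rows; runs.csv is a stand-in name)
printf '1,50k,0:05.00\n2,50k,0:07.00\n' > runs.csv

# Min/mean/max over the Duration column, converting M:SS.ss to seconds
awk -F, '
  { split($3, t, ":"); s = t[1] * 60 + t[2]
    sum += s
    if (NR == 1 || s < min) min = s
    if (NR == 1 || s > max) max = s }
  END { printf "runs=%d min=%.2f mean=%.2f max=%.2f\n", NR, min, sum/NR, max }
' runs.csv
```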

This is the environment:

START

Fri 27 Mar 01:12:35 GMT 2020
CPU(s): 4
Model name: Intel(R) Core™ i5-2500K CPU @ 3.30GHz
16378512 K total memory
14648316 K total swap
DISTRIB_DESCRIPTION="Ubuntu 18.04.4 LTS"
Linux 4.15.0-91-generic x86_64

EDIT: @happybeing can you give this a try with VisLab, please?

3 Likes

Rename it with an additional .txt suffix.

1 Like

doh!!!
Sometimes I am really dense

Haven't tried uploading in parallel yet… two terminals doing `safe put` on files large enough that they'd necessarily be running side by side.

1 Like

I tried a couple of days ago but I was uploading the same file and ran into the error you noticed about a file being uploaded twice. I just put it down to the Access Denied issue.

I can have a quick go at that now

Will do - what do the columns refer to? If you can update the script to include those as the first line that would be nice, for example:

Run,Size,Duration

I'll start with that.

Sorry

Run number, Data size, elapsed time in mins:seconds

I need to strip the mins - multiply by 60, lose the ':' and add to the seconds

time %e might do that as seconds directly??
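One way to do that strip-and-multiply on the existing log, as a minimal awk sketch; it assumes the last column looks like 1:23.45, and the filenames are made up:

```shell
# A sample row standing in for the real log output
printf '1,50k,1:23.45\n' > runs.csv

# Rewrite the third column from M:SS.ss to plain seconds:
# split on ":", then minutes * 60 + seconds, e.g. 1:23.45 -> 83.45
awk -F, 'BEGIN { OFS = "," }
  { split($3, t, ":"); $3 = sprintf("%.2f", t[1] * 60 + t[2]); print }
' runs.csv > runs-seconds.csv

cat runs-seconds.csv
```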

1 Like

You mean instead of %E in

/usr/bin/time -p -o $LOG_FILE -a -f "\t%E "

Yes… man time suggests that… though not in tcsh, whatever that tc shell is

1 Like

Yes, I had to be careful to use /usr/bin/time and not the built-in bash time

For this data I think best to play with Vega Voyager. First though:

  • Change .txt to .csv
  • Insert this as the first line:

    Run,Size,Duration

  • Remove any other stuff (e.g. your comment at the bottom)
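The three prep steps above can be sketched in the shell like this (the filenames and sample lines are made up, and step 2 relies on GNU sed's in-place `1i` insert):

```shell
# Sample log standing in for the real script output
printf '1,50k,0:05.12\n2,50k,0:04.98\nall done\n' > results.txt

# 1. Change .txt to .csv
mv results.txt results.csv
# 2. Insert the header as the first line (GNU sed: "1i" inserts before line 1)
sed -i '1i Run,Size,Duration' results.csv
# 3. Remove any other stuff - keep only the header and data rows
grep -E '^(Run,|[0-9])' results.csv > tmp.csv && mv tmp.csv results.csv

cat results.csv
```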

I have Vega Voyager embedded in VisLab, but you may as well go straight to it at Voyager 2 and then:

  • Load
  • Paste or Upload Data
  • Browse
  • Choose your ".csv" file

Hey presto, you have some best-guess charts. Choose one you like by clicking the 'Specify' icon at the top right of the mini-chart (hovering over it will show 'Specify'). That promotes that chart (I chose the one on the left, which is a histogram) and shows a new set of best-guess charts based on it. And so on.

I havenā€™t played much with Voyager, but it is very impressive for pulling in some data and doing quick analysis. You can create a wide variety of charts using it (drag and drop fields on the left) or use the interactive UI to explore very fast.

Hope this helps. If we need something better I'll look at adding it into a build of VisLab.

2 Likes

Excellent thank you.
I'm juggling 3 things at once here; going to stop and have a nice relaxing glass of Turkish chai and get torn in again in a few mins.

1 Like

Breaks are good. I tend to go stand on the prow of my boat and survey my lands (and waters). :wink:

2 Likes

You don't have a prow, you just have a slightly less blunt end. :laughing:

BTW, any more swans to be seen now that Charlie the Peado Protector and Virus Spreader has scuttled off to his bolthole in the Highlands?

Running the same command in different terminals (and different dirs) shows one fast result with medium CPU usage, and one very slow with much higher CPU usage.

1 Like

Nice one - that seems to work :slight_smile:

removed this post as it added no value!

3 Likes