This is now a wiki - please add your own thoughts and correct any mistakes I may have made.
So - how can we best prepare for Testnet v5?
Off the top of my head…
Remember, you can assist with testing without necessarily running a node immediately, or even at all.
Have some data prepared that you can start uploading as soon as T5 goes live: some simple single text files and images that can easily be downloaded to confirm functionality and build confidence in both the PUT and GET commands.
Also have some larger folders and sub-folders ready to upload, so that we can practise using the full range of the safe files sub-commands.
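For anyone who wants a head start on this, here is a minimal sketch in Python that generates such a test set locally. The file names and layout are just examples, not anything official:

```python
import os

def make_test_set(root: str) -> list[str]:
    """Create a small, predictable tree of files to upload as soon as T5 opens.

    The layout below is only an example:
      hello.txt          - tiny text file for a quick PUT/GET sanity check
      folder/a.txt       - nested files to exercise recursive uploads and
      folder/sub/b.txt     the rest of the `safe files` sub-commands
    """
    rels = ["hello.txt", "folder/a.txt", "folder/sub/b.txt"]
    created = []
    for rel in rels:
        path = os.path.join(root, rel)
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "w") as f:
            f.write(f"T5 test file: {rel}\n")
        created.append(path)
    return created
```

Once generated, the whole tree can be uploaded in one go (in earlier CLIs this was `safe files put <dir>/ --recursive`, though the exact flags may differ in T5) and a single file fetched back with `safe cat` to confirm both PUT and GET.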
Further to the above, if you have websites that you built for earlier test networks, try uploading these again. This will let us get more confidence with the safe nrs sub-commands as well. I note also that GitHub - maidsafe/sn_nodejs has been getting some love today… EDIT: lower priority because the browser won't be ready for T5.
We have a few days before T5 will land - time that can be used to build your own sites for the SAFE Network. Ignore - the browser won't be ready in time.
safe://yvette was an early directory of many resources on SAFE back in the day - Hopefully this can be revived/refreshed
There was a great single page site that simply said “Hello World” in many different languages - it'd be nice to see that reincarnated too.
Dig out your Pis, Odroids and other SBCs if you haven't already and get them prepared to act as nodes. A LOT has been learned over the past couple of weeks; let's get that know-how shared, refined and all in one location, to make it as easy as possible for the largest number of folk to try running a node. This goes for desktops, laptops and cloud-based instances as well, not just Pis and other small interesting things.
Keeping a public list of a set of uploads, then testing on occasion that they are still retrievable, might be useful - a statement of confidence in the network's ability to recall 100% over time.
That simple index is predicated on it being practical from here on… I'm holding off until the network can reliably tolerate a run of queries from a script, and it is most useful once there's a browser available. Competitors are welcome - it's not rocket science.
Add to the list: the idea above of keeping a public list of uploads and periodically confirming they are still retrievable.
I think it would be useful to have a simple performance gathering tool.
For example, a program which repeatedly generates, uploads and downloads files of several different sizes containing random data, saving the results to a CSV format text file as it goes, along with some output to the screen.
This would be a useful tool to compare performance across different testnets, as well as to show how an individual test performed over time.
@MaidSafe do you already have anything along these lines which one of us could adapt if needed for general use?
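A rough sketch of such a tool, assuming the v5 CLI still exposes `safe files put` (the command name and its output format are assumptions and may need adjusting). The random-file generation and CSV appending work offline; only `run_once()` touches the network, so it is not called automatically:

```python
import csv
import os
import subprocess
import tempfile
import time

SIZES = [1_000, 100_000, 1_000_000]  # bytes; pick whatever sizes are interesting

def random_file(size: int) -> str:
    """Write `size` random bytes to a temp file and return its path."""
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(os.urandom(size))
    return path

def timed(cmd: list[str]) -> tuple[float, str]:
    """Run a command, returning (elapsed seconds, stdout)."""
    start = time.monotonic()
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    return time.monotonic() - start, out

def append_row(csv_path: str, row: list) -> None:
    """Append one result row, writing a header first if the file is new."""
    new = not os.path.exists(csv_path)
    with open(csv_path, "a", newline="") as f:
        w = csv.writer(f)
        if new:
            w.writerow(["timestamp", "size_bytes", "op", "seconds"])
        w.writerow(row)

def run_once(csv_path: str = "perf.csv") -> None:
    """Network-touching part: assumes `safe files put <file>` uploads a file.

    Scraping the xorurl back out of `out` so the file can be timed on
    download too is left open here, since the output format has changed
    between testnets. Call run_once() manually when a testnet is up.
    """
    for size in SIZES:
        path = random_file(size)
        secs, out = timed(["safe", "files", "put", path])
        append_row(csv_path, [time.time(), size, "put", f"{secs:.3f}"])
        print(f"put {size} bytes in {secs:.3f}s")
```

Run periodically (e.g. from cron), this appends one CSV row per upload, which makes it easy to chart a single testnet over time or compare testnets side by side.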
And as expected @happybeing read my mind on that one.
Will this be somewhere that Seq data will be useful?
I need to learn more about the capabilities of this, appending rows to a CSV? Or am I over-complicating?
I have a Pi 2B lying around that I hardly ever used. What OS image do you recommend for SN, and how do I put it on there? Total Linux noob here who just took his first steps on Ubuntu, mind you.
I'm not that happy either, but I think the priority will be to upload as much as possible, “garbage” or not, to generate chunks for new nodes to store. I'm sure if this changes, the devs will let us know soon enough.
According to the (old) architecture, the most rewarded node will be the one closest to the median specification of all the nodes on the network. I'm not sure if there is any data about this yet.
I'm thinking that perhaps “uploading garbage” could be improved by keeping a list of all the xorurls obtained when uploading, so that later (after hours/days) it can be confirmed whether the data is still there or not. Perhaps just store them in a text file which can then be fed into a second script that checks if they are still on the network with safe cat.
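A sketch of both halves of that idea: scraping xorurls out of upload output, and re-checking a saved list later. It assumes xorurls look like `safe://<base-encoded string>` and that `safe cat <url>` exits non-zero when the data can't be fetched - both reasonable guesses, but worth verifying against the actual T5 CLI:

```python
import re
import subprocess

XORURL_RE = re.compile(r"safe://[A-Za-z0-9]+")

def scrape_xorurls(upload_output: str) -> list[str]:
    """Pull every safe:// URL out of the CLI's upload output, in order."""
    return XORURL_RE.findall(upload_output)

def check_urls(list_file: str) -> dict[str, bool]:
    """Re-fetch each saved xorurl with `safe cat`; True means still retrievable.

    `list_file` is a plain text file with one xorurl per line, as produced
    by dumping scrape_xorurls() results after each upload session.
    """
    results = {}
    with open(list_file) as f:
        for url in (line.strip() for line in f):
            if not url:
                continue
            proc = subprocess.run(["safe", "cat", url], capture_output=True)
            results[url] = proc.returncode == 0
    return results
```

Appending each session's scraped URLs to one growing text file would give exactly the “statement of confidence” list suggested earlier in the thread.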
I'm working on a script that will scrape the xorurls on successful upload and then report them to a dedicated topic on here, so we can check each other's homework…
Well, it's a concept rather than being “in work” right now…
How do I automate a post onto this forum? That's only one of the questions I have right now.
Also, a script that would upload a set of standard test files, as mentioned above by @Michael_Hills, and record when and to which xorurl each was stored. Then we could see how many times these standard files were uploaded, and hopefully gain some info on how well deduplication is working - also upload performance.
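The deduplication part of that could be summarised from the records quite simply. This sketch assumes that, with self-encryption, identical content should always come back with the same xorurl on repeated uploads (and note the caveat that container-level URLs may also depend on paths or metadata, so it's the per-file URLs that are worth comparing):

```python
from collections import defaultdict

def dedup_report(records: list[tuple[str, str]]) -> dict[str, int]:
    """Given (filename, xorurl) pairs collected across repeated uploads,
    return the number of distinct xorurls seen per file.

    If deduplication is working, uploading the same standard file many
    times should yield a count of 1; anything higher means identical
    content produced different addresses and is worth investigating.
    """
    seen = defaultdict(set)
    for name, url in records:
        seen[name].add(url)
    return {name: len(urls) for name, urls in seen.items()}
```

Feeding it the (filename, xorurl) pairs scraped from each upload run would turn the shared homework-checking topic into a rough dedup scoreboard.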
Not sure how well that would work if you're generating many URLs… but if you keep them locally so they can be tested against the network hours later, that'll be really good I think.