Can I ask where datamaps are visible? I'm rather surprised that xorurl and dog don't obviously include a simple sense of what is happening and why… I wonder if that might help understanding, as well as being useful for app devs. Searching gives a sense of those, but without a test network it's unclear what multimaps and nrsmaps are. It would be useful later to have an end-to-end description with the CLI and/or API options, to help evidence and track the process of what is expected.
Thanks again to all for input and perseverance on hammering hard topics… looks like good progress again this week
We’re looking at concepts around a stable set of nodes
…isn't this a bit "late"? I mean, this is a "basic" (theoretical?) concept that's important for a functional network. It strikes me that you may be programming something that might not work, and then finding "basic" problems that could perhaps have been solved first.
This isn't meant as an offensive post; it's just my perspective, offered for consideration.
Thanks to all for the hard work to get us this far. I hope we can now get the token distribution put to bed. There have been a lot of changes, so it's up to us as a community to "trust but verify" the devs and do as much testing as we can to validate their work. We need further discussion on how the test net regulars can contribute most effectively here, and how we can entice others to join in this testing. We just got a new release, so for a while it will be relatively simple to join in, as there will be no need to install Rust and build from source.
However, even if you are not comfortable with joining the test nets, you can play your part by checking out the ChatGPT documentation project, where non-coders can make a real contribution to the project by using ChatGPT^ to generate comments on all of the source code in the various repos at MaidSafe · GitHub, eventually…
This needs a project leader, and for many reasons that should not be me. Their role will be to co-ordinate a few users to quickly learn enough of ChatGPT and its API (links in the original post) to find the correct phraseology to get ChatGPT to produce comments at an agreed level of detail. Once that is done, the work needs to be split into manageable chunks for a team of helpers to process and then collate into a suitable form, and in various languages.
If this sounds vague, that's because it is. I don't want to pre-empt whoever takes this on as leader; I just think using these newly available tools to produce badly needed docs is a good idea. It needs a bit of planning and a lot of slog, which can be minimised by intelligent use of the ChatGPT API, followed by collating all the output, ensuring we eventually have coverage of all relevant repos, and setting up a process to keep these docs current.
I know just enough to know what I don't know, and where others have better skills than me. So instead of me kicking this off and then calling for help when I get out of my depth, it's best to find someone else to lead it from the start.
I will contribute when I can but I have lots on my plate right now.
Having adequate documentation is a big plus, not only for our own use but in promoting the project and bringing in other dev talent. If we are going to make SAFE a success then, as well as a working network, we need visibility. Good documentation is key to that, AND helping to provide these docs is no longer the preserve of pure Rust geeks. ChatGPT seems to have liberated us from that, so let's make the fullest use of it.
You want to help move SAFE forward but feel you don't know enough (or any) Rust to be useful?
Here is your chance: either as the project leader, or as one of those who will run the scripts against the source files and help collate the output.
YOU can make a real difference here. Please check out the links and consider helping.
Thanks to all who have already indicated a willingness to assist here.
^ other AI tools are available and their use should be considered also.
What that probably meant to say was that the time taken for each increment is not linear but exponential.
Exp and log are inverses of each other, so it can be formulated either way. But that formulation there would need an update.
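To make the two equivalent formulations concrete, here is a minimal sketch. It assumes a doubling scheme (each age increment takes twice as long as the previous one), which is an illustrative assumption, not the actual parameters from the codebase: total time to reach age n is then exponential in n, and equivalently the age reached after time t is logarithmic in t.

```python
import math

def time_to_reach_age(n: int, base_interval: float = 1.0) -> float:
    # With doubling intervals, total time = base * (2^n - 1):
    # exponential in the age n.
    return base_interval * (2 ** n - 1)

def age_at_time(t: float, base_interval: float = 1.0) -> int:
    # Inverse formulation of the same rule: age = floor(log2(t/base + 1)),
    # logarithmic in elapsed time t.
    return int(math.log2(t / base_interval + 1))

# The two formulations round-trip exactly at the increment boundaries.
for n in range(1, 6):
    assert age_at_time(time_to_reach_age(n)) == n
```

Either way of writing it describes the same curve; the exponential form answers "how long until age n?" while the logarithmic form answers "what age after time t?".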
Great to read about experimentation with tinier virtual machines and small nodes.
I wonder if these stable nodes could potentially also tap into different storage media for the same data? If my hard drive holding the data corrupts, could the node serve the same data stored on my DVD or USB, or does a churn event change the data?
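On the storage-media question, one relevant property (as Safe's XOR addressing is generally described) is that chunks are content-addressed: the network address is derived from the bytes themselves, not from where a copy happens to live locally, so an intact copy read from any medium should serve as the same chunk, while churn reassigns which nodes are responsible rather than altering the data. A minimal sketch of that idea, using a plain SHA3-256 hash as a stand-in address (the actual derivation in the Safe codebase may well differ):

```python
import hashlib

def chunk_address(data: bytes) -> str:
    # Content addressing: the address is a function of the bytes alone.
    return hashlib.sha3_256(data).hexdigest()

chunk = b"the same immutable chunk"
addr_from_hdd = chunk_address(chunk)  # copy read back from a hard drive
addr_from_usb = chunk_address(chunk)  # identical copy read from USB/DVD
assert addr_from_hdd == addr_from_usb  # same bytes -> same address

# A corrupted copy no longer matches its address, so it is detectable.
assert chunk_address(b"the same immutable chunk!") != addr_from_hdd
```

So under that assumption, the medium doesn't matter as long as the bytes are intact, and corruption is detectable because the damaged copy no longer hashes to its address.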