Using the network to receive message/other notifications

Maybe eventually, but that’s probably a decade or so away, and won’t work everywhere. If I upload a 4 GB video from my desktop, I’d want my laptop to automatically sync it for later use when I don’t have internet.

If Safe doesn’t push file updates to user accounts, then surely another project will end up doing it instead. Like Dropbox, except instead of the app sending data to Dropbox servers, it will be sending it to Dropbox’s Safe account.

Do you mean you want all your 400,000 files on every device all the time? Dropbox does indeed do this for desktops, AFAIK, but it is not great. For offline use (when will we get to never being offline?) it's best not to cache whole files but chunks instead. These are encrypted and obfuscated, and useless without you being logged in.

The real goal is to never be offline, and I realise that's not the reality today, but I am encouraged by global networks, such as satellite constellations. Selective caching of chunks is certainly possible until then; however, if you are offline while flying, you could be forced to hand over your password to decrypt that data.

I am not sure what this part means, it does seem like you want all files on all of your devices. Perhaps you can clarify this.

3 Likes

Do you mean you want all your 400,000 files on every device all the time?

All 400,000 files on at least two devices (desktop and laptop), but with a subfolder just for my phone.

It is not great though. For off-line (when will we get to never being offline?) then its best to not have the data cached but chunks instead.

I guess locally stored chunks could work for this, since it’s still readable without internet. A lot of people think Dropbox is great. Its only real problems are being proprietary and centralised.

It seems like the intuitive solution is for the network to respond to file uploads by sending a notification to the uploader’s account, and to respond to file updates by sending notifications to all accounts that are subscribed to that file. No idea how tricky that would be to implement if it’s not already a core feature of course.

Then it would be up to the app to decide whether to redownload the file (most cases) or leave it unsynced (low bandwidth conditions, or low storage capacity phone).
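The app-side decision described above can be sketched in a few lines. This is purely illustrative, assuming a hypothetical `Notification` record; the fields and the bandwidth/storage checks are invented for the sketch, not part of any real Safe API.

```python
from dataclasses import dataclass

# Hypothetical notification record; nothing here is a real Safe API.
@dataclass
class Notification:
    file_id: str
    version: int
    size_bytes: int

def handle_notification(note, local_version, free_bytes, low_bandwidth):
    """Decide whether to redownload an updated file or leave it unsynced."""
    if note.version <= local_version:
        return "ignore"          # we already have this version
    if low_bandwidth or note.size_bytes > free_bytes:
        return "defer"           # low bandwidth or low storage: leave unsynced
    return "download"            # normal case: redownload the file
```

The point is simply that the network only delivers the notification; the redownload-or-skip policy lives entirely in the app.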

Until everyone has internet everywhere (I don’t see this happening — underground trains, insulated music departments, countries under oppressive regimes, space flight, etc), syncing will remain popular.

Maybe this goes off topic though…

An app could do this, combining NFS and messaging. When you change a file, it messages you. All your devices pick that up and store that file in a local cache on every device where you are logged in or will log in (you do not want to be logged in all over the place). That way you can have all your 400,000 files offline, either as chunks or as whole files. I think then you are in no better a place than Dropbox puts you, really: wide open and exposed on every device. I am sure some folk will want this though, and an app could easily implement it.

3 Likes

Maybe this topic also contains useful information:

PS Another pubsub system is MQTT. I wonder if stuff like ‘Last Will and Testament’ could be implemented.
Edit: something that the app should implement I guess.

1 Like

That sounds good mostly. But how would my account get notified when someone else’s account updates a file that I’m syncing? Wouldn’t that require them to have the app installed too?

1 Like

Ah, that is private shared data, not just your own. This is a different issue. As long as you are happy with the your-own-data part, we can move on to that.

In private shared data, a user (or app) would currently have to poll the network for changes. This could be made much better with push notifications, which are entirely possible. For that we need a pub/sub type mechanism, which is also fine; all of the mechanics are there for that.

The part that the core tech does not have right now is push notification. This is pretty easy to add, though. When a client connects, it connects to its close group, which can then cache messages for that client to pick up on login. So that part is OK. However, you log in on several devices and want all your data on them all. At the moment this scenario is not catered for, as your login would sync your data. An app would almost need to resend these messages to the client group for the next device to log in. I am not sure it is smart at all to have your actual data on machines like this, for this reason and many others.
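The close-group caching idea above can be modelled very simply: the group buffers messages for an offline account and hands them over at login. This is a toy in-memory sketch with invented names, not how the real routing layer works.

```python
# Toy model of "the close group caches messages until the client logs in".
class CloseGroup:
    def __init__(self):
        self.cached = {}          # account id -> list of pending messages

    def push(self, account, message):
        """Cache a message for an account that may currently be offline."""
        self.cached.setdefault(account, []).append(message)

    def deliver_on_login(self, account):
        """Hand over and clear any messages cached while the client was away."""
        return self.cached.pop(account, [])
```

A second device logging in to the same account gets nothing here, which is exactly the gap described above: the messages were already delivered and would have to be resent somehow.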

So we are cool with push notification etc., that is easy, but syncing local data may require an app, and it's not that simple. If you have web pages, blogs, messages, call transcripts, call media, files, etc., then any app that creates those would also need to know about this desire to store everything locally.

This probably needs more thought: why store everything locally, everywhere, and sync it?

5 Likes

An app would need to almost resend these messages again to the client group for the next device to login.

Hmm, we shouldn’t rely on every single device’s app working correctly… Couldn’t the push notification just be sent to the relevant account’s inbox rather than to a client? No-one besides the user would have to care if they want per-file syncing, as the notification would just be plonked in the inbox anyway. Unless I’m misunderstanding inboxes? If the goal is for people to store files on their hard drives less and less, then a push-to-client type of push seems kind of redundant except for farming.

Local storage is great for many reasons, and might well go out of fashion. But we should prepare for the inevitability that someone will succeed in restricting the internet in the future. Maybe satellites will be destroyed, maybe a new signal blocking technology will be employed somewhere, or maybe there will be laws that ban wifi in airports. I bet not many people predicted the Russia Telegram war.

1 Like

Yes, but then it becomes: which inbox on which device? It's not so simple really, but I feel that is because it is unsafe. There are ways to achieve this, but it is probably not so much my area, as I am opposed to having all data on all devices; it's a waste to me. I agree ubiquitous connectivity is not here yet, but I don't feel it is far off, otherwise Chromebooks etc. would not be selling.

So the problem here is that each device would need to pick up any file changed from another device in "realish" time. I suspect that will mean tying device ids of some kind to your account, and almost having an inbox per device to sync all this data. If it were only static data then perhaps you could run a daemon to check, a bit like Dropbox does, but that won't cover all your emails etc., I think. It is possible, but I imagine you have lost all the security and privacy at this stage and would be as well using a server-based 3rd-party system. Of course, unless we can find some clever mechanism to sync all data on all devices.

I see it simpler to offline cache data though, like last accessed data and so on. That would just be static data as well mind you but could be helpful. Then messaging clients etc. could if they were designed to, have offline cache capability.

I still would not use that though as it would tie me to a device and I am pretty keen that devices hold absolutely no identifying data on anyone.

[Edit]
All of the above ignores privately shared group data. That gets difficult if you offline edit a doc that 100 online editors are working on. The sync process there is much more like a git conflict resolution.

2 Likes

I think that would be a decent balance for many people, assuming the local cache is also encrypted, etc. Using local storage as a cache seems like a good logical step forward for moving data to remote storage, while retaining decent performance and robustness.

For security critical usage, clearly this has negative trade offs though.

2 Likes

Couldn’t this all be solved by giving each account a decentralised inbox? One place where all devices can regularly check for all file updates and other notifications such as emails. Then it’s up to each device to view the latest notification every second (or however long makes sense given the network’s refresh rate) and act on it independently. Come to think of it, that seems necessary so that you can see when a shared file has been updated when using a public/friend’s computer.

Not sure what you mean here. An inbox per device? Like having different accounts on each device that all share the same data and also store it locally?

A single account’s inbox stored on the network, so not on any device. Like normal network data. So even if all my devices disintegrate, I could connect another one and still see my previous messages. Like computers pulling emails and various other notifications from a centralised server on the classical Internet, except the server would be Safe itself.

Yes, but would you get the latest? Say you have 2 devices with different states that need to sync. If they did so by logging in and getting all the latest stuff, then you do need to differentiate the devices and store something there, or grab a cpuid etc. and tie a login to that somehow.

Maybe this is a bit confused (for me anyway), so what if we bullet-point your idea? I think it is:

  1. Store data locally
  2. Have multiple devices able to be logged into 1 account
  3. As any device alters a file or gets a message etc. it should be synced to all your logged in devices
  4. As a device logs in, it must traverse all data on the network it has locally and sync that
  5. If there is a private share of data, then each device of every member logged in must sync any changes (IMHO any private share should then be considered unsafe, but that is another matter)

I think that is it, but perhaps you could clarify. There are a lot of side effects here if this is the list, like race conditions between devices and security etc. but perhaps we can take them one by one?

Hope this helps us to get an understanding

  1. As any device alters a file or gets a message etc. it should be synced to all your logged in devices

I’ll try to clarify what I mean here as it might be a source of misunderstanding.

I suggest that:

  • If the user alters a file on Device1, then Device1’s Safe app should handle uploading this to the Safe network. The Safe app would be logged in to Account1.
  • The Safe network then distributes the file across the world, and sends Account1 (but not any physical device) a notification message that the file has been altered by Device1.
  • Device1’s Safe app polls the Safe network for Account1’s messages, finds this new message, and then does nothing as it is Device1.
  • Device2’s Safe app (also logged in to Account1) polls the Safe network for Account1’s messages, finds this new message, and proceeds to download the new version of the file.
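The four steps above can be sketched as a single poll loop. The inbox format and the download callback are invented for illustration; nothing here is real Safe API.

```python
def poll_and_sync(inbox, my_device_id, download):
    """Act on account notifications: skip our own uploads, fetch the rest."""
    for note in inbox:
        if note["origin"] == my_device_id:
            continue                      # this device made the change itself
        download(note["file_id"], note["version"])
```

So Device1 and Device2 run the same code; the only difference is that each ignores notifications stamped with its own origin.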

As a device logs in, it must traverse all data on the network it has locally and sync that

This doesn’t seem necessary, so here’s my alternative take:

  • Every time the Safe app uploads a file, it would store the latest upload time in a local file.
  • Every time the Safe app is started, it could ask its local filesystem which files have been edited (locally) since that time. Though this could be turned off by the user if necessary for any reason.
  • Any files that have been edited (hopefully none of them as the app should be running continuously) would then be uploaded to the Safe network.
  • The Safe app would then start polling the network for notifications.
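The startup check described above can be sketched against a local filesystem. The state-file name is hypothetical, and this simply compares modification times against the recorded last upload, as in the second bullet.

```python
import json
import os

STATE_FILE = ".safe_sync_state.json"   # hypothetical local state file

def files_edited_since_last_upload(root):
    """Return paths under root modified since the recorded last upload time."""
    try:
        with open(os.path.join(root, STATE_FILE)) as f:
            last_upload = json.load(f)["last_upload"]
    except FileNotFoundError:
        last_upload = 0.0              # never uploaded: everything is stale
    stale = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            if name == STATE_FILE:
                continue               # don't sync our own bookkeeping file
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_upload:
                stale.append(path)
    return stale
```

Each path this returns would be uploaded, after which the state file's timestamp is refreshed.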

If there is a private share of data, then each device of every member logged in must sync any changes

There would be no requirement for this, but it would make things easier for the individual members. If they turned off automatic message polling, they could still view the updates by clicking a manual ‘message poll’ button, or by visiting the file in a web browser or something.

Race conditions look like they could be complicated, but hopefully that wouldn't be too glaring an issue for normal usage. Giving each message an ID number so they can be ordered would help.

So the client managers would need to know all the devices and when they all say they have read the message then it can be purged. Each device would somehow mark that message as read.

  • We would need to make sure then that when you lose a device that you have a way of registering the loss, so the list does not go on forever.

Yes, I understand this part, but when you start an app it will need all those changes, and any existing logged-in devices would need them posted to them as well. It could be a lot of traffic, but doable.

Race conditions are hard, but this is also hard: it would need to be a device id and number pair, otherwise the numbers could collide between devices. It does get very tricky. Then other devices need to be able to know when you add devices (it could help security to only allow login from certain devices). But if each device has the same auth and you leave one on, then you're scuppered if a person changes your password and you are not near the device to stop them. But I digress. You could use a PIN that removes the device after 3 failed attempts etc., but it gets to be more work fast down this road.
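The "device id and number pair" idea can be made concrete: each device stamps its messages with its own counter, and sorting by (counter, device id) gives a total order with deterministic tie-breaking, so counters from different devices cannot collide. A minimal sketch, with invented field names:

```python
def message_key(msg):
    """Order by per-device counter first, break ties by device id."""
    return (msg["counter"], msg["device_id"])

def ordered(messages):
    """Return messages in a total order every device agrees on."""
    return sorted(messages, key=message_key)
```

This is essentially a Lamport-style ordering: it does not resolve conflicting edits, but it guarantees every device replays the same message sequence.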

For sync you can start down long roads, like giving every directory a hash and only traversing (syncing) those with hashes different from the ones you have at that root, etc. But all this is to put your files on a device, and that part is really hard for me to understand; it is the antithesis of Safe to have multiple copies, in my opinion. I am not sure I am looking at your proposals without bias because of this, but I am trying. I am tired, but trying to help; none of this will be anywhere near as simple as you imagine, though. I am pretty certain of that. No need to stop thinking and fixing that though.
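The directory-hash idea is essentially Merkle-style hashing: equal root hashes mean there is nothing to sync, and a differing hash narrows traversal to the subtrees that actually changed. A toy sketch over an in-memory tree, purely for illustration:

```python
import hashlib

def dir_hash(tree):
    """tree maps names to file bytes or nested dicts; returns a hex digest."""
    h = hashlib.sha256()
    for name in sorted(tree):              # sorted for a deterministic hash
        node = tree[name]
        if isinstance(node, dict):
            child = dir_hash(node)         # recurse into subdirectory
        else:
            child = hashlib.sha256(node).hexdigest()
        h.update(name.encode() + child.encode())
    return h.hexdigest()
```

Changing any file anywhere changes the root hash, so a device only descends into directories whose hashes differ from its own.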

So the client managers would need to know all the devices and when they all say they have read the message then it can be purged.

That sounds like a good way to do it. Another way of purging would be to tell any/each app to delete Account1’s notifications 100 days after reading them. It wouldn’t need any awareness of other devices for this.

Although, is it even necessary to automatically delete old notifications by default? They seem small and harmless, as long as notifications can be accessed by most recent first.

when you start an app it will need all those changes, but also any existing logged in devices would need them posted to it as well.

Unless Device1’s app didn’t finish uploading files edited/created on Device1 before being terminated, there would be no changes. Device2 would only be affected by Device1 coming online if the user had been using Device1 for offline editing. No more traffic than downloading the new files any other way.

I have no real ideas about the race condition problem. I assume Safe will have a brilliant mechanism to sort it out, since it will become a problem even without syncing apps; someone will try to update mutable data from two devices at once at some point.

If Safe accounts have decentralised notification trays that upload apps (and other apps, websites, etc) can be permitted to plonk notifications in, then I’m inclined to think that race conditions become a problem solely for upload apps to deal with rather than the network as a whole. I’ll give that more thought though…

For sync you can start down long roads like give every directory a hash and only traverse (sync) those with different hashes that you have at that root etc.

That could be useful occasionally, to make sure the app hasn’t malfunctioned and failed to update a file for some reason. But I can’t see it being necessary very often.

I might try to condense/clarify my suggestion to make it more readable. I’m pretty sure someone will find a way to make it work. If the infrastructure (just the account notification tray) is included in the network release, a lot of things will be possible that maybe weren’t before.

I appreciate you taking the time to explain it all in detail. Every discovered problem with the idea is a solution waiting to be found (​:

1 Like

The only thing to consider here is that SAFE knows nothing about time, and so far that has been critical, so that timer is dubious. It is not natural to have a stopwatch; we can do better.

Agreed. However as discussed it will probably change from ‘constantly checking’ to ‘constantly waiting’ via a pubsub type service. Polling is not a sustainable model for this network.

I think a popular way to do messaging / editing on the SAFE network will be chains of operational transformations. This means editors “operate on their local copies in a lock-free, non-blocking manner, and the changes are then propagated to the rest of the clients; this ensures the client high responsiveness in an otherwise high-latency environment such as the Internet”

My idea of what would happen is 1) an initial sync of new transformations and then 2) the client opens a subscription to be notified of future changes (or starts polling if pubsub doesn’t exist). This only happens for files in use, since the phone is essentially a ‘view’ onto the network; the files on the phone are not the data, just a starting point for rendering the view. No point updating views that aren’t being viewed, so I would generally think not all 400,000 files need updating.
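The quoted lock-free editing style can be illustrated with the classic textbook case of operational transformation: two concurrent character inserts. This is a deliberately minimal sketch; real OT systems also handle deletes, sessions and intention preservation.

```python
def transform_insert(pos_a, pos_b):
    """Shift insert A's position to account for a concurrent insert B."""
    return pos_a + 1 if pos_b <= pos_a else pos_a

def apply_insert(text, pos, ch):
    """Apply a single-character insert operation to a string."""
    return text[:pos] + ch + text[pos:]
```

Whichever order the two clients' operations arrive in, transforming the later one against the earlier one makes both replicas converge to the same text.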

Sure, not time-in-seconds, but it does know about the sequence of datachain events, which are a progression of events (aka time). I think having a pubsub timeout as a function of elapsed datachain events would work OK.


A bit of harebrained conceptualising on messaging:

The source>destination paradigm of messages is currently focused on client>network (and back again) but should also be able to do network>client. I think of websockets and longpolling on the current web… prior to that a server could not push data, only the client could pull. That’s fine, but not general enough for the messaging requirements of the SAFE network.

And furthermore, messages should be able to go file>file, i.e. a successful mutation to MD1 should be able to then be used as an input to a potential mutation of MD2. Currently this requires a client to poll MD1 for changes and then manually mutate MD2, but it would be great if MD2 could itself subscribe to changes in MD1, with the network messaging MD2 directly when MD1 is changed, no client needed. I.e. MD2 becomes a client subscribed to MD1.

In other words, inputs for mutations can currently only be signatures. That’s fine. But eventually having other mutations as inputs would be interesting and lead to a design of messaging that is more fully generalised.

Everything is an xorname. Thus everything should be able to ‘be a file’ and ‘be a client’ and ‘be a message’. There should not be too much distinction between these things. :face_with_raised_eyebrow:

2 Likes