The title is the TL;DR. More info about what I'm trying to do below.
My daily driver computer is Laptop, which has an SSD with no possibility to expand.
So for storage of lots and lots of files, I have an old, low-resource Desktop with a bunch of HDDs plugged in (mostly via USB).
I can access Desktop files via SSH/SFTP on the LAN. But it can be quite slow.
And sometimes (not too often; this isn’t a main requirement) I take Laptop to use elsewhere. I do not plan to make Desktop available outside the network so I need to have a copy of required files on Laptop.
Therefore, sometimes I like to move the remote files from Desktop to Laptop to work on them, to make a sort of local cache. This could be individual files or directory trees.
But then I have a mess of duplication, and sometimes I forget to put the files back.
Seems like Laptop could be a lot more clever than I am and help with this. Like, could it always fetch a remote file which is being edited and save it locally?
Is there any way to have Laptop fetch files, information about file trees, etc, located on Desktop when needed and smartly put them back after editing?
Or even keep some stuff around, like lists of files, attributes, thumbnails, etc. Even browsing the directory tree on Desktop can be slow sometimes.
I am not sure what this would be called.
Ideas and tools I am already comfortable with:

- rsync is the most obvious foundation to work from, but I am not sure exactly what the best configuration would be or how to manage it.
- luckyBackup is my favorite rsync GUI front end; it lets you save profiles, jobs, etc., which is sweet.
- FreeFileSync is another GUI front end I've used, but I prefer lucky/rsync these days.
- I don't think git is a viable solution here: there are already git directories included, there are many non-text files, and some of the directory trees are so large that they would cause git to choke scanning all the files.
- Syncthing might work. I've been having issues with it lately, but I may have gotten those ironed out.
Something a little more transparent than the above would be cool, but I am not sure if that exists?
Any help appreciated, even just an idea of what to web search for, because I am stumped even on that.
Easiest for this might be Nextcloud. Import all the files into it, then you can get the Nextcloud client to download or cache the files you plan on needing with you.
Hmm, interesting idea. I don't get the impression that Nextcloud is reliably "easy", as it's kind of a joke how complex it can be.
Someone else suggested WebDAV, which I believe is the file-sharing protocol Nextcloud uses. Does Nextcloud add anything relevant beyond what's available from plain WebDAV?
I'd say mostly that the client is fairly good and works about the way people expect it to.
It sounds very much like a Dropbox/Google Drive kind of use case, and from a user perspective it does exactly that, and it's not Linux-specific either. I use mine to share my KeePass database, among other things. The app is available on just about any platform as well.
Yeah, Nextcloud is a joke in how complex it is, but you can hide it all away using their all-in-one Docker/Podman container. Still much easier than getting into bcachefs over usbip and other things I've seen in this thread.
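For reference, the all-in-one container boils down to a single (long) `docker run` command. This is roughly its shape as I recall it from Nextcloud's AIO README; ports and volume names may have changed, so verify against the current docs before copying:

```shell
# Hedged sketch of starting Nextcloud All-in-One -- check the current AIO README
# before use, as the published ports and volume names may differ.
docker run -d \
  --name nextcloud-aio-mastercontainer \
  --restart always \
  -p 80:80 -p 8080:8080 -p 8443:8443 \
  -v nextcloud_aio_mastercontainer:/mnt/docker-aio-config \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  nextcloud/all-in-one:latest
# Then open https://<host>:8080 to finish setup in the AIO web interface.
```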
Ultimately I don't think there are many tools that can handle caching, downloads, going offline, and reconciling differences when back online, all in a friendly package. I looked around: there's a page on Oracle's website about a CacheFS, but that might be enterprise-only, and there's catfs in Rust, but it's alpha and can't work without the backing filesystem for metadata.
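Since the original poster already reaches Desktop over SSH/SFTP, the catfs idea would layer a local disk cache on top of an sshfs mount. A hedged sketch only; the hostname and paths are placeholders, and the catfs argument order is from memory of its README, so check `catfs --help` and the sshfs man page before relying on it:

```shell
# Hedged sketch: mount Desktop over SFTP, then put a local disk cache in front.
# "desktop", the paths, and the catfs argument order are assumptions.
mkdir -p ~/mnt/remote ~/mnt/cached ~/.cache/catfs
sshfs desktop:/data ~/mnt/remote                  # plain SFTP mount (slow, as noted)
catfs ~/mnt/remote ~/.cache/catfs ~/mnt/cached    # reads/writes go via the local cache
# Work in ~/mnt/cached; when done, unmount both layers:
#   fusermount -u ~/mnt/cached && fusermount -u ~/mnt/remote
```

Note this still needs the network up for anything not already cached, so it solves the "slow" problem more than the "offline" one.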