Title is TLDR. More info about what I’m trying to do below.

My daily driver computer is Laptop, which has an SSD and no possibility of expansion.

So for storing lots and lots of files, I have an old, low-resource Desktop with a bunch of HDDs plugged in (mostly via USB).

I can access Desktop files via SSH/SFTP on the LAN. But it can be quite slow.

And sometimes (not too often; this isn’t a main requirement) I take Laptop elsewhere. I do not plan to make Desktop available outside the network, so I need to have a copy of any required files on Laptop.

Therefore, I sometimes move files from Desktop to Laptop to work on them, as a sort of local cache. This could be individual files or whole directory trees.

But then I have a mess of duplication, and sometimes I forget to put the files back.

It seems like Laptop could be a lot cleverer than I am and help with this. For example, could it automatically fetch a remote file whenever one is being edited and save it locally?

Is there any way to have Laptop fetch files, information about file trees, etc., from Desktop when needed and smartly put them back after editing?

Or even keep some things around locally: lists of files, attributes, thumbnails, etc. Even just browsing the directory tree on Desktop can be slow sometimes.
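To give an idea of what I mean by the metadata part, here’s a rough sketch of the kind of thing I could hack together: snapshot Desktop’s listing so I can grep it locally instead of waiting on SFTP. (The `desktop` SSH alias and the `storage` path are made up, and it assumes GNU find on Desktop.)

```python
#!/usr/bin/env python3
"""Snapshot Desktop's file tree so Laptop can browse/search it offline."""
import subprocess
from pathlib import Path

# Where the local copy of the listing lives on Laptop.
INDEX = Path.home() / ".cache" / "desktop-index.txt"

# GNU find on Desktop prints path, size, and mtime for everything under
# ~/storage. The format string is quoted so the remote shell hands it to
# find intact rather than mangling the backslash escapes.
FIND_CMD = r"find storage -printf '%p\t%s\t%TY-%Tm-%Td %TH:%TM\n'"

def refresh_index() -> None:
    listing = subprocess.run(
        ["ssh", "desktop", FIND_CMD],
        check=True, capture_output=True, text=True,
    ).stdout
    INDEX.parent.mkdir(parents=True, exist_ok=True)
    INDEX.write_text(listing)

if __name__ == "__main__":
    refresh_index()
    print(f"wrote {INDEX}")
```

But obviously I’d rather use something that already exists and does this properly.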

I am not sure what this would be called.

Ideas and tools I am already comfortable with:

  • rsync is the most obvious foundation to work from, but I am not sure what the best configuration would be or how to manage it (see the sketch after this list for the kind of workflow I mean).

  • luckybackup is my favorite rsync GUI front end; it lets you save profiles, jobs, etc., which is sweet.

  • FreeFileSync is another sync GUI I’ve used, but I prefer luckybackup/rsync these days.

  • I don’t think git is a viable solution here: the trees already contain git directories, there are many non-text files, and some of the directory trees are so large that git would choke scanning all the files.

  • syncthing might work. I’ve been having issues with it lately, but I may have gotten them ironed out.
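To make the rsync idea concrete, here’s roughly the checkout/check-in wrapper I’m picturing. Just a sketch: the `desktop:` host alias, the cache location, and the manifest bookkeeping are all invented, and it assumes remote paths are given relative to Desktop’s home directory.

```python
#!/usr/bin/env python3
"""Sketch of a "checkout / check in" local cache built on rsync."""
import json
import subprocess
from pathlib import Path

REMOTE = "desktop:"                     # assumed SSH host alias for Desktop
CACHE = Path.home() / "desktop-cache"   # mirror of Desktop paths on Laptop
MANIFEST = CACHE / ".checked-out.json"  # remembers what needs pushing back

def _manifest() -> list[str]:
    return json.loads(MANIFEST.read_text()) if MANIFEST.exists() else []

def checkout(remote_path: str) -> None:
    """Pull a file or directory tree from Desktop into the local cache."""
    dest_parent = (CACHE / remote_path).parent
    dest_parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["rsync", "-a", "--partial", f"{REMOTE}{remote_path}", f"{dest_parent}/"],
        check=True,
    )
    paths = _manifest()
    if remote_path not in paths:
        paths.append(remote_path)
    MANIFEST.write_text(json.dumps(paths, indent=2))

def checkin_all() -> None:
    """Push every checked-out path back to Desktop, then clear the manifest.

    -u skips anything Desktop has modified more recently, so at least I
    won't clobber newer remote copies when I forget what I touched.
    """
    for remote_path in _manifest():
        subprocess.run(
            ["rsync", "-au", str(CACHE / remote_path),
             f"{REMOTE}{Path(remote_path).parent}/"],
            check=True,
        )
    MANIFEST.write_text("[]")
```

So checkout("projects/bigthing") before leaving the LAN, and checkin_all() when I’m back on it.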

Something a little more transparent than the above would be cool, but I am not sure if that exists?

Any help appreciated, even just ideas on what to search the web for, because I am stumped even on that.

7 points

Easiest for this might be Nextcloud. Import all the files into it, then the Nextcloud client can download or cache the files you plan on needing with you.

3 points

Hmm, interesting idea. I don’t get the impression that Nextcloud is reliably “easy”; it’s kind of a joke how complex it can be.

Someone else suggested WebDAV, which I believe is the file-sharing protocol Nextcloud uses. Does Nextcloud add anything relevant beyond what plain WebDAV already provides?
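For context on what I mean by “just WebDAV”: as I understand it, it’s plain HTTP underneath, so listing a folder is a single PROPFIND request. A sketch with a made-up server and credentials (Nextcloud’s files endpoint lives under `remote.php/dav/files/<user>/`, I believe):

```python
import xml.etree.ElementTree as ET

import requests  # third-party: pip install requests

# Made-up endpoint and credentials; substitute your own.
URL = "https://cloud.example.com/remote.php/dav/files/me/Documents/"
AUTH = ("me", "app-password")

# PROPFIND with Depth: 1 returns the folder plus its immediate children.
resp = requests.request("PROPFIND", URL, auth=AUTH, headers={"Depth": "1"})
resp.raise_for_status()

# The reply is a 207 Multi-Status XML document; each entry is a d:response.
ns = {"d": "DAV:"}
for item in ET.fromstring(resp.content).findall("d:response", ns):
    print(item.find("d:href", ns).text)
```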


Nextcloud AIO in Docker is dead simple and has been reliable for me.

The sync client can sync the whole tree as remote placeholders that take up no space until you access a file, at which point it downloads it locally. You can also set files to always be kept local.

3 points

Nextcloud has a desktop app that can sync folders.

5 points

I’d say mostly because the client is fairly good and works about the way people expect it to.

It sounds very much like a Dropbox/Google Drive kind of use case, and from a user perspective it does exactly that; it’s not Linux-specific either. I use mine to share my KeePass database, among other things, and the app is available on just about any platform.

Yeah, Nextcloud is a joke in how complex it is, but you can hide it all away using their all-in-one Docker/Podman container. Still much easier than getting into bcachefs over usbip and the other things I’ve seen in this thread.

Ultimately I don’t think there are many tools that can handle caching, downloads, going offline, and reconciling differences once back online, all in a friendly package. I looked around: there’s a page on Oracle’s website about CacheFS, but that might be enterprise-only; there’s catfs in Rust, but it’s alpha and can’t work without the backing filesystem for metadata.

