Before I forget, here’s the server flow for my 64 TB server. It supports anime and all the works; I never could get manga working well enough with torrent/usenet on Ubuntu, though.

A short list:

  • Anime/Tv/Movies
  • Switch games management (kinda)
  • Cross Seed
  • Unpackerr
  • Radarr + Sonarr Queue Cleanup
  • Trakt Sync
  • Trakt List to add sonarr item (bad practice but whatever)
  • Comics (weekly bundles are OP)
  • My shitty cronjobs

Make suggestions on improvements. I probably won’t be using this exact setup on my next server, but something similar.

For adding content I used Ombi and the Plex watchlist sync feature for the people leeching off my Plex; it worked well enough. For management on the go I used the LunaSea app (great fucking app, go get it now, it’s free).

I didn’t do music because I have Tidal with Plex and that’s more than fine; Lidarr sucked too much for the artists I like, and my attempts at streamrip automation failed all the time.

Cronjob abuse is my friend

Forgot to mention: this also supports auto-uploading content (with filters) on a cronjob.
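For reference, a filtered auto-upload cronjob like that can be a single crontab line. The script name, path, and `--min-size` flag here are all hypothetical; substitute whatever your upload script actually takes:

```shell
# Hypothetical crontab entry: run an upload script nightly at 03:00 with a
# size filter, appending output to a log. Paths and flags are made up.
0 3 * * * /home/user/scripts/auto-upload.sh --min-size 100M >> /var/log/auto-upload.log 2>&1
```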

5 points

I would really like to automate my workflow and organize my library, but I like to seed things forever. How do you automatically retrieve metadata to reorganize folders and filenames, while still being able to seed? Is creating a second copy of the files the only way, or is there something I’m missing?

7 points

Basically, yes. The *arr apps find releases and, when a download finishes, make a copy with proper naming and metadata. On its own that would be wasteful, since it would double the size of everything, so instead you use hard links. A hard link looks like a shortcut, but the “shortcut” and the original are the same thing: both are names pointing at the same data on disk, and they’re indistinguishable from each other. If you delete one, the data remains because the other link still points to it; only when you delete both does the data actually get freed. They’re essentially free copies. You just have to make sure your filesystem supports them.
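A quick demo of the above (filenames are made up; `ln` without `-s` creates a hard link):

```shell
#!/bin/sh
# Sketch: hard links share one inode, so the "copy" costs no extra space.
set -e
demo=$(mktemp -d)
echo "release data" > "$demo/original.mkv"
mkdir "$demo/library"

# Hard link, not a copy: a second name for the same data on disk.
ln "$demo/original.mkv" "$demo/library/show.mkv"

# Both names resolve to the same inode number:
ls -i "$demo/original.mkv" "$demo/library/show.mkv"

# Deleting one name leaves the data reachable through the other:
rm "$demo/original.mkv"
cat "$demo/library/show.mkv"    # prints: release data
rm -rf "$demo"
```

This is exactly what the *arr apps do when hardlinking is enabled: the torrent client keeps seeding its name while the library gets a properly renamed one, at zero extra disk cost.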

2 points

I’ve started reading the guide on the subject. Now my problem is that my library is split across different ZFS datasets, and I suspect hard links won’t work across them, so I’ll have to rethink how I organize my filesystem.
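That suspicion is right: each ZFS dataset is its own filesystem even within one pool, so `ln` across datasets fails with EXDEV (“Invalid cross-device link”). A minimal check, assuming GNU `stat`, is to compare device IDs before trusting hard links between two paths:

```shell
#!/bin/sh
# Sketch: two paths support hard links between them only if they share
# a device ID, i.e. live on the same filesystem. Each ZFS dataset has
# its own device ID, so this returns false across datasets.
same_fs() {
    [ "$(stat -c %d "$1")" = "$(stat -c %d "$2")" ]
}

# Hypothetical dataset mountpoints:
#   same_fs /pool/downloads /pool/media || echo "ln would fail with EXDEV"
same_fs /tmp /tmp && echo "same filesystem: hard links OK"
```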

1 point

Yep, hard links only work within a single filesystem. You can have multiple drives in RAID forming one filesystem and use hard links freely anywhere within the array.

1 point

Couldn’t have said it much better myself. I think of hard links like backend and frontend development offices.

The backend team has the data that frontend teams 1 and 2 use, but frontend team 3 isn’t in the same office (filesystem) as the backend team, so it can’t access the data.

If frontend team 2 goes away, frontend team 1 still has access to the backend’s data.

And if frontend teams 1 and 2 both go away, fuck it, we bulldoze the backend and build a new one for whatever frontend teams come next.


Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ

!piracy@lemmy.dbzer0.com
