I currently use an Ubuntu distro (Mint) with Plex installed serving media to my local network. I installed docker last night so I can test out some photo hosting services.
Is there a benefit to running Plex through docker vs the traditional method?
Just out of curiosity, do you backup all the generated metadata too? I backup my Plex folder from my SSD to my unraid array weekly, and it takes quite a long time to get through.
Yep, everything attached to Plex gets backed up (excluding media, which is handled elsewhere). There are tens of thousands of tiny files, which is why it takes so long; I do recommend tarring it first. The archive format must support symlinks, which is why I chose tar.
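For anyone curious, the tar step might look something like this (a minimal sketch; the paths are assumptions, though `/var/lib/plexmediaserver/...` is the default app-data location on Ubuntu-based installs):

```shell
#!/bin/sh
# Sketch of a Plex metadata backup. Paths below are assumptions
# for illustration, not exact ones from the thread.
SRC="/var/lib/plexmediaserver/Library/Application Support/Plex Media Server"
DEST="/mnt/backup/plex-$(date +%F).tar.gz"

# tar stores symlinks as symlinks by default (as long as you don't
# pass -h/--dereference), which is the point of choosing tar here.
# -z gzips the archive, which helps with tens of thousands of tiny files.
tar -czf "$DEST" -C "$(dirname "$SRC")" "$(basename "$SRC")"
```

The key detail is leaving `-h` off so symlinks inside the Plex folder survive the round trip.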
I do tar it first (actually I tar my entire docker share), but it still takes longer than I would like.
It's just the way my backup system works: all my docker containers go down while I'm backing them up, so it's a minor inconvenience (even though it only happens off-peak, in the middle of the night).
Really not a big problem, just thought I’d check if you had a better solution.
Yeah, that’s because every little file and image needs to be stored, and there are tens of thousands of them. Nothing wrong with it, it just takes time. I usually back up my Plex DB on a weekly cron in the middle of the night.
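A weekly middle-of-the-night cron entry like the one described might look roughly like this (the script name and log path are assumptions, just to show the schedule syntax):

```shell
# Example crontab entry (edit with `crontab -e`): every Sunday at 03:30,
# run the backup script and append its output to a log.
# /usr/local/bin/plex-backup.sh is a hypothetical script path.
30 3 * * 0 /usr/local/bin/plex-backup.sh >> /var/log/plex-backup.log 2>&1
```

The five fields are minute, hour, day-of-month, month, and day-of-week (0 = Sunday).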
You might want to look at how you’re backing up. Something like BorgBackup, or even a decent rsync script, would speed this up by transferring only the differences instead of doing a full backup every time. Unless you’re adding and deleting media daily, a weekly Plex backup really shouldn’t take long.