Welp, I’m ready to start writing my backup script!
I ended up using Python, mostly because I feel more comfortable with it. I considered Bash, and perhaps in the future I will convert the process to Bash, but as I said, I am more comfortable in Python and, at least for now, think Python is the better fit. We shall see. Bash may be more ubiquitous, but I’m not planning on ever running this on any exotic hardware, so Python is a reasonable requirement.
At first, I had one main monolithic script that interacted with every application. This was “easier” because it was a single file to edit, but it got very busy very fast. It meant that if I ever changed the containers I had, I would have to scrub through and remove all references to that container. Additionally, I eventually want to do rotational backups (daily, weekly, monthly, etc.), and the thought of coding all that logic into this main script became a bit overwhelming. I ended up breaking it up so that one backup script handles one application’s requirements. Each of these individual backup scripts is called one by one from a main script. Currently that main script is just a Bash script that calls each Python file like so:
#!/bin/bash
/docker/authentik/backup.py
/docker/calibre-web/backup.py
/docker/collabora-code/backup.py
.
.
.
This allows for flexibility with each application’s backup needs. It means that each application’s backup logic is self-contained and not intertwined with any other application, so if I remove a container, all I have to do is remove the call in the main script, and it won’t get backed up. Hopefully, this will scale well and be more maintainable. Only time will tell.
Within each of these backup scripts I have logic for backing up the database (if applicable), deleting any database backups older than 30 days, and encrypting the .env files. Since these tasks are standard across the containers and the logic is the same, I created a Python pseudo-module that contains all this logic and is simply imported in each script. I couldn’t figure out exactly how to make it a proper installable module with a clean import, so I end up modifying Python’s module search path (sys.path) first with this code:
#!/usr/bin/python3
# /docker/authentik/backup.py
import sys
sys.path.append('/docker/backup')
import backup_utils as bu
Then in /docker/backup I have a file called backup_utils.py that contains all the common functions. I will eventually push all my Docker config files to my GitHub, so you can view them there.
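To give a sense of what lives in backup_utils.py, here is a minimal sketch of the common helpers. The function names, file pattern, and Postgres/gpg details are my own placeholders for illustration, not necessarily what the final file looks like:

```python
#!/usr/bin/python3
# /docker/backup/backup_utils.py -- sketch only; names and patterns are placeholders
import subprocess
import time
from pathlib import Path


def backup_database(container, db_user, out_path):
    """Dump a Postgres database from inside a container to out_path.

    Assumes the container ships pg_dump; swap the command for other DBs.
    """
    with open(out_path, "wb") as f:
        subprocess.run(
            ["docker", "exec", container, "pg_dump", "-U", db_user],
            stdout=f,
            check=True,
        )


def delete_old_backups(backup_dir, max_age_days=30, pattern="*.sql"):
    """Remove backup files older than max_age_days; return the deleted names."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for f in sorted(Path(backup_dir).glob(pattern)):
        if f.stat().st_mtime < cutoff:
            f.unlink()
            removed.append(f.name)
    return removed


def encrypt_file(path, passphrase_file):
    """Symmetrically encrypt a file (e.g. a .env) to path.gpg with gpg."""
    subprocess.run(
        ["gpg", "--batch", "--yes", "--symmetric",
         "--passphrase-file", passphrase_file, path],
        check=True,
    )
```

With something like this in place, each per-application backup.py reduces to a few calls into the shared module, which is the whole point of splitting things up.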
It was also at this time that I started using git. I had used it before, mostly through VSCode, never really through the command line. I won’t go into detail here since all I did was basics like git init, git status, git branch, git checkout {branch}, git add {file}, and of course git commit -m {message}.
When I initially set about doing this project I thought I could create TrueNAS snapshots via the API. While that is true (sorta; they use WebSockets, which is a whole different thing than I was expecting), these manually created snapshots are not part of any automated lifecycle tasks and so would have to be manually deleted (through either the GUI or API). I decided that was too much complexity, so instead I will simply schedule the script to run before the snapshots, so that the databases are as consistent with the filesystem as possible.
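Scheduling that ordering is just a cron entry. The times and the main-script path here are made up for illustration; the only real requirement is that the backup run finishes before the snapshot task starts:

```shell
# Hypothetical root crontab entry: run the backup driver at 01:00,
# assuming the TrueNAS snapshot task fires at 02:00.
# (/docker/backup/main.sh is a placeholder name for the main Bash script.)
0 1 * * * /docker/backup/main.sh
```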
With all that said, stage 2 of my backup plan is complete! Next on the list is actually storing all these backups well.