In the past few weeks, I have been using restic to back up my data.
Restic is a fast, secure, and efficient backup program that supports a variety of backends, including local storage, S3, and Google Cloud Storage.
As a home lab person and the owner of a Synology NAS (unfortunately, but that is a story for another post), I got frustrated with its lack of a good backup solution (Hyper Backup, I’m looking at you).
Of course, not being able to see how the backups are performed and whether they succeed is a big no-no for me. So, I decided to use Apache Airflow to schedule, orchestrate, and monitor the backups.
I will write about how I set up Airflow another time, but for now let me share the initial project that I built: restic-airflow.
The operators are very simple: they rely on Airflow’s DockerOperator to run the restic commands (backup, check, forget, prune, etc.). This approach makes the project more portable and easier to run in different environments.
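To sketch the idea, the snippet below assembles the restic command lines that tasks like these would hand to a DockerOperator running a restic image. The repository path, source path, tag, and retention policy here are illustrative assumptions, not the project’s actual configuration.

```python
# Sketch: build the restic CLI invocations that each Airflow task would pass
# as the `command` of a DockerOperator running a restic container image.
# The repository path, backup source, and retention values are made up.

def restic_command(action: str, repository: str, *args: str) -> list[str]:
    """Assemble a restic invocation against a given repository."""
    return ["restic", "--repo", repository, action, *args]

backup = restic_command("backup", "/repo", "/data", "--tag", "airflow")
forget = restic_command("forget", "/repo", "--keep-daily", "7", "--prune")
check = restic_command("check", "/repo")
```

In the DAG, each of these lists would become the `command` argument of a DockerOperator, with the repository and source directories mounted as volumes and the repository password provided through the `RESTIC_PASSWORD` environment variable.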
I hope this can be as useful to you as it has been for me. If you have any questions or suggestions, reach out through the contacts on my About page!
Stay tuned for more updates!