The other day I was going to double-check some information about the backup tool I use (zBackup) when I noticed it was not being updated anymore. It still works fine, but I prefer an actively maintained tool to avoid unpatched bugs. Searching for alternatives, I ended up with two candidates: Borg and Restic.
Both are deduplicating tools, support encryption and compression, and include a check command to make sure the backup is intact. From what I saw, Restic is better at connecting to different storage services while Borg is better with local backups, so I chose the latter.
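The check command mentioned above is built into Borg. A minimal sketch (the repository path is a placeholder):

```shell
# Check repository metadata and archive consistency
borg check /path/to/repo

# Optionally also verify the integrity of the file contents themselves
# (slower: reads and decompresses every chunk)
borg check --verify-data /path/to/repo
```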
So far, so good. But my first question was: which compression algorithm should I use? With zBackup I didn't have many options: either LZMA or LZO, and my choice was LZMA. Borg, however, offers not only other algorithms but also a choice of compression level, and I got stuck on this. I couldn't just guess a configuration when I need to back up hundreds of GBs every week. After a quick search, I didn't find any result that would satisfy my inner nerd during vacation: I had to test them myself! Not all of them, of course. I have other things to do during my vacation, but I wanted to run a few personal tests.
For the test data, I just picked one of my folders. One that was not too big, but big enough and with different file types to make a realistic sample. The raw folder had about 16.5 GB. I started by creating a simple backup with zBackup as a reference, and here are the approximate results:
Compact time | 40 min
Test time    | 10 min
Size         | 9.4 GB
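A sketch of what the zBackup reference run looks like (paths are placeholders; since zBackup works on streams, the folder is piped through tar):

```shell
# Create a (non-encrypted) zBackup repository
zbackup init --non-encrypted /path/to/zbackup-repo

# Back up the folder: zBackup reads a stream, so pipe a tar of it
tar -c /path/to/data | zbackup backup /path/to/zbackup-repo/backups/data.tar

# "Test" it: restore the stream and discard it to verify it reads back
zbackup restore /path/to/zbackup-repo/backups/data.tar > /dev/null
```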
Having that reference, I wanted to see how a fast-archiving option would perform, so I ran Borg with auto,lz4, which compresses only the files it detects as compressible, and uses LZ4 when it does:
Compact time | 13 min
Test time    | 6 min
Size         | 11 GB
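That run corresponds to something like the following (repository path and archive name are placeholders; the encryption mode is my assumption):

```shell
# Initialize the Borg repository once
borg init --encryption=repokey /path/to/borg-repo

# Create an archive; auto,lz4 skips incompressible files and
# uses LZ4 for the rest
borg create --stats --compression auto,lz4 \
    /path/to/borg-repo::test-lz4 /path/to/data
```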
Interesting results: way faster than the original tool, but using a little more disk space. Next test: auto,zlib,6:
Compact time | 13 min
Test time    | 5 min
Size         | 11 GB
Almost the same result as before. So I tried removing the auto option, making Borg compress every single file regardless:
Compact time | 18 min
Test time    | 6 min
Size         | 11 GB
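Dropping the auto prefix just means passing the algorithm (and level) directly; Borg's compression spec is comma-separated, so zlib at level 6 is written zlib,6 (repo path and archive name are placeholders):

```shell
# Compress every file with zlib level 6, no compressibility heuristic
borg create --stats --compression zlib,6 \
    /path/to/borg-repo::test-zlib6 /path/to/data
```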
As expected, the compact time was considerably longer, and the size was basically the same. My next test was to increase the compression level:
Compact time | 21 min
Test time    | 7 min
Size         | 11 GB
Still no big surprises here. I thought I might get a very small reduction in backup size, but that's ok. Next test: a different algorithm, zstd at level 9:
Compact time | 15 min
Test time    | 7 min
Size         | 9.9 GB
Now we are talking! A nice compact time with a smaller backup size. To improve it further, I ran that again, this time adding the auto option back:
Compact time | 15 min
Test time    | 3 min
Size         | 10 GB
Something I also tested, in a separate batch, was the backup time after a simple update to the file structure. I'll summarize it here to keep this post from getting too long: not compressing incompressible files had a HUGE impact on the subsequent backup sets, which is what made me stick with the auto configuration in the end.
And this was my final configuration: auto,zstd,9. A much faster run than my previous backup tool, losing only a little bit of compression, which really made me happy at the end of the day. :)
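Putting it all together, the winning configuration boils down to a single flag (paths and archive names are placeholders; the date placeholder uses Borg's built-in {now} formatting):

```shell
# Weekly backup with the chosen settings: compress only compressible
# files, with zstd at level 9
borg create --stats --compression auto,zstd,9 \
    /path/to/borg-repo::backup-{now:%Y-%m-%d} /path/to/data

# Verify the backup afterwards
borg check /path/to/borg-repo
```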