I'm on the hunt for a Unix archiving tool that can handle file-level backups and generate a single, browsable compressed archive. I'm not looking for abstract backup solutions like Kopia. On Linux, I've come across a few options, namely 7-Zip, TAR with XZ or Zstd, and DAR. What intrigues me about DAR is its cataloging feature, which can export metadata separately from the archive, allowing for the browsing and extraction of individual files without needing to decompress everything. It does compress files individually, which could lower the compression ratio but increase resilience. Given that DAR has been around for decades and is included in many Linux distributions, I'm curious if anyone has practical experience using it in production or has any thoughts about its reliability.
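For anyone unfamiliar with the workflow being described, here is a minimal sketch of DAR's create/isolate-catalog/extract cycle. The paths are illustrative, the flags (`-c` create, `-C` isolate a catalog, `-A` reference archive, `-l` list, `-x`/`-g` extract a single entry, `-Q` non-interactive) are from the dar man page, and the script skips quietly if dar isn't installed:

```shell
# Sketch of the DAR catalog workflow (paths are illustrative).
# Skip gracefully on systems without dar installed.
command -v dar >/dev/null 2>&1 || { echo "dar not installed; skipping"; exit 0; }
set -e
mkdir -p /tmp/dar_demo/src
echo "hello" > /tmp/dar_demo/src/note.txt

# Create an archive; DAR compresses each file individually (-z = gzip here)
dar -Q -c /tmp/dar_demo/backup -R /tmp/dar_demo/src -z

# Isolate the catalog: a small standalone archive holding only metadata
dar -Q -C /tmp/dar_demo/catalog -A /tmp/dar_demo/backup

# Browse the archive contents without extracting anything
dar -Q -l /tmp/dar_demo/backup

# Extract a single file; only that file's data is decompressed
mkdir -p /tmp/dar_demo/restore
( cd /tmp/dar_demo/restore && dar -Q -x /tmp/dar_demo/backup -g note.txt )
```

The isolated catalog can later be used as the `-A` reference for differential backups, which is the feature the question is pointing at.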
2 Answers
With TAR, the advantage is flexibility: once the archive is created, you can script practically anything around it. And if easy browsing is the priority, the command-line zip tools on Linux are also a solid option, since the zip format compresses each member separately and lets you list or pull out individual files without unpacking the rest.
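A minimal sketch of that TAR workflow, using gzip so it runs anywhere (on GNU tar 1.31+ you can swap `-z` for `--zstd` to get Zstandard, as mentioned in the question; all paths are illustrative):

```shell
# Sketch: create, browse, and partially extract a compressed tar archive.
set -e
mkdir -p /tmp/tar_demo/src/docs
echo "hello" > /tmp/tar_demo/src/docs/note.txt
echo "world" > /tmp/tar_demo/src/readme.md

# Create the archive: one compressed stream over the whole tree
# (use --zstd instead of -z on GNU tar >= 1.31 for zstd compression)
tar -czf /tmp/tar_demo/backup.tar.gz -C /tmp/tar_demo/src .

# Browse the contents without extracting
tar -tzf /tmp/tar_demo/backup.tar.gz

# Extract a single file; note that tar still has to decompress the
# stream up to that entry, unlike DAR's per-file compression
mkdir -p /tmp/tar_demo/restore
tar -xzf /tmp/tar_demo/backup.tar.gz -C /tmp/tar_demo/restore ./docs/note.txt
```

That last comment is the trade-off the question raises: a solid-compressed tar gives better ratios, while per-file compression (DAR, zip) gives cheap random access.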
Honestly, I’ve never seen DAR used in a real production environment. It’s pretty obscure, and most teams just default to TAR with zstd because it’s well-known and reliable. Seems like a safer bet, especially for mission-critical setups.

I get why you'd think that, given DAR's long history and its useful catalog feature. But I'm with you: if the knowledge of how to maintain it rests with just a few people, that concentration of expertise is a real operational risk.