I'm trying to run a cksum on a directory that consists of multiple subdirectories filled with numerous files. I want to compute a single checksum value for everything inside instead of getting individual values for each file. This is important because comparing hundreds of lengthy checksum values and file sizes manually would be very time-consuming. Additionally, I can't risk altering the directory, so using tar or zip is off the table. Is there a way to achieve this?
3 Answers
Have you thought about pairing `find` with `xargs` to generate checksums? List every file, hash each one with a tool like `md5sum` or `shasum`, sort the output so it doesn't depend on `find`'s traversal order, and then compute a single checksum over that combined list. This gives you one value to compare without touching the original directory (see the sketch below).
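A minimal sketch of that idea, assuming GNU coreutils and treating `/path/to/dir` as a placeholder:

```bash
# Hash every regular file, sort by path so the order is deterministic,
# then hash the combined manifest down to a single value.
cd /path/to/dir &&
find . -type f -print0 | sort -z | xargs -0 md5sum | md5sum
```

The `-print0`/`-0` pair keeps filenames with spaces or newlines intact, and piping the manifest straight into a final `md5sum` sidesteps the temporary file entirely. One caveat: the result covers file contents and relative paths, not permissions or timestamps.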
Another option, if you're focused on digital archiving, is the BagIt format. It packages your files together with checksum manifests, which makes later integrity checks straightforward. One caution: typical tooling such as the Library of Congress's bagit-python restructures the directory into a `data/` payload when it creates the bag, so run it on a copy rather than the original.
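For instance, with the bagit-python command-line tool (a sketch, assuming it's installed via pip and that `/path/to/dir` is a placeholder):

```bash
# Work on a copy, since bagging moves files into a data/ subdirectory.
pip install bagit
cp -a /path/to/dir /tmp/dir_bag
bagit.py --md5 /tmp/dir_bag

# Later, confirm the bag's checksums still match its contents.
bagit.py --validate /tmp/dir_bag
```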
It sounds like your ultimate goal is to verify that two directories contain the same data, right? If so, instead of `cksum`, consider `md5sum` or `shasum`. Their cryptographic hashes are far less collision-prone than `cksum`'s CRC, and more importantly both tools can check a saved list of hashes automatically (via the `-c`/`--check` flag), so you never have to compare hundreds of values by eye.
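A sketch of that comparison, assuming GNU coreutils and two placeholder roots `dirA` and `dirB`:

```bash
# Build a manifest of relative-path hashes for each tree, in a stable order.
(cd dirA && find . -type f -print0 | sort -z | xargs -0 md5sum) > /tmp/a.md5
(cd dirB && find . -type f -print0 | sort -z | xargs -0 md5sum) > /tmp/b.md5

# Any changed, missing, or extra file shows up as a diff line.
diff /tmp/a.md5 /tmp/b.md5 && echo "directories match"
```

Alternatively, `(cd dirB && md5sum -c /tmp/a.md5)` re-checks dirB against dirA's manifest and reports each mismatch by filename.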