I've got a huge archive filled with files organized in a nested folder structure, and I need to extract thousands of zip files from it. The archive is pretty massive—over 60GB, mostly made up of tiny files—so I want to do this fairly quickly. I don't care about keeping the original folder structure; I just want all the files extracted. Any suggestions for how to do this efficiently?
3 Answers
If you’re dealing with folders that just contain zip files (not zip files nested inside zip files), the `find` command is your best friend! You can use it like this: `find filesFrom2008 -type f -iname "*.zip" -exec unzip {} \;` — note the escaped `\;`, which is required so the shell passes the terminator through to `find` instead of interpreting it itself. That will locate every zip under the tree and unzip each one into your current directory. Super simple and effective!
For better performance, especially with that many files, you might want to use `xargs` to run multiple unzip processes in parallel. Give this a shot: `find . -type f -iname '*.zip' | sed 's/\.zip$//' | xargs -I {} -P "$(nproc)" unzip -d '{}' '{}.zip'`. The `sed` step strips the `.zip` suffix (note the escaped dot, so it matches a literal `.` only at the end) so each archive gets extracted into a directory named after it, and `-P $(nproc)` runs one job per CPU core. (`-I {}` is the current spelling of the deprecated `-i` option.) This way you can make use of all your cores, which should speed things up a lot!
If your files are really tiny and you expect to deal with a lot of them, using `xargs` with `-P` to set the number of processes can save you a ton of time. One caveat: a plain newline-delimited pipe breaks on filenames containing spaces or newlines, so use `find ... -print0` piped into `xargs -0` to delimit entries with NUL bytes instead. Parallel processing is key here!

Good to know! I was worried about handling nested folders, but if it's just zips in directories, I should be good. Thanks!