What’s the best way to extract multiple zip files from a directory structure?

Asked By CuriousChicken88 On

I've got a huge archive filled with files organized in a nested folder structure, and I need to extract thousands of zip files from it. The archive is pretty massive—over 60GB, mostly made up of tiny files—so I want to do this fairly quickly. I don't care about keeping the original folder structure; I just want all the files extracted. Any suggestions for how to do this efficiently?

3 Answers

Answered By HelpfulHedgehog11 On

If you’re dealing with folders that just contain zip files (not zip files nested inside other zip files), the `find` command is your best friend! You can use it like this: `find filesFrom2008 -type f -iname "*.zip" -exec unzip {} \;` — note the backslash before the semicolon, which `find` requires so the shell doesn't swallow it. Since you don't care about the folder structure, you can also add `-j` to junk the archived paths and `-d` to pick one destination directory: `find filesFrom2008 -type f -iname "*.zip" -exec unzip -j -o {} -d extracted \;`. That's super simple and effective!
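If you'd rather script this than remember `find` syntax, the same walk-and-flatten idea is a few lines of Python's standard library. A minimal sketch (the function name and the `filesFrom2008`-style root argument are illustrative, not from the thread):

```python
import os
import zipfile
from pathlib import Path

def extract_all(root: str, dest: str) -> None:
    """Walk `root`, extract every .zip found, and flatten each
    member to a bare filename in `dest` (like `unzip -j`)."""
    Path(dest).mkdir(parents=True, exist_ok=True)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.lower().endswith(".zip"):
                continue
            with zipfile.ZipFile(os.path.join(dirpath, name)) as zf:
                for member in zf.infolist():
                    if member.is_dir():
                        continue
                    # Keep only the base name, discarding the
                    # archived directory structure.
                    target = Path(dest) / Path(member.filename).name
                    with zf.open(member) as src, open(target, "wb") as out:
                        out.write(src.read())
```

One caveat with flattening (whether via `unzip -j` or this sketch): members from different archives that share a base name will overwrite each other.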

CleverCat22 -

Good to know! I was worried about handling nested folders, but if it's just zips in directories, I should be good. Thanks!

Answered By OptimisticOtter99 On

For better performance, especially with that many files, you might want to use `xargs` to run multiple unzip processes in parallel. Give this a shot: `find . -type f -iname '*.zip' | sed 's/\.zip$//' | xargs -I{} -P "$(nproc)" unzip -d '{}' '{}.zip'`. (The dot in the sed pattern needs escaping, and `-I{}` is the current spelling of the deprecated `-i`.) This extracts each archive into a directory named after it, and `-P $(nproc)` makes use of all your CPU cores, which should speed things up a lot!
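The same parallel approach works in a script, too. Here's a hedged sketch using only Python's standard library (function names and the default worker count are my own choices, not from the thread); it extracts each archive into a directory named after it, mirroring the `unzip -d` behavior above:

```python
import zipfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def extract_one(zip_path: Path, out_root: Path) -> None:
    # One directory per archive, named after the zip file,
    # like `unzip -d name name.zip`.
    dest = out_root / zip_path.stem
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest)

def extract_parallel(root: str, out_root: str, workers: int = 8) -> None:
    out = Path(out_root)
    # Case-insensitive match, roughly like find's -iname '*.zip'.
    zips = [p for p in Path(root).rglob("*") if p.suffix.lower() == ".zip"]
    # zlib releases the GIL during decompression, so threads get
    # real parallelism here without per-process overhead.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(lambda p: extract_one(p, out), zips))
```

With lots of tiny files, the bottleneck is often filesystem metadata operations rather than decompression, so benchmark before assuming more workers always helps.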

Answered By ResourcefulRaccoon77 On

If your files are really tiny and you expect to deal with a lot of them, using `xargs` with `-P` to set the number of processes can save you a ton of time. Just make sure the pipeline handles odd filenames: use `find ... -print0` together with `xargs -0` so paths containing spaces, quotes, or newlines don't get split or mangled. Parallel processing is key here!
