Best Tools for Visualizing Disk Space on Large Arrays?

Asked By CuriousSquirrel92

I'm managing some pretty hefty disk arrays that are over 100 TB, and I need to figure out how to identify which data is taking up the most space so I can inform the users about their old files. I've tried WinDirStat, but it seems to struggle with these large arrays and often takes around 20 minutes to analyze. Is there a faster and more efficient tool out there for this kind of disk space visualization?

4 Answers

Answered By DataGuru88

If you're looking for something more interactive, check out this YouTube tutorial I found; it might help you find better solutions for managing and visualizing disk space on those big arrays.

CuriousSquirrel92 -

I have FSRM configured with soft quotas on the server. It's a tricky situation: management only wants users notified when they breach their limits, but the users often ignore the alerts. I need something else that can track where space is being used and keep users informed on an ongoing basis.

Answered By ByteTraveler14

TreeSize Pro might be a step up for faster processing. Also, consider upgrading your storage system if possible; those new Solidigm 122.88TB NVMe drives could really speed things up!

CuriousSquirrel92 -

I actually looked into a 200 TB hybrid media storage solution, but funding got cut. These NSF projects come with their ups and downs.

Answered By TechieForLife01

TreeSize is a good alternative and tends to be faster than WinDirStat, but keep in mind that it will still take some time to process large arrays. Depending on the size, you might still end up waiting a bit.
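If a GUI scanner keeps choking on the array, a quick scripted sweep can at least rank the top-level directories by size so you know where to point users. Here's a minimal Python sketch of that idea (my own illustration, not something from the thread; function names and the choice to only rank immediate subdirectories are assumptions you'd adapt):

```python
import os

def dir_size(path):
    """Recursively sum file sizes under path, skipping unreadable entries."""
    total = 0
    try:
        with os.scandir(path) as entries:
            for entry in entries:
                try:
                    if entry.is_file(follow_symlinks=False):
                        total += entry.stat(follow_symlinks=False).st_size
                    elif entry.is_dir(follow_symlinks=False):
                        total += dir_size(entry.path)
                except OSError:
                    pass  # permission denied, file vanished mid-scan, etc.
    except OSError:
        pass
    return total

def top_consumers(root, n=10):
    """Return the n largest immediate subdirectories of root, biggest first."""
    sizes = []
    with os.scandir(root) as entries:
        for entry in entries:
            if entry.is_dir(follow_symlinks=False):
                sizes.append((dir_size(entry.path), entry.name))
    return sorted(sizes, reverse=True)[:n]
```

On a 100 TB array with millions of files this will still take a while (it's bounded by metadata I/O, same as WinDirStat), but it runs headless, so you can schedule it overnight and mail the ranking to the offending research unit.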

CuriousSquirrel92 -

ok, thanks!

Answered By StorageWhiz99

It sounds like you need a more granular approach to managing resources. Without strict quotas, some units could take advantage of the system. It's essential to fix that from the ground up rather than just addressing the symptoms later.

CuriousSquirrel92 -

It's not as bad as it seems. This is only one research unit producing a lot of data, and they really need to take responsibility for their storage. My biggest issue is helping them understand how long it takes to move that much data! When I recently copied 60 TB over USB-C, it took weeks because it was spread across 7 million files.
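That "weeks for 60 TB" experience is worth making concrete for users: with millions of small files, fixed per-file cost (metadata lookups, open/close, seeks) can dwarf the raw streaming time. A rough model in Python; the 500 MB/s sustained throughput and 50 ms per-file overhead below are illustrative assumptions, not measurements from this thread:

```python
def transfer_hours(total_bytes, file_count, throughput_bps, per_file_overhead_s):
    """Estimate wall-clock hours: raw streaming time plus a fixed cost per file."""
    streaming = total_bytes / throughput_bps          # seconds to move the bytes
    overhead = file_count * per_file_overhead_s       # seconds lost to per-file cost
    return (streaming + overhead) / 3600

# 60 TB as one giant file vs. 60 TB spread over 7 million files,
# assuming ~500 MB/s sustained and ~50 ms overhead per file:
bytes_only = transfer_hours(60e12, 0, 500e6, 0)       # streaming time alone
with_files = transfer_hours(60e12, 7e6, 500e6, 0.05)  # per-file cost dominates
```

With those assumed numbers the bytes alone take about a day and a half, while the per-file overhead adds roughly four more days; slower real-world overheads push it into weeks, which matches what you saw. It's a handy way to show the research unit why file count matters as much as total size.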
