I'm searching for some good tools to help clean up a massive file share that's over 4TB in size and has been around since the early 2000s, running on Windows Server 2019 Datacenter. Ideally, I need something that can automatically archive files that haven't been modified in the last five years into a new locked-down file share for auditing. Are there any AI tools that could help identify duplicates, or that offer other useful features for tackling this project?
4 Answers
While AI isn't necessary for this task, traditional methods work just fine for finding duplicates and old files. Tools like DFD7 are great at eliminating duplicates, while RED helps clear out empty directories.
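On a Windows share you'd probably script this in PowerShell, but the empty-directory cleanup that RED performs boils down to a bottom-up walk. Here's a rough Python sketch of that idea (the function name `remove_empty_dirs` is mine, not part of any tool):

```python
import os

def remove_empty_dirs(root):
    """Walk the tree deepest-first and delete directories that contain
    nothing. Walking bottom-up means a directory emptied by a deeper
    deletion is caught on the same pass. Returns the removed paths."""
    removed = []
    for dirpath, _dirnames, _filenames in os.walk(root, topdown=False):
        if dirpath == root:
            continue  # never delete the share root itself
        if not os.listdir(dirpath):  # re-check: may have just been emptied
            os.rmdir(dirpath)
            removed.append(dirpath)
    return removed
```

Run it against a test copy first; on a 4TB share you'll want a dry-run mode that only logs what it would delete.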
PowerShell is your best friend for this kind of task. You can script everything to automate the file archiving. For detecting duplicates, I recommend a tool called Fast Duplicate File Finder—it's effective. If you're looking for a free option, check out dupeGuru!
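If you'd rather script the duplicate detection yourself than buy a tool, the standard approach those tools use is to group files by size first (cheap) and only hash the candidates. A minimal Python sketch, assuming the hypothetical helper name `find_duplicates`:

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Return groups of paths with identical content.

    Files are bucketed by size first, so only same-size candidates
    are read and hashed with SHA-256."""
    by_size = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            by_size[os.path.getsize(path)].append(path)

    duplicates = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size can't have a duplicate
        by_hash = defaultdict(list)
        for path in paths:
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)  # hash in 1 MiB chunks
            by_hash[digest.hexdigest()].append(path)
        duplicates.extend(group for group in by_hash.values() if len(group) > 1)
    return duplicates
```

On 4TB the size pre-filter matters a lot: most files never get read at all.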
You might also consider building a list of files filtered by their modification dates and then archiving them all with a single script. It's pretty straightforward, especially if you're familiar with scripting.
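The "filter by modification date, then move" step looks roughly like this in Python (again, a sketch—`archive_stale_files` is an illustrative name, and in production you'd do this in PowerShell with robocopy or `Move-Item` against the UNC paths):

```python
import os
import shutil
import time

def archive_stale_files(src_root, archive_root, max_age_days=5 * 365):
    """Move files whose mtime is older than the cutoff into archive_root,
    preserving the relative directory layout so auditors can trace
    where each file originally lived. Returns the moved relative paths."""
    cutoff = time.time() - max_age_days * 86400
    moved = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.path.getmtime(src) < cutoff:
                rel = os.path.relpath(src, src_root)
                dst = os.path.join(archive_root, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)
                moved.append(rel)
    return moved
```

Note that copying a file resets its mtime on some tools, so decide whether you're filtering on modified or accessed time before the first run, and log every move for the audit trail.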
Since you're on Windows Server, the built-in File Server Resource Manager (FSRM) could be a good option for automating cleanup and archiving tasks. Just make sure you set up its file management jobs correctly to target the right files based on their modification dates.
