I've taken over a shared drive setup where users are assigned a drive letter mapped to a server via a logon script. This structure is based on their membership in over 1,200 Active Directory security groups, giving access to more than 750 parent and subfolders. The drive has grown to over 17TB, and there are no retention policies based on file age or type. Users can request new folders for any purpose, which has led to a mix of essential current business data and large old files, like 300GB of digital copies from a long-ago Christmas party. I need a solution that can automatically archive older data at the file level and leave shortcuts behind, instead of just moving entire folders as seen with FSRM+DFS. Does anyone know of any products or solutions that can handle this?
6 Answers
For your specific needs, check out products like QStar Network Migrator. Solutions of this type stub individual files rather than archiving at the folder level, which seems to fit your requirements well. Vendors in the LTO tape space may offer similar options, so those are worth exploring too.
You might look into tiered storage solutions. Companies like Commvault offer systems that automatically move files to cheaper storage after a set period while replacing them with placeholders. This way, users don’t notice any significant changes in access, even if there’s a slight delay in retrieving older files.
You might want to look into using PowerShell for this. Start by making sure you can restore from backup easily. Then set up a Windows Server (a newer release such as Windows Server 2025 works), create an archive volume formatted with ReFS, or stick with NTFS if you prefer, and schedule a task that moves files that haven't been accessed for a set period, say five years. A script can generate a list of those files, move them to the archive volume, and leave shortcuts behind if you want them. Just keep in mind that shortcuts add management overhead later, for example if the archive path ever changes. Test everything thoroughly before running it against the live share!
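To illustrate the approach described above, here is a minimal sketch in Python (the answer suggests PowerShell, but the logic is the same: find files past an access-age cutoff, move them to an archive tree, and leave a pointer behind). The paths, the five-year cutoff, and the `.archived.txt` pointer convention are all illustrative assumptions, not part of any product. Note also that NTFS last-access timestamps may be disabled on some servers, which would make an atime-based cutoff unreliable.

```python
import shutil
import time
from pathlib import Path

def archive_old_files(source: Path, archive: Path, max_age_days: int = 5 * 365) -> list:
    """Move files not accessed within max_age_days from source to archive,
    preserving relative paths, and leave a small pointer file behind."""
    cutoff = time.time() - max_age_days * 86400
    # Snapshot the candidate list first so the stubs we create during the
    # pass are not themselves examined.
    candidates = [p for p in source.rglob("*") if p.is_file()]
    moved = []
    for path in candidates:
        if path.stat().st_atime >= cutoff:
            continue  # accessed recently; keep in place
        dest = archive / path.relative_to(source)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.move(str(path), str(dest))
        # Leave a plain-text pointer so users can find the archived copy.
        # (A production version might create .lnk shortcuts instead.)
        path.with_name(path.name + ".archived.txt").write_text(f"Archived to: {dest}\n")
        moved.append(dest)
    return moved
```

Running this as a dry run first (log the candidate list instead of moving) is a sensible precaution given the 17TB of data involved.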
It might not even be necessary to automate the process continually. A one-time manual archiving pass could clear a lot of space, and then you could leave it be for quite some time, maybe even a decade.
Komprise is another product you might want to investigate. It could help with your archiving needs effectively!
Nasuni and Commvault are both good options to look into. They provide enterprise-level file archiving and can work with stub files, so users keep accessing what they need without noticing the backend changes.