I'm working on a project to migrate a customer's data from an on-premises file server (accessed over SMB) to Azure Blob Storage. The main motivation is that they're re-developing their application to use Blob Storage, which is more cost-effective for them. The total data size is about 30 TB. I attempted to transfer 2 TB with AzCopy, but it severely impacted the server and only managed to copy about 8% of the data because of the limited internet connection. I'm now considering Azure Data Box Disks for the initial transfer. After that, what would be the best way to keep the data in sync with any changes on the source? Would tools like AzCopy sync or Azure Storage Explorer be effective for this?
1 Answer
You might want to look at the Azure Storage Mover service. It orchestrates AzCopy jobs on appliance VMs running on-premises and lets you control the transfer rate, so your file server and internet connection don't get overwhelmed. It could be a good fit for this migration.
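If you end up driving AzCopy sync yourself for the incremental passes after the Data Box import, you can throttle it so it doesn't repeat the earlier problem. A rough sketch (everything in angle brackets, the mount path, and the rate cap are placeholders you'd adjust):

```
# Incremental sync from the mounted SMB share to the target blob container.
# --cap-mbps limits AzCopy's throughput so the file server and internet
# link aren't saturated; tune the value to what your connection can spare.
azcopy sync "/mnt/fileserver/share" \
  "https://<storage-account>.blob.core.windows.net/<container>?<SAS-token>" \
  --recursive \
  --delete-destination=false \
  --cap-mbps 200
```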
Does this service require the storage account to have a hierarchical namespace enabled? I remember that being a prerequisite before.
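For reference, this is how I've been checking the setting on the target account with the Azure CLI (account and resource group names below are placeholders):

```
# Show whether hierarchical namespace (ADLS Gen2) is enabled on the account.
az storage account show \
  --name <storage-account> \
  --resource-group <resource-group> \
  --query isHnsEnabled \
  --output tsv
```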