I'm currently using the SharePoint Migration Tool (SPMT) to migrate user HomeDrives to OneDrive, especially for users who have over 100,000 files. While attempting to automate the process via PowerShell, I encountered several 403 errors in the logs. To address this, I created a CSV that limits each migration task to folders with fewer than 20,000 files. However, I still hit errors in a few tasks while using the GUI version of the tool. I managed to work around these by restarting the affected tasks, but I'm curious if there are ways to avoid these errors altogether. Here are the error messages I've encountered:

ErrorCode: 0x0201000F – Web Issue when doing SP Query: Unable to connect to the remote server. Only one usage of each socket address (protocol/network address/port) is typically permitted.

ErrorCode: 0x0201000E – Invalid SharePoint on-premise subfolder path: Unable to connect to the remote server.

Any suggestions?
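For context, the batching logic behind my CSV looks roughly like this (a minimal sketch in Python; folder names and file counts are made up, and in practice the counts come from scanning each HomeDrive subfolder — the actual SPMT CSV columns are omitted here):

```python
# Sketch: greedily pack folders into batches so each migration task
# stays under a per-task file-count cap.

FILE_CAP = 20_000  # ceiling per SPMT task, to avoid oversized jobs

def batch_folders(folder_counts, cap=FILE_CAP):
    """Pack {folder: file_count} entries into batches under `cap` files.

    A folder that individually exceeds the cap gets its own batch and
    should be split further before migrating.
    """
    batches, current, current_total = [], [], 0
    # Largest folders first, so small folders fill the remaining space.
    for folder, count in sorted(folder_counts.items(), key=lambda kv: -kv[1]):
        if count >= cap:
            batches.append([folder])  # needs manual splitting
            continue
        if current_total + count > cap:
            batches.append(current)
            current, current_total = [], 0
        current.append(folder)
        current_total += count
    if current:
        batches.append(current)
    return batches

# Illustrative HomeDrive subfolders and their file counts.
counts = {"user1/docs": 12_000, "user1/photos": 9_500,
          "user1/archive": 25_000, "user1/misc": 4_000}
for i, batch in enumerate(batch_folders(counts), 1):
    print(f"batch {i}: {batch}")
```

Each resulting batch then becomes one row (or one set of rows) in the SPMT bulk-migration CSV.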
2 Answers
That error often pops up when the system running the migration tool starts running out of available ephemeral ports. When you're migrating large batches or using high concurrency, the SPMT can establish many outbound connections, leading to this limit being hit. Reducing the SPMT concurrency is a great first step, and the registry tweaks mentioned are common fixes in these situations. I've also found success by splitting migration jobs into smaller batches instead of pushing large sets at once. Even though the tool supports hefty runs, breaking them down often provides more stability and minimizes hitting connection limits and throttling issues. Lastly, ensuring there's enough time between jobs for connections to clear can help a lot; running everything back-to-back quickly can lead to a pile-up of TIME_WAIT sockets. We had much better results once we reduced concurrency and staged migrations in smaller chunks.
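The "time between jobs" part can be automated. A minimal sketch (in Python; `run_job` is a placeholder for however you actually launch an SPMT task, and the cooldown value is illustrative) of running jobs sequentially with a drain pause between them:

```python
import time

COOLDOWN_SECONDS = 60  # illustrative pause; tune to how fast your
                       # TIME_WAIT sockets actually clear

def run_staged(jobs, run_job, cooldown=COOLDOWN_SECONDS, sleep=time.sleep):
    """Run migration jobs one at a time, pausing between jobs so
    TIME_WAIT sockets from the previous job can drain before the
    next one opens its own connections."""
    results = []
    for i, job in enumerate(jobs):
        results.append(run_job(job))
        if i < len(jobs) - 1:       # no need to wait after the last job
            sleep(cooldown)
    return results

# Usage with a stub in place of a real SPMT launcher:
print(run_staged(["batch1", "batch2"], lambda job: f"{job} done", cooldown=0))
```

Injecting `sleep` as a parameter also makes the pacing logic easy to test without real waits.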
You might want to tweak the MaxUserPort and TcpTimedWaitDelay registry settings and also lower the concurrency settings in the SPMT. This could help mitigate those connection errors you're seeing.
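Both values live under `HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters`. A `.reg` fragment with commonly used settings (65534 is the documented maximum for `MaxUserPort`, and 30 seconds is the documented minimum for `TcpTimedWaitDelay`; a reboot is required for the changes to take effect, and you should adjust the values to your environment rather than take these as prescriptive):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
; Raise the ephemeral port ceiling (decimal 65534)
"MaxUserPort"=dword:0000fffe
; Release TIME_WAIT sockets after 30 seconds (decimal 30)
"TcpTimedWaitDelay"=dword:0000001e
```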

Thanks for the info! Just curious, what do you usually consider a "large" folder set? I've got a few users with around 500 tasks in a batch that might trigger these issues.