I'm having trouble with a script that exports a Hyper-V hosted VM and then copies the export to a NAS. The NAS path I'm using is `\\192.168.10.10\Backups\Exports`. The export works fine, but the copy fails when the script runs under Windows Task Scheduler, even though it works perfectly when I run it interactively in PowerShell. The task runs under the SYSTEM account with highest privileges, and I've confirmed there's enough disk space. I've concluded that the scheduled task simply doesn't have access to the NAS. What I don't know is how to supply credentials from within my script so it can gain that access. Any suggestions?
4 Answers
Consider using [New-PSDrive](https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/new-psdrive) to map the NAS with valid credentials. Just remember, network drives are specific to user contexts.
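A minimal sketch of that approach, to run inside the scheduled script itself so the mapping exists in the task's own session. The share path comes from the question; the drive letter, account name, and inline password are placeholders — in practice, read the secret from an encrypted file or a credential store rather than hard-coding it:

```powershell
# Placeholder credentials -- replace with your NAS account, and avoid
# hard-coding the password in a real script.
$password = ConvertTo-SecureString 'NasPassword' -AsPlainText -Force
$cred     = New-Object System.Management.Automation.PSCredential ('NASDOMAIN\backupuser', $password)

# Map the share with explicit credentials for this session only.
New-PSDrive -Name 'Z' -PSProvider FileSystem -Root '\\192.168.10.10\Backups\Exports' -Credential $cred | Out-Null

# Copy the export (source path is a placeholder).
Copy-Item -Path 'D:\Exports\MyVM' -Destination 'Z:\' -Recurse

# Clean up the mapping when done.
Remove-PSDrive -Name 'Z'
```

Because the mapping is created and removed within the script, it doesn't matter that network drives don't carry over between user contexts.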
If both the source machine and the NAS are domain joined, use the source machine's AD account to access the NAS. You’ll need to ensure that permissions are set for the AD machine object on the NAS so that it can connect successfully.
How are you authenticating for the copy? If your server is part of an Active Directory domain, use the fully qualified domain name to leverage Kerberos for authentication. If that’s not the case, consider using the New-SmbMapping cmdlet in your script to supply a credential for the connection.
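A hedged sketch of the `New-SmbMapping` route, again assuming the share path from the question; the user name and password are placeholders and should come from a secure store in a real script:

```powershell
# Create an SMB connection with explicit credentials (placeholder account).
New-SmbMapping -RemotePath '\\192.168.10.10\Backups' -UserName 'NASDOMAIN\backupuser' -Password 'NasPassword'

# The UNC path is now accessible under those credentials.
Copy-Item -Path 'D:\Exports\MyVM' -Destination '\\192.168.10.10\Backups\Exports' -Recurse

# Tear the connection down afterwards.
Remove-SmbMapping -RemotePath '\\192.168.10.10\Backups' -Force
```

Unlike `New-PSDrive`, this doesn't consume a drive letter, which tends to be simpler in unattended tasks.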
The SYSTEM account is local to the server running the script, so it has no identity the NAS recognizes and cannot authenticate to the share on its own.

Unfortunately, that fails with the "Multiple connections to a server or shared resource by the same user, using more than one user name, are not allowed" error, and I can't simply disconnect the existing connection either.
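One possible workaround, assuming the existing connection was made using the hostname (or vice versa): Windows tracks SMB sessions per server *name*, so connecting by IP address, FQDN, or a DNS alias is treated as a different server and can carry its own credentials without triggering the conflict. A sketch, with placeholder credentials:

```powershell
# If the conflicting connection uses \\nas-hostname\..., connecting by IP
# creates a separate SMB session keyed to a different "server", avoiding the
# "multiple connections" (error 1219) conflict. Credentials are placeholders.
New-SmbMapping -RemotePath '\\192.168.10.10\Backups' -UserName 'NASDOMAIN\backupuser' -Password 'NasPassword'
```

This is a well-known trick, but it depends on which name the existing session used, so check `Get-SmbConnection` first.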