I've previously worked with Azure Automation Accounts and runbooks on a hybrid worker setup, where it was simple to drop my module folder and .psm1 file into the PowerShell module directory. Now I'm looking for the proper way to use my custom modules in a cloud-only Automation account. For instance, I have a Graph module that I reference in several scripts with `Import-Module "GraphModule.psm1" -Force`. I want this module to be accessible to my runbooks, and I also want it source controlled in our Git repository. I could store the .psm1 in the repo, but I'm uncertain how to get the runbook to find and import the module correctly. Is this feasible?
1 Answer
To use a custom module in a cloud-only Automation account, you need to import it into the account itself: Azure Automation only loads modules that have been uploaded to the account (packaged as a .zip of the module folder) or pulled directly from the PowerShell Gallery. Note that the built-in source control integration only syncs runbooks (.ps1 files), not modules, so to keep the module source controlled you'll need a small delivery step of your own: whenever you commit to the main branch, have a pipeline (or a script triggered by the commit) zip the module from your Git repo and import it into the Automation account. That way the copy in the account always matches the repo, and your runbooks can simply call `Import-Module GraphModule` (no file path needed) once it's imported.
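A minimal sketch of that delivery step, using the Az PowerShell cmdlets. The resource names, storage account, and paths below are placeholders for illustration; it assumes an authenticated session (`Connect-AzAccount`) and the Az.Storage and Az.Automation modules. Azure Automation downloads the module zip from a URL, so the sketch stages it in blob storage with a short-lived SAS token:

```powershell
# Placeholder names - substitute your own resources.
$resourceGroup  = "rg-automation"
$automationAcct = "aa-prod"
$moduleName     = "GraphModule"
$modulePath     = ".\GraphModule"   # folder containing GraphModule.psm1 (and ideally a .psd1)

# 1. Zip the module. The zip must contain a folder named after the module.
$zipPath = Join-Path $env:TEMP "$moduleName.zip"
Compress-Archive -Path $modulePath -DestinationPath $zipPath -Force

# 2. Stage the zip where Azure Automation can fetch it (blob storage + read-only SAS).
$ctx = (Get-AzStorageAccount -ResourceGroupName $resourceGroup -Name "stmodules").Context
Set-AzStorageBlobContent -File $zipPath -Container "modules" -Blob "$moduleName.zip" `
    -Context $ctx -Force
$sas = New-AzStorageBlobSASToken -Container "modules" -Blob "$moduleName.zip" `
    -Permission r -ExpiryTime (Get-Date).AddHours(1) -Context $ctx
$uri = "https://stmodules.blob.core.windows.net/modules/$moduleName.zip$sas"

# 3. Import (or update) the module in the Automation account.
New-AzAutomationModule -ResourceGroupName $resourceGroup `
    -AutomationAccountName $automationAcct -Name $moduleName -ContentLinkUri $uri
```

Run from a pipeline triggered on commits to main (Azure DevOps, GitHub Actions, etc.), this keeps the Automation account's copy in lockstep with the repo. Including a .psd1 manifest with a version number also lets you see at a glance which revision is deployed.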

So, if I understand correctly, you're suggesting I keep my custom module in the Git repo and use a separate process to upload it to the Automation account, rather than relying on the built-in source control sync?