I'm working with a lengthy PowerShell script that navigates through multiple forests and domains to collect user and group information, exporting everything to organized CSV files. However, with over 100 forests and sizes reaching around 3,500 groups for the first domain alone, I really need to process the domains in parallel to finish these reports daily.
I'm encountering some issues within the Start-DomainJobs function that I could use help with.
1. When I try to call the Log-Activity function in the group membership section, I get an error saying "Log-Activity isn't a valid cmdlet". I think it's not being passed through correctly, but it's included in the scriptblock.
2. When the enableAllGroups option is off and I pull data from the CSVs instead (which works without a hitch), I get an error stating "The term 'Import-Module' is not a valid cmdlet." This confuses me, since the user export works fine, which suggests the module loads properly. How can Import-Module itself fail in this scenario?
3. The major issue: I'm getting an error when trying to use Remove-Job, indicating that the job with ID 1 can't be removed because it hasn't finished. I believe my throttling should wait once the number of running jobs reaches the $throttlelimit of 30 before adding new ones, so I'm puzzled about where I went wrong.
4. Finally, I'm also encountering an error saying "Method invocation failed because ThreadJob does not contain a method named 'op_Addition'." I suspect this is tied to the previous issue of not being able to remove the running job, resulting from a flaw in my throttle logic.
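For what it's worth, issues 1, 3, and 4 often come from the same few patterns: job scriptblocks run in a separate scope and can't see functions defined in the parent session; Remove-Job refuses to remove a still-running job unless you wait for it (or pass -Force); and "op_Addition" usually means `+=` was applied to a single job object instead of an array. Below is a minimal sketch of a throttled loop that avoids all three. The names Log-Activity, $domains, and the per-domain work are assumptions standing in for your actual script:

```powershell
# Minimal throttling sketch. Log-Activity and $domains are placeholders
# for the poster's actual function and domain list. Requires the
# ThreadJob module (Start-ThreadJob).

# Issue 1: a function defined in the parent scope is not visible inside
# a job scriptblock. Capture its definition as text so it can be
# re-created inside each job.
$logActivityDef = ${function:Log-Activity}.ToString()

# Issue 4: initialize the collection as an array; otherwise the first
# "$jobs += ..." attempts op_Addition on a single ThreadJob object.
$jobs = @()

$throttleLimit = 30
foreach ($domain in $domains) {
    # Issue 3: before starting another job, block until the running
    # count drops below the limit. Remove-Job on a running job fails
    # unless you Wait-Job first (or use -Force, which discards output).
    while (@(Get-Job -State Running).Count -ge $throttleLimit) {
        # Wait for any one job to finish instead of spinning.
        Get-Job -State Running | Wait-Job -Any -Timeout 5 | Out-Null
    }

    $jobs += Start-ThreadJob -ScriptBlock {
        param($domainName, $logDef)
        # Re-create the parent's function inside this runspace.
        ${function:Log-Activity} = [scriptblock]::Create($logDef)
        Log-Activity "Processing $domainName"
        # ... per-domain user/group collection and CSV export here ...
    } -ArgumentList $domain, $logActivityDef
}

# Collect results, then remove jobs only after they have completed.
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job
```

With thread jobs you can also reference parent variables directly via `$using:`, but function definitions still need to be passed in (as above) or defined inside the scriptblock itself.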
If anyone has insights or suggestions on how to resolve these issues, I'd appreciate it!
1 Answer
It might be beneficial to separate the job management and your main task. Instead of handling multiple domains in one script or job, try creating a script that processes one domain at a time and launches each script in a new PowerShell process. This way, jobs won't share state information, and you can avoid tangled dependencies.
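To make the suggestion concrete, here is a rough sketch of a master loop that launches one external process per domain with its own throttle. The script name Get-DomainReport.ps1 and its parameters are assumptions, not something from your script:

```powershell
# Per-process approach sketch. Get-DomainReport.ps1, -Domain, and
# -OutputPath are hypothetical names; substitute your own.
$maxProcesses = 30
$procs = @()

foreach ($domain in $domains) {
    # Throttle on live child processes rather than PowerShell jobs.
    while (@($procs | Where-Object { -not $_.HasExited }).Count -ge $maxProcesses) {
        Start-Sleep -Seconds 2
    }

    # Each child is a fresh pwsh process: no shared runspace state,
    # and it runs under the same account as the parent process.
    $procs += Start-Process -FilePath 'pwsh' -PassThru -NoNewWindow -ArgumentList @(
        '-NoProfile', '-File', '.\Get-DomainReport.ps1',
        '-Domain', $domain, '-OutputPath', ".\reports\$domain.csv"
    )
}

# Wait for all children to finish before declaring the run complete.
$procs | ForEach-Object { $_.WaitForExit() }
```

A child process started this way inherits the parent's user context, so a script running under a gMSA will spawn children under that same account.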
That's an interesting idea! I could make each report a separate script accepting command-line parameters and have a master script loop through them, passing the arguments. But I'm a bit worried about centralized logging and the user context: does each child process inherit the same user context (the gMSA) as the parent?