I'm currently using a 'write-log' command to create local log files on my client machines, but I'm looking for a better way to send these logs to a central server. The main goal is to consolidate all client logs so I can view them in a web interface. Has anyone set up a system like this before, or do you have any tips or suggestions?
5 Answers
If you’re running a SIEM, logging to the Windows event log and letting the SIEM collect it from there can be effective. Otherwise, you could stand up a central server that receives logs over syslog or another event-management protocol. Alternatively, a database fronted by a simple REST interface might suit your needs too.
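The REST idea is straightforward to sketch. This is a minimal illustration, not any particular product's API: the endpoint URL, field names, and the Write-CentralLog function name are all placeholders you'd adapt to your own server.

```powershell
# Hypothetical helper: POST a JSON log record to a central REST endpoint.
# The URI and record fields are illustrative assumptions, not a real API.
function Write-CentralLog {
    param(
        [Parameter(Mandatory)][string]$Message,
        [ValidateSet('Info', 'Warning', 'Error')][string]$Level = 'Info',
        [string]$Uri = 'http://logserver.example.com/api/logs'   # placeholder endpoint
    )
    $record = @{
        timestamp = (Get-Date).ToUniversalTime().ToString('o')  # ISO 8601, UTC
        computer  = $env:COMPUTERNAME
        level     = $Level
        message   = $Message
    }
    Invoke-RestMethod -Uri $Uri -Method Post -ContentType 'application/json' `
        -Body ($record | ConvertTo-Json)
}

# Usage: Write-CentralLog -Message 'Backup completed' -Level Info
```

A thin wrapper like this also makes it easy to swap the transport later (syslog, SQL, a file share) without touching the calling scripts.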
You might want to consider logging to the Windows Event Logs through Group Policy or directly in your scripts. We have a setup where our log server uses built-in tools to generate reports for PowerShell activities. It's super efficient!
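For the "directly in your scripts" part, a sketch of writing to a custom event source looks like this. Note that New-EventLog and Write-EventLog ship with Windows PowerShell 5.1 but were removed in PowerShell 7+, and registering a new source requires admin rights once per machine; the source name below is just an example.

```powershell
# Sketch: log script activity to the Application log under a custom source.
# Requires Windows PowerShell 5.1; run elevated the first time to register the source.
$source = 'MyDeploymentScripts'   # illustrative source name

if (-not [System.Diagnostics.EventLog]::SourceExists($source)) {
    New-EventLog -LogName Application -Source $source
}

Write-EventLog -LogName Application -Source $source -EventId 1000 `
    -EntryType Information -Message 'Nightly cleanup finished successfully.'
```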
Depending on your needs, I’ve logged to a SQL table, written to a shared folder, and even dropped logs into a folder indexed by Splunk. Each has its pros depending on the kind of data you’re dealing with.
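The shared-folder approach is the simplest of the three. A rough sketch, assuming a per-machine CSV file on a share (the \\fileserver\logs path is a placeholder) that a SIEM or reporting script can pick up later:

```powershell
# Sketch: append structured entries to a per-machine CSV on a network share.
# \\fileserver\logs is a placeholder UNC path; adjust to your environment.
$logFile = "\\fileserver\logs\$env:COMPUTERNAME.csv"

$entry = [pscustomobject]@{
    Timestamp = (Get-Date).ToString('s')          # sortable local timestamp
    Script    = $MyInvocation.MyCommand.Name
    Level     = 'Info'
    Message   = 'Job started'
}

# -Append (PowerShell 3+) adds rows without rewriting the header each time.
$entry | Export-Csv -Path $logFile -Append -NoTypeInformation
```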
For more advanced setups, I recommend running a syslog server in a Docker container and pointing your logs at it with something like Posh-SYSLOG. I’ve also had success with Azure Log Analytics, particularly on Azure Arc-enabled servers, and it handles custom log ingestion well too.
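Sending to the container from a client is a one-liner once the module is installed. The hostname below is a placeholder, and the parameter names follow the Posh-SYSLOG module's Send-SyslogMessage cmdlet as documented; verify them against the version you install.

```powershell
# One-time: install the community Posh-SYSLOG module.
Install-Module -Name Posh-SYSLOG -Scope CurrentUser

# Sketch: ship a message to the syslog container (placeholder hostname).
Send-SyslogMessage -Server 'syslog.example.com' `
                   -Message "Backup job finished on $env:COMPUTERNAME" `
                   -Severity Informational `
                   -Facility local7
```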
Another good option is to write directly to the event log and use Windows Event Forwarding (WEF) to aggregate everything on a collector server. You can also look at a service like Splunk, whose universal forwarder agent can ship the logs for you.
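The WEF plumbing is mostly built in. A rough outline, run elevated, with the collector hostname as a placeholder; the Group Policy path is the standard one for source-initiated subscriptions:

```powershell
# On the collector server: enable and configure the Windows Event Collector service.
wecutil qc

# On each source machine: make sure WinRM is listening.
winrm quickconfig

# Then point sources at the collector via Group Policy:
#   Computer Configuration > Administrative Templates > Windows Components >
#   Event Forwarding > Configure target Subscription Manager, e.g.
#   Server=http://collector.example.com:5985/wsman/SubscriptionManager/WEC
```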

That sounds great! We also use PowerShell logging and transcription, enabled via Group Policy and directed to a network share, so our SIEM can ingest the logs easily. For script runs we use a job scheduler, which simplifies tracking since it captures everything in real time.
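If you want the same effect per script rather than fleet-wide via the GPO transcription setting, Start-Transcript gives you a sketch like this (the share path is a placeholder):

```powershell
# Sketch: one transcript file per run, written to a per-machine folder on a share.
# \\fileserver\transcripts is a placeholder UNC path.
$dir = "\\fileserver\transcripts\$env:COMPUTERNAME"
New-Item -ItemType Directory -Path $dir -Force | Out-Null

Start-Transcript -Path (Join-Path $dir ("run-{0:yyyyMMdd-HHmmss}.txt" -f (Get-Date)))

# ... script body: all console input and output is captured ...

Stop-Transcript
```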