How to Send EKS Audit Logs to an S3 Bucket in 2025?

Asked By TechieD00d93

I've looked into various methods for sending EKS audit logs to an S3 bucket, but most of the resources I found are outdated. Can anyone share the best practices or updated methods for doing this in 2025?

4 Answers

Answered By DataNinja42

From what I've researched, EKS control-plane audit logs are delivered only to CloudWatch Logs; there's no native S3 destination. To get them into S3, you'll typically need to pull the logs out of CloudWatch. I prefer using Kinesis Firehose for this, especially if you need to reformat the logs for tools like Splunk or Azure Sentinel via a Lambda transformation.
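A minimal sketch of what that Firehose transformation Lambda might look like. The event/response shape is Firehose's standard record-transformation contract (base64 `data`, `recordId`, `result`); everything else here is illustrative, and real audit payloads arrive as gzipped CloudWatch Logs batches:

```python
import base64
import gzip
import json


def transform_record(data_b64):
    """Decompress one Firehose record containing a gzipped CloudWatch Logs
    payload and return its log messages as newline-delimited text."""
    payload = json.loads(gzip.decompress(base64.b64decode(data_b64)))
    if payload.get("messageType") != "DATA_MESSAGE":
        return None  # control messages carry no log events
    lines = [e["message"] for e in payload.get("logEvents", [])]
    return "\n".join(lines) + "\n"


def handler(event, context):
    """Firehose transformation Lambda entry point."""
    out = []
    for rec in event["records"]:
        text = transform_record(rec["data"])
        if text is None:
            out.append({"recordId": rec["recordId"], "result": "Dropped",
                        "data": rec["data"]})
        else:
            out.append({"recordId": rec["recordId"], "result": "Ok",
                        "data": base64.b64encode(text.encode()).decode()})
    return {"records": out}
```

From there, Firehose delivers the transformed records to the S3 destination you configured on the stream; a further reshaping step for Splunk/Sentinel would go inside `transform_record`.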

Answered By KubeMaster88

Have you considered using the Kubernetes logging operator with Fluentd and Fluent Bit on your worker nodes? You can ship logs directly to S3 from there, though note this only captures logs your nodes can see (container and node logs); control-plane audit logs are produced by the managed API server and still land in CloudWatch. Check out this example for more details: https://kube-logging.dev/docs/examples/
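Roughly what the operator-side piece looks like: an `Output` resource using the logging operator's S3 plugin. The bucket, region, path, and secret names below are placeholders you'd swap for your own:

```yaml
apiVersion: logging.banzaicloud.io/v1beta1
kind: Output
metadata:
  name: s3-output
spec:
  s3:
    aws_key_id:
      valueFrom:
        secretKeyRef:
          name: s3-credentials      # placeholder secret holding your keys
          key: awsAccessKeyId
    aws_sec_key:
      valueFrom:
        secretKeyRef:
          name: s3-credentials
          key: awsSecretAccessKey
    s3_bucket: my-audit-bucket      # placeholder bucket name
    s3_region: us-east-1
    path: logs/${tag}/%Y/%m/%d/
    buffer:
      timekey: 10m
      timekey_wait: 1m
```

You then reference this output from a `Flow`/`ClusterFlow` that selects the workloads whose logs you want shipped.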

Answered By LogGuru101

I delved into this topic too. EKS audit logs land in CloudWatch, and to store them in S3 you might have to build a custom solution, such as a Lambda function subscribed to the log group. However, if the high cost of CloudWatch log ingestion is your concern, going CW -> Lambda -> S3 won't solve that, since the logs are still ingested into CloudWatch first. You might want to check out this GitHub issue for more insights: https://github.com/aws/containers-roadmap/issues/1141
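For the CW -> Lambda -> S3 path, the core work is decoding the gzipped, base64-encoded payload that a CloudWatch Logs subscription delivers to Lambda. A sketch under those assumptions; the bucket name and key scheme are placeholders, and `boto3` is imported lazily so the decoder stays testable without AWS credentials:

```python
import base64
import gzip
import json


def decode_cloudwatch_event(event):
    """Decode the gzipped, base64-encoded payload that a CloudWatch Logs
    subscription delivers under event["awslogs"]["data"]."""
    return json.loads(gzip.decompress(base64.b64decode(event["awslogs"]["data"])))


def handler(event, context):
    payload = decode_cloudwatch_event(event)
    # Placeholder key scheme: one object per invocation, grouped by stream.
    key = f"eks-audit/{payload['logGroup'].strip('/')}/{payload['logStream']}.json"
    body = "\n".join(e["message"] for e in payload["logEvents"])
    import boto3  # available in the Lambda runtime
    boto3.client("s3").put_object(
        Bucket="my-audit-bucket",  # placeholder bucket
        Key=key,
        Body=body.encode(),
    )
```

As noted above, this doesn't avoid CloudWatch ingestion charges; the logs pass through CloudWatch either way.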

S3Savvy23 -

Actually, if you're not looking for too much customization, you can do it without a Lambda function! Just configure the setup from CloudWatch Logs to Firehose and then to S3. Here’s some documentation to get you started: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CreateDestination.html
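The no-Lambda path in CLI form, roughly: create a Firehose delivery stream targeting the bucket, then subscribe the EKS log group to it. Every name and ARN below is a placeholder, and both roles need the appropriate trust and write permissions:

```shell
# 1. Firehose delivery stream that writes straight to the bucket.
aws firehose create-delivery-stream \
  --delivery-stream-name eks-audit-to-s3 \
  --extended-s3-destination-configuration \
    'BucketARN=arn:aws:s3:::my-audit-bucket,RoleARN=arn:aws:iam::123456789012:role/firehose-s3-role'

# 2. Subscription filter from the EKS log group to the stream.
#    An empty filter pattern forwards everything in the group, so
#    non-audit control-plane streams come along too.
aws logs put-subscription-filter \
  --log-group-name /aws/eks/my-cluster/cluster \
  --filter-name audit-to-firehose \
  --filter-pattern "" \
  --destination-arn arn:aws:firehose:us-east-1:123456789012:deliverystream/eks-audit-to-s3 \
  --role-arn arn:aws:iam::123456789012:role/cwlogs-to-firehose-role
```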

Answered By CloudWizard99

You can transfer EKS audit logs to an S3 bucket the same way you would any other CloudWatch log group: a subscription filter to Firehose or Lambda for continuous delivery, or a periodic CreateExportTask export if near-real-time delivery isn't required.
