Hey folks! I'm currently tackling the challenge of centralizing logging for our infrastructure on Google Cloud Platform (GCP). We've got Cloud Logging set up, but our storage costs are racking up at about $0.50 per GB. I've been brainstorming a more budget-friendly approach, and here's what I came up with (rough sketch of each step below the list):
- Set up a sink to export logs to Google Cloud Storage (GCS).
- Enable Autoclass on the bucket to help optimize storage fees over time.
- Regularly import logs into BigQuery so we can analyze and visualize them using Grafana.
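
Here's a rough sketch of what I had in mind for those three steps, using the Python client libraries (google-cloud-logging, google-cloud-storage, google-cloud-bigquery). The project, bucket, table, filter, and sink names are just placeholders I made up, and the Autoclass property needs a reasonably recent google-cloud-storage release, so treat this as a starting point rather than a finished script:

```python
# Sketch: route Cloud Logging to GCS, enable Autoclass, then load into BigQuery.
# All resource names below are placeholders, not real resources.
from google.cloud import logging as gcp_logging
from google.cloud import storage, bigquery

PROJECT = "my-project"                            # placeholder project ID
BUCKET = "my-log-archive-bucket"                  # placeholder GCS bucket
DEST_TABLE = "my-project.logs.archived_logs"      # placeholder BigQuery table

# 1) Create a sink that routes matching log entries to the GCS bucket.
log_client = gcp_logging.Client(project=PROJECT)
sink = log_client.sink(
    "gcs-archive-sink",
    filter_="severity>=INFO",                     # export only what you need
    destination=f"storage.googleapis.com/{BUCKET}",
)
if not sink.exists():
    sink.create()
# Afterwards, grant the sink's writer identity objectCreator on the bucket.

# 2) Enable Autoclass so objects shift to colder storage classes automatically.
#    (Needs a recent google-cloud-storage; Autoclass can also be set at creation.)
gcs_client = storage.Client(project=PROJECT)
bucket = gcs_client.get_bucket(BUCKET)
bucket.autoclass_enabled = True
bucket.patch()

# 3) Periodically load the exported JSON log files into BigQuery for Grafana.
#    Exported files land under dated prefixes; the wildcard covers them.
bq_client = bigquery.Client(project=PROJECT)
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,                              # infer schema from the logs
    write_disposition="WRITE_APPEND",
)
load_job = bq_client.load_table_from_uri(
    f"gs://{BUCKET}/*.json", DEST_TABLE, job_config=job_config
)
load_job.result()                                 # wait for the load to finish
```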
As someone who's still pretty new to this, I'm eager to find a solution that maintains both functionality and long-term cost efficiency. Does anyone think this is a solid plan? Or do you have any better ideas or best practices?
1 Answer
If you're using Grafana, you might want to consider Loki for your logs. Loki can use Google Cloud Storage directly as its object store, which could streamline your setup. Check out Grafana's Loki documentation for the supported storage backends.
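
For reference, pointing Loki at GCS is roughly a config section like the one below. The bucket name is a placeholder, and the Loki storage docs have the full schema details, so this is only a sketch of the relevant keys:

```yaml
# Minimal sketch of a Loki config backed by GCS (bucket name is a placeholder).
schema_config:
  configs:
    - from: 2024-01-01
      store: tsdb
      object_store: gcs
      schema: v13
      index:
        prefix: index_
        period: 24h

storage_config:
  gcs:
    bucket_name: my-loki-chunks-bucket
```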
So just to clarify, are you suggesting moving logs from Cloud Logging to GCS and then using Loki to visualize them in Grafana? I thought Loki was for aggregating logs from multiple sources; do I really need it when GCP is already managing my logs?
That's interesting! I've seen an alternative flow where you send logs from Cloud Logging to Pub/Sub, then use Fluent Bit to ship them into Loki with GCS as the backend. It could streamline your logging even more!
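
The Cloud Logging side of that flow is just a sink whose destination is a Pub/Sub topic; a rough Python sketch (topic, sink, and project names are placeholders) might look like the following. The Fluent Bit → Loki leg is all Fluent Bit configuration, which I'd take from the Fluent Bit docs rather than sketch from memory:

```python
# Sketch: route Cloud Logging entries to a Pub/Sub topic (names are placeholders).
from google.cloud import logging as gcp_logging
from google.cloud import pubsub_v1

PROJECT = "my-project"
TOPIC = "log-export-topic"

# Create the Pub/Sub topic that will receive exported log entries.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
publisher.create_topic(name=topic_path)

# Create a sink whose destination is the topic instead of a GCS bucket.
log_client = gcp_logging.Client(project=PROJECT)
sink = log_client.sink(
    "pubsub-export-sink",
    filter_="severity>=INFO",
    destination=f"pubsub.googleapis.com/projects/{PROJECT}/topics/{TOPIC}",
)
if not sink.exists():
    sink.create()
# Remember to grant the sink's writer identity roles/pubsub.publisher on the topic.
```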