I'm a Software Engineer, not a DevOps specialist, and I've run into a wall while trying to set up log tracking for a Java-based internal application that's critical for banking communications. I was tasked with capturing and displaying logs in a searchable format by user, and I initially thought about using APIs, but my dev lead advised against that. Instead, I found Filebeat and the ELK stack (Elasticsearch, Logstash, Kibana) could fulfill the requirements. My app handles communications with nine different banks, generating logs in various formats, including XML and JSON, so I've worked on tagging messages for easier correlation.
After a rough start trying to set up a Dockerized ELK stack—which took me over a month and ended in persistent failures—I switched to setting up everything locally on Windows. Filebeat is now working, but I'm stuck on Logstash filters that aren't parsing correctly, which blocks data from reaching Elasticsearch. My dev lead has been disengaged throughout this process, and my manager labeled my outputs as poor despite the challenges. I'm committed to getting this done right but could really use some advice, especially on fixing those Logstash filters and ensuring data flows to Elasticsearch smoothly. I'm feeling burned out and undervalued, so any help would be greatly appreciated.
5 Answers
Have you considered VictoriaLogs (from the VictoriaMetrics project) instead of ELK? Note that VictoriaMetrics itself is a metrics database; VictoriaLogs is its log counterpart. It's simpler to set up and still gives you solid search capabilities, which may help with your tight timeline.
Just keep in mind that jumping to new software may have its own set of challenges, especially if you're dealing with strict security requirements in banking.
I'd suggest hitting Elastic's website for their free tutorials, since they have a ton of resources that could guide you. Also, consider stepping away from Windows and back to a Linux/Docker setup—it really helps separate the software dependencies from the OS. The whole ELK setup is quite an involved project; it typically takes a team effort and some time to get right.
Honestly, I've got a tight deadline this Friday. I want to show something substantial to my manager, but I'm worried I won't complete the Docker setup in time.
Just make sure that you're testing everything in a dev environment first. It’ll save you a lot of headache later!
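For the Logstash side specifically, you can test in two cheap ways before touching your real pipeline: validate the config file for syntax errors, and run a throwaway stdin/stdout pipeline to see exactly how your filter parses a sample log line. A sketch (paths are placeholders for your install):

```
# Check the pipeline config for syntax errors without actually starting Logstash
bin/logstash -f /path/to/pipeline.conf --config.test_and_exit

# Debug filters interactively: paste a sample log line on stdin and see the
# parsed event printed to stdout instead of being sent to Elasticsearch
bin/logstash -e 'input { stdin {} } filter { json { source => "message" } } output { stdout { codec => rubydebug } }'
```

The `rubydebug` output shows every field on the event, including failure tags like `_jsonparsefailure`, which makes it much easier to see why a filter isn't matching.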
If your organization uses AWS, check out Amazon OpenSearch Service. It can help streamline things and takes care of much of the complexity of running your own cluster. Just be wary of the budget and obtain any necessary approvals from your security team.
Check out this article on deploying an ELK stack with Docker Compose. It’s a good starting point for what you're trying to achieve, especially for a small deployment. You'll want to pay attention to the docs for setting up Logstash pipelines and managing the input, filter, and output stages.
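For reference, a minimal single-node Compose sketch along those lines (the version tag is just an example, and security is disabled here purely for local experiments—a banking deployment would need TLS and proper secrets handling):

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false   # dev only; enable security for anything real
    ports:
      - "9200:9200"

  logstash:
    image: docker.elastic.co/logstash/logstash:8.13.0
    volumes:
      - ./pipeline:/usr/share/logstash/pipeline   # your .conf files go here
    ports:
      - "5044:5044"   # Beats input, where Filebeat ships to
    depends_on:
      - elasticsearch

  kibana:
    image: docker.elastic.co/kibana/kibana:8.13.0
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```

This keeps all three services on one Docker network, so Logstash and Kibana can reach Elasticsearch by its service name instead of localhost.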
One caveat: you generally don't want Logstash running on the same host as your app. It's better to run it separately so it can aggregate logs from multiple sources.
Yeah, I don't know if I'll make the deadline. I need to at least have something to show to my manager by then.
It sounds like you've been through a lot with this project! A straightforward fix could be to have Filebeat send logs directly to Elasticsearch, skipping Logstash entirely. Also, check out this example of a Logstash pipeline I made ages ago. It's pretty outdated, but it might give you a head start: github.com/Gorton218/elk_demo. Just remember, the modern ELK stack tends to use Fleet and Elasticsearch ingest pipelines instead of Logstash, so it’s worth diving into that.
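If you do stick with Logstash, here is a minimal pipeline sketch for your JSON logs, assuming Filebeat ships to the Beats input on 5044 (the index name and field names here are illustrative, not from the linked repo):

```
input {
  beats {
    port => 5044
  }
}

filter {
  # Parse the raw line as JSON. If parsing fails, Logstash adds a
  # _jsonparsefailure tag instead of dropping the event, so broken
  # filters show up in Kibana rather than silently losing data.
  json {
    source => "message"
  }

  # Stash a value in @metadata so it can drive the index name below
  # without being stored in the document itself.
  mutate {
    add_field => { "[@metadata][index_prefix]" => "banklogs" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][index_prefix]}-%{+YYYY.MM.dd}"
  }
  # Keep a stdout output while debugging so you can see whether
  # events are leaving Logstash at all.
  stdout { codec => rubydebug }
}
```

Your XML-format banks would need a separate branch (e.g. a conditional around an `xml` filter keyed off the tags you set in Filebeat), but getting the JSON path flowing end to end first will tell you whether the problem is in the filters or in the output.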
There are simpler logging stacks out there these days, so it’s worth evaluating alternatives, especially if you're feeling overwhelmed.