I'm curious about how we can incorporate AI and LLMs into our standard DevOps tools, like monitoring, telemetry, logging, and metrics. Is there a good way to start using LLMs or fine-tuning models in our day-to-day operations? I'm particularly interested in how to create wrappers or interfaces to facilitate this integration. Has anyone else gone through this process recently and can share their insights?
5 Answers
You'll want to set up an interface, probably in Python, that sends your data to ChatGPT or another LLM and returns the response. Starting with a thin wrapper and some basic prompt logic is key. That's what I did, beginning with generating metrics reports for a UI, and it's been a game changer!
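Here's a minimal sketch of what that wrapper can look like. To be clear, this is just a starting point: it assumes the openai Python SDK with an OPENAI_API_KEY in the environment, and the model name and metrics payload are placeholders you'd swap for your own.

```python
# Minimal sketch of an LLM wrapper for metrics summaries.
# Assumes the openai Python SDK and OPENAI_API_KEY in the environment;
# the metrics payload and model name are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_metrics(metrics: dict) -> str:
    """Send a metrics snapshot to the LLM and return a plain-text summary."""
    prompt = (
        "You are a DevOps assistant. Summarize these service metrics, "
        f"flagging anything anomalous:\n{metrics}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # swap in whatever model you have access to
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    sample = {"p99_latency_ms": 840, "error_rate": 0.031, "cpu_util": 0.92}
    print(summarize_metrics(sample))
```

Once that works, feeding the summaries into your UI or a dashboard endpoint is the easy part.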
I've seen companies using AI to enhance security scanning in code repositories, like jit.io. That could be a solid direction to explore for integrating AI into your workflow!
I built a Kubernetes app that takes pod crash logs, analyzes them with an LLM, and posts the summaries to Slack. It's been really effective and surprisingly fun! It just shows how useful piping logs into an API for insights can be.
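For anyone curious, the core of it looks roughly like this. This is a sketch rather than my production code: it assumes the kubernetes and openai Python packages plus a Slack incoming webhook, and the pod name, namespace, and webhook URL are placeholders.

```python
# Rough sketch of the crash-log-to-Slack pipeline described above.
# Assumes the kubernetes and openai Python packages plus a Slack incoming
# webhook; the pod name, namespace, and webhook URL are placeholders.
import requests
from kubernetes import client, config
from openai import OpenAI

SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def crash_log_to_slack(pod: str, namespace: str = "default") -> None:
    config.load_kube_config()  # use load_incluster_config() when running in-cluster
    core = client.CoreV1Api()
    # previous=True grabs logs from the last (crashed) container instance
    logs = core.read_namespaced_pod_log(pod, namespace, previous=True, tail_lines=200)

    llm = OpenAI()
    resp = llm.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model works here
        messages=[{
            "role": "user",
            "content": f"Summarize the likely cause of this pod crash:\n{logs}",
        }],
    )
    summary = resp.choices[0].message.content

    # Post the summary to Slack via the incoming webhook
    requests.post(SLACK_WEBHOOK, json={"text": f"*{pod}* crashed:\n{summary}"})
```

In practice I trigger it from a watch on pod events, but even running it by hand against a CrashLoopBackOff pod saves a lot of log spelunking.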
Integrating AI isn't straightforward, but sometimes you just have to roll with what your boss wants! Even if it feels unnecessary, management might push for it just to stay on trend.
I've been working on setting up K8sGPT and connecting it to MCP servers, which lets the LLM handle tasks like automatically creating or updating Jira tickets and triggering builds in our CI/CD tools. We're also using LLMs to streamline writing scripts and IaC templates, and they're super useful for analyzing security scan results too!
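On the security-scan point, here's a hedged sketch of the kind of triage step I mean. This isn't part of K8sGPT itself: it assumes a Trivy JSON report on disk and the openai SDK, and the file path, model name, and prompt are all placeholders.

```python
# Sketch: feed a Trivy JSON scan report to an LLM for triage.
# Not part of K8sGPT; the report path, model, and prompt are placeholders.
import json
from openai import OpenAI

def triage_scan(report_path: str) -> str:
    with open(report_path) as f:
        report = json.load(f)

    # Keep the payload small: pull out just the vulnerability entries.
    vulns = [
        {"id": v.get("VulnerabilityID"), "severity": v.get("Severity"),
         "pkg": v.get("PkgName")}
        for result in report.get("Results", [])
        for v in result.get("Vulnerabilities", [])
    ]

    llm = OpenAI()
    resp = llm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": ("Group these CVEs by severity and suggest which to "
                        f"fix first, with one-line reasons:\n{json.dumps(vulns)}"),
        }],
    )
    return resp.choices[0].message.content

print(triage_scan("trivy-report.json"))  # placeholder path
```

The nice part is the LLM output drops straight into a Jira ticket body, so the triage and the ticketing automation feed each other.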
Haha, so true! Is there any practical way you think it could actually add value?