Are DevOps Engineers Integrating AI/ML Workloads in Their Work?

Asked By CleverFox27

I'm curious if anyone here is involved in deploying AI or machine learning workloads as part of their DevOps practices. What tools or projects are you working on, and how are you improving your skills in AI and ML?

5 Answers

Answered By TechieTrendz

I've been focusing on building Kubeflow workflows based on Jupyter notebooks to streamline model development and deployment. It's been quite interesting, but I feel like it’s just the tip of the iceberg when it comes to MLOps.

Answered By CloudWizard88

I've started using an AI-assisted tool to detect IaC drift in Terraform. It learns patterns and offers suggestions, which is pretty neat. The best part? It's free!
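For anyone who wants basic drift detection without a dedicated tool, Terraform itself can report drift: `terraform plan -detailed-exitcode` exits with code 2 when the plan has pending changes. A minimal sketch of a wrapper around that behavior (the `check_drift` helper is my own illustration, not part of any specific tool; it assumes `terraform` is on the PATH and the working directory is already initialized):

```python
import subprocess

# Exit codes for `terraform plan -detailed-exitcode`:
#   0 = no changes, 1 = error, 2 = changes pending (i.e., drift vs. state)
def classify_plan_exit(code: int) -> str:
    return {0: "in-sync", 1: "error", 2: "drift"}.get(code, "unknown")

def check_drift(workdir: str) -> str:
    """Hypothetical helper: runs a plan and classifies the result.

    Assumes `terraform init` has already been run in workdir.
    """
    result = subprocess.run(
        ["terraform", "plan", "-detailed-exitcode", "-input=false", "-no-color"],
        cwd=workdir,
        capture_output=True,
        text=True,
    )
    return classify_plan_exit(result.returncode)
```

You could run `check_drift` on a schedule (e.g., a nightly CI job) and alert when it returns "drift"; an ML layer like the tool above would sit on top of output like this, learning which drifts are routine and which need attention.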

Answered By DataDynamo_01

We're currently using AI for monitoring and alerting at Okahu, specifically for anomaly detection in logs and predictive scaling. I’ve learned a lot from Andrew Ng's course, but honestly, most of my learning comes from trial and error during production fixes. I’ve also been experimenting with LLMs to generate Terraform configs from plain text, and it works well about 60% of the time, which isn’t terrible!
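For context, the simplest form of log anomaly detection is a rolling z-score over event rates: flag any interval whose count sits more than a few standard deviations from the recent baseline. A toy sketch of the idea (purely illustrative, not Okahu's actual implementation; window size and threshold are arbitrary choices):

```python
from collections import deque
from statistics import mean, stdev

class RateAnomalyDetector:
    """Flags log-volume spikes via a rolling z-score."""

    def __init__(self, window: int = 30, threshold: float = 3.0):
        self.window = deque(maxlen=window)  # recent per-interval counts
        self.threshold = threshold          # z-score cutoff for an alert

    def observe(self, count: int) -> bool:
        """Return True if `count` is anomalous vs. the rolling baseline."""
        is_anomaly = False
        if len(self.window) >= 2:
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(count - mu) / sigma > self.threshold:
                is_anomaly = True
        self.window.append(count)
        return is_anomaly
```

Feed it per-minute error counts: a steady baseline around 100 passes quietly, while a sudden spike to 500 trips the detector. Production systems replace the z-score with seasonal baselines and learned models, but the shape of the pipeline is the same.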

Answered By AI_Enthusiast42

I've done some testing with AI tools aimed at students, and I've heard that enterprise-level tools are even more impressive. There are plenty of online resources about deploying AI agents locally. I enjoy the system design side of it, along with reviewing code and logic.

Answered By ML_Explorer

Absolutely! Many of us in DevOps are now rolling out AI and ML workloads. A lot of the upskilling happens through hands-on projects and learning MLOps fundamentals, like Docker, Kubernetes, MLflow, and Kubeflow, rather than diving deep into ML theory.
