Best Ways to Capture Huge Memory Dumps from Windows Pods in AKS

Asked By TechyNinja84

I'm looking for advice on how to reliably capture and store very large memory crash dumps (over 100 GB) from a Windows pod running in Azure Kubernetes Service (AKS) after a crash. It's crucial that these dumps are written without corruption and remain accessible for later download or inspection. I've tried a premium Azure managed disk (an azureDisk volume), but it hasn't been reliable for this. I'm also considering emptyDir, although I haven't tested that yet. Any suggestions would be greatly appreciated! Thanks!
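For reference, what I tried with the premium disk looks roughly like this (simplified; the names, size, image, and mount path are just placeholders, and C:\dumps is where my app writes its dumps):

# Premium managed disk PVC for crash dumps (placeholder names and size).
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: dump-pvc
spec:
  accessModes: ["ReadWriteOnce"]
  storageClassName: managed-csi-premium   # built-in AKS storage class
  resources:
    requests:
      storage: 256Gi
---
apiVersion: v1
kind: Pod
metadata:
  name: my-windows-app
spec:
  nodeSelector:
    kubernetes.io/os: windows
  containers:
    - name: app
      image: mcr.microsoft.com/windows/servercore:ltsc2022   # placeholder image
      volumeMounts:
        - name: dumps
          mountPath: C:\dumps   # the app is configured to write dumps here
  volumes:
    - name: dumps
      persistentVolumeClaim:
        claimName: dump-pvc

Swapping the volume for an emptyDir would be the other option I mentioned, but I haven't tested that yet.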

3 Answers

Answered By CuriousCoder55

What exactly is the use case you're looking at? If you're trying to debug something specific, knowing more about the application could help narrow down some solutions. Dealing with memory dumps can be tricky, especially in containers—so knowing the context is key!

TechyNinja84 -

I'm focused on debugging my app, which is what generates these large memory dumps. I just wish it didn't have to be a Windows container; they're such a pain to work with!

Answered By LaughingLoader29

You might want to look at a different storage backend for those large dumps, since the typical Azure managed disk approach isn't cutting it for you. Azure Files tends to work well for large data sets like this: the share is mounted into the Windows pod over SMB, it can be sized well beyond 100 GB, and the data lives outside the pod, which helps keep your dumps intact and downloadable after a crash. Just keep performance in mind; the premium (SSD-backed) file share tier gives noticeably better throughput when you're writing files that size. Good luck!
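I haven't tried this with dumps quite that big, so treat it as a rough sketch rather than a tested setup (the claim name, share size, image, and mount path below are placeholders), but on a recent AKS cluster with the Azure Files CSI driver enabled the wiring would look something like this:

# Azure Files share for crash dumps (placeholder names and size).
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: dump-share
spec:
  accessModes: ["ReadWriteMany"]            # SMB share, usable across pods
  storageClassName: azurefile-csi-premium   # built-in AKS storage class
  resources:
    requests:
      storage: 512Gi
---
apiVersion: v1
kind: Pod
metadata:
  name: crashy-windows-app
spec:
  nodeSelector:
    kubernetes.io/os: windows
  containers:
    - name: app
      image: mcr.microsoft.com/windows/servercore:ltsc2022   # placeholder image
      volumeMounts:
        - name: dump-share
          mountPath: C:\dumps   # point your dump writer at this path
  volumes:
    - name: dump-share
      persistentVolumeClaim:
        claimName: dump-share

Because the share is ReadWriteMany and lives outside the pod, the dump survives the crash, and you can pull it straight from the file share (through the portal, the az storage file commands, or by mounting the share on another machine) instead of having to copy it out of the container.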

Answered By DebuggingQueen72

Windows pods definitely require a different approach for handling dumps that size. If you're consistently producing 100 GB+ dumps, it may also be worth revisiting the application's memory management; shrinking its footprint would make both the crashes and the dump-handling problem smaller in the first place!
