Help! Worker Nodes Can’t Connect to AKS Private Endpoint

Asked By CuriousCoder123 On

I'm setting up a private Azure Kubernetes Service (AKS) cluster, and while everything seems to be configured correctly, I'm stuck with a problem. The AKS cluster registers properly in the private DNS zone, and the default node pool is created in the right subnet. Pinging the API hostname even shows it resolving to the Private Endpoint's IP.
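For context, this is roughly how I verified the resolution; the resource group and cluster names below are placeholders, not my real ones:

    # Get the private API server FQDN (placeholder names, adjust to your environment)
    az aks show --resource-group my-rg --name my-private-aks --query "privateFqdn" -o tsv

    # From a VM in the node subnet, confirm the FQDN resolves to the Private Endpoint IP
    nslookup <private-fqdn-from-above>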

However, the main problem is that the worker nodes can't reach the cluster through the Private Endpoint. They can communicate with each other, but the Private Endpoint doesn't respond to pings or HTTPS requests, even though it sits in the same subnet as the worker nodes.
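In case it helps, a sketch of the kind of checks I ran from a VM in that subnet (the IP and FQDN are placeholders). Since Private Endpoints don't answer ICMP anyway, a TCP test on 443 is the more telling check:

    # Test raw TCP connectivity to the Private Endpoint on port 443 (IP is a placeholder)
    nc -vz 10.240.0.4 443

    # Any HTTP response at all (even 401/403) would prove the node can reach the API server
    curl -vk https://<private-fqdn>/healthz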

I've tried creating the AKS cluster both with Terraform and with Azure CLI using the script provided. I've also consulted Microsoft support, but no solution has emerged so far. What should I check next, or what might I have overlooked?
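For reference, a minimal sketch of the kind of az aks create call involved; this is not my actual script, and all names and the subnet ID are placeholders:

    # Minimal private-cluster creation sketch (illustrative values only)
    az aks create \
      --resource-group my-rg \
      --name my-private-aks \
      --enable-private-cluster \
      --private-dns-zone system \
      --network-plugin azure \
      --vnet-subnet-id "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Network/virtualNetworks/my-vnet/subnets/aks-subnet"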

2 Answers

Answered By TechieTim83 On

Have you checked the Network Security Group (NSG) settings? Make sure the required outbound rules for the nodes are in place on the NSG, and on any firewall in the path. Traffic restrictions like these can block the nodes from reaching the API endpoint.
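If it helps, something along these lines would show what rules are actually in effect; the resource group, NSG, and NIC names below are just examples:

    # List the rules on the NSG attached to the node subnet (example names)
    az network nsg rule list --resource-group my-rg --nsg-name aks-subnet-nsg -o table

    # Or check the effective NSG rules applied to one node's NIC in the managed (MC_) resource group
    az network nic list-effective-nsg --resource-group MC_my-rg_my-private-aks_eastus --name <node-nic-name>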

Answered By AzureGuru77 On

Is your Private DNS zone properly linked to the VNet where the nodes are located? Even if DNS resolution works through a centralized setup, the AKS private DNS zone must be linked to the VNet that contains the AKS nodes, or they won't be able to reach the Kubernetes API.
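A rough way to check and, if needed, add the link; the MC_ resource group and zone name below are illustrative, yours will differ:

    # List which VNets are linked to the cluster's private DNS zone (example names)
    az network private-dns link vnet list \
      --resource-group MC_my-rg_my-private-aks_eastus \
      --zone-name <guid>.privatelink.eastus.azmk8s.io -o table

    # If the node VNet is missing, link it (auto-registration is not needed for resolution)
    az network private-dns link vnet create \
      --resource-group MC_my-rg_my-private-aks_eastus \
      --zone-name <guid>.privatelink.eastus.azmk8s.io \
      --name aks-node-vnet-link \
      --virtual-network <node-vnet-resource-id> \
      --registration-enabled false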
