Azure: Unable to connect to cluster (Aks-engine) using kubectl

2/2/2019

SOLUTION

I had appended extra --feature-gates entries to kube-apiserver.yaml on the master node. This broke the API server, so kubectl could not connect to the cluster. After removing them, everything worked again.
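
For context, on an aks-engine master the API server runs as a static pod; its manifest is typically /etc/kubernetes/manifests/kube-apiserver.yaml (path assumed here, it is not shown in the post). A single malformed or unsupported --feature-gates entry in that manifest keeps the apiserver container from starting, and every kubectl call then times out. A minimal, hypothetical sketch of the kind of edit involved:

    # /etc/kubernetes/manifests/kube-apiserver.yaml (assumed path on the master)
    # Illustrative excerpt only: the image, port, and gate name are examples,
    # not values taken from the original cluster.
    apiVersion: v1
    kind: Pod
    metadata:
      name: kube-apiserver
      namespace: kube-system
    spec:
      containers:
      - name: kube-apiserver
        image: k8s.gcr.io/hyperkube-amd64:v1.10.12
        command:
        - /hyperkube
        - apiserver
        - --secure-port=443
        - --feature-gates=SomeAlphaGate=true   # the appended entry; removing it fixed the apiserver

Because the kubelet watches the manifests directory, deleting the offending line (or correcting it to a gate that actually exists in v1.10.x) is enough; the static pod is recreated automatically.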


PROBLEM

I deployed a Kubernetes cluster using aks-engine, but I get the error Unable to connect to the server: dial tcp 13.66.162.75:443: i/o timeout whenever I try to use kubectl. I can reach the master node through the serial console, but not over SSH (SSH fails with the same kind of timeout).

$ KUBECONFIG=_output/kubeconfig/kubeconfig.westus2.json kubectl get node
Unable to connect to the server: dial tcp 13.66.162.75:443: i/o timeout


$ KUBECONFIG=_output/kubeconfig/kubeconfig.westus2.json kubectl version
Client Version: version.Info{Major:"1", Minor:"8", GitVersion:"v1.8.6", GitCommit:"6260bb08c46c31eea6cb538b34a9ceb3e406689c", GitTreeState:"clean", BuildDate:"2017-12-21T06:34:11Z", GoVersion:"go1.8.3", Compiler:"gc", Platform:"linux/amd64"}
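
Since the master is reachable over the serial console, a quick check of whether the API server itself is running can help narrow this down. This is a diagnostic sketch that assumes Docker is the container runtime (the aks-engine default at the time); container names may differ.

# Run on the master node via the serial console.

# Is the kube-apiserver container up, or has it stopped?
$ sudo docker ps --filter "name=kube-apiserver"

# If it is missing, the kubelet log usually explains why
# (for example, an unrecognized --feature-gates value in the static pod manifest).
$ sudo journalctl -u kubelet --no-pager | tail -n 50

# Is anything listening locally on the API server port?
$ sudo ss -tlnp | grep 443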

Aks-Engine version - v0.28.1-linux-amd64

Kubernetes version - 1.10.12

Here is the kubeconfig.westus2.json file (certificate data redacted):

    {
        "apiVersion": "v1",
        "clusters": [
            {
                "cluster": {
                    "certificate-authority-data": "*****",
                    "server": "https://masquerade-az.westus2.cloudapp.azure.com"
                },
                "name": "masquerade-az"
            }
        ],
        "contexts": [
            {
                "context": {
                    "cluster": "masquerade-az",
                    "user": "masquerade-az-admin"
                },
                "name": "masquerade-az"
            }
        ],
        "current-context": "masquerade-az",
        "kind": "Config",
        "users": [
            {
                "name": "masquerade-az-admin",
                "user": {"client-certificate-data": "****", "client-key-data": "*****"}
            }
        ]
    }
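
The server field above is the public FQDN in front of the master. Probing it directly from the machine that runs kubectl helps separate a network/NSG problem from an apiserver that never started; this is a hypothetical check, not something from the original post.

# Does the name resolve, and does anything answer on port 443?
$ nslookup masquerade-az.westus2.cloudapp.azure.com
$ curl -vk --connect-timeout 10 https://masquerade-az.westus2.cloudapp.azure.com/healthz

# A connect timeout here mirrors the kubectl error and points at the NSG,
# the load balancer, or an apiserver that is not listening (as in this case).
# A fast TLS response, even a 401/403, would mean the apiserver is up.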

Screenshots of the inbound and outbound port rules are included in the original post.

-- Masquerade0097
azure
azure-aks
kubectl
kubernetes

1 Answer

2/10/2019

As shared by the original poster, the solution is:

I had appended extra --feature-gates entries to kube-apiserver.yaml on the master node. This broke the API server, so kubectl could not connect to the cluster. After removing them, everything worked again.

-- Karishma Tiwari - MSFT
Source: StackOverflow