How do I ssh into a node inside the cluster locally? I am using the Docker edge version, which has Kubernetes built in. If I run kubectl ssh node I get:
Error: unknown command "ssh" for "kubectl"
Did you mean this?
set
Run 'kubectl --help' for usage.
You can effectively shell into a pod using exec (I know it's not exactly what the question asks, but it might be helpful). An example usage would be kubectl exec -it name-of-your-pod -- /bin/bash, assuming you have bash installed in the container.
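A minimal sketch of the exec approach (the pod and container names here are placeholders, not from the question):

```shell
# Open an interactive shell in a pod; fall back to /bin/sh
# for minimal images (alpine, distroless-ish) that lack bash.
kubectl exec -it name-of-your-pod -- /bin/bash \
  || kubectl exec -it name-of-your-pod -- /bin/sh

# In a multi-container pod, pick the container with -c:
kubectl exec -it name-of-your-pod -c my-container -- /bin/sh
```

Note this gives you a shell inside the container's namespaces, not on the node itself.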
Hope that helps.
How to ssh to the node inside the cluster in local
There is no "ssh" command in kubectl yet, but there are plenty of options for accessing a Kubernetes node's shell.
If you are using a cloud provider, you can connect to nodes directly from the instance management interface. For example, in GCP: select Menu -> Compute Engine -> VM instances, then press the SSH button next to the desired node instance.
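On GCP you can also do the same from the command line with the gcloud CLI (instance name and zone below are placeholders for your own node):

```shell
# SSH to a node VM via the gcloud CLI; this provisions and uses
# an SSH key for you, same as the console's SSH button.
gcloud compute ssh my-node-instance --zone us-central1-a

# Or run a one-off command on the node without an interactive shell:
gcloud compute ssh my-node-instance --zone us-central1-a --command "uptime"
```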
If you are using a local VM (VMware, VirtualBox), you can configure sshd before rolling out the Kubernetes cluster, or use the VM console, which is available from the management GUI.
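For the local-VM case, a rough sketch of enabling sshd from the VM console and then connecting from the host (the user name and IP address are placeholders for your own VM):

```shell
# Inside the VM console, enable and start sshd
# (on systemd-based distros; the unit may be named ssh on Debian/Ubuntu):
sudo systemctl enable --now sshd

# Then, from the host, connect to the VM's address:
ssh user@192.168.56.10
```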
Vagrant provides its own command to access VMs - vagrant ssh
If you are using minikube, there is the minikube ssh command to connect to the minikube VM. There are also other options.
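For completeness, a sketch of the minikube variant (the journalctl invocation is just an illustrative node-level command):

```shell
# Open an interactive shell on the minikube node:
minikube ssh

# Or run a single command on the node non-interactively:
minikube ssh -- sudo journalctl -u kubelet --no-pager
```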
I found no simple way to access the docker-for-desktop VM, but you can easily switch to minikube for experimenting with node settings.