Scenario 1: I am deploying a Spring Boot app on a physical machine, where it opens an SSH listener on a port of that machine, say 9889. Once the application starts, the machine listens on that port for incoming requests. Any client can connect to it with:
ssh admin@ipaddress -p 9889
This returns the connection I expect.
Scenario 2: I am deploying that app into a Kubernetes cluster. I set the external IP of the service to the IP of the master node. When I type kubectl get services I get something like:
NAME       TYPE           CLUSTER-IP   EXTERNAL-IP   PORT(S)
app-entry  LoadBalancer   172.x.x.xx   172.x.x.x     3000:3552
How can I SSH to the app in the Kubernetes cluster using the external IP and port 3000? Every time I try to SSH with the command above, it returns connection refused.
As @zerkms mentioned in the comments, you can use kubectl exec
to open a shell in a container running inside the Kubernetes cluster.
$ kubectl exec -it -n <namespace> <pod_name> -- /bin/bash
# if the image has no bash, fall back to sh
$ kubectl exec -it -n <namespace> <pod_name> -- /bin/sh
# if the pod has multiple containers, pick one with -c
$ kubectl exec -it -n <namespace> <pod_name> -c <container_name> -- /bin/bash
If you have a server running in your pod that serves on a specific port, you can use kubectl port-forward
to reach it from your local machine (i.e. localhost:<local_port>).
# Pods
$ kubectl port-forward -n <namespace> pods/<pod_name> <local_port>:<container_port>
# Services
$ kubectl port-forward -n <namespace> svc/<service_name> <local_port>:<container_port>
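For your scenario, a minimal sketch (assuming the service is named app-entry in the default namespace and its port 3000 maps to the container's SSH listener — adjust the names and ports to your setup):

```shell
# Forward local port 9889 to service port 3000 inside the cluster.
# This runs in the foreground; keep it open in one terminal.
kubectl port-forward svc/app-entry 9889:3000

# In a second terminal, SSH through the tunnel via localhost:
ssh admin@127.0.0.1 -p 9889
```

Note that port-forward tunnels through the Kubernetes API server, so it works even when the service's external IP is not reachable from your machine, which is likely why your direct SSH attempt was refused.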