I want to set up CI/CD for deploying pods to EKS using Azure DevOps. First, I installed kubectl and aws-iam-authenticator on my system (Windows) and created a Kubernetes cluster in AWS. To communicate with the cluster, I updated my kubeconfig with the following command:

aws eks --region us-east-2 update-kubeconfig --name cluster_name

I then copied the updated config file and pasted it into a new Azure DevOps service connection. While verifying the test connection, I get the error below:
No user credentials found for a cluster in KubeConfig content. Make sure that the credentials exist and try again.
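For reference, these are the commands I would use locally to confirm that the exec credentials resolve (region and cluster name taken from the kubeconfig below; the AWS CLI profile is the same one I used for update-kubeconfig):

# mint a token the same way the exec plugin in the kubeconfig does
aws eks get-token --region us-east-1 --cluster-name testconnection

# confirm kubectl can reach the cluster with this kubeconfig
kubectl config current-context
kubectl get nodes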
Below are my kubeconfig and aws-auth-cm.yaml files.
apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: *********
    server: https://*************.gr7.us-east-1.eks.amazonaws.com
  name: arn:aws:eks:us-east-1:*********:cluster/testconnection
contexts:
- context:
    cluster: arn:aws:eks:us-east-1:*********:cluster/testconnection
    user: arn:aws:eks:us-east-1:********:cluster/testconnection
  name: arn:aws:eks:us-east-1:**********:cluster/testconnection
current-context: arn:aws:eks:us-east-1:*******:cluster/testconnection
kind: Config
preferences: {}
users:
- name: arn:aws:eks:us-east-1:*********:cluster/testconnection
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      args:
      - --region
      - us-east-1
      - eks
      - get-token
      - --cluster-name
      - testconnection
      command: aws
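My guess is that the exec block above is the part the service connection cannot handle, since evaluating it would require the aws CLI on the Azure DevOps side. If I understand correctly, a user entry with an embedded token instead of the exec plugin would look roughly like this (the token value is a placeholder; it would come from the status.token field of the aws eks get-token output):

users:
- name: arn:aws:eks:us-east-1:*********:cluster/testconnection
  user:
    token: <status.token value from aws eks get-token>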
aws-auth-cm.yaml:

apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: arn:aws:iam::*********:role/eks_nodegroup
      username: system:node:{{EC2PrivateDNSName}}
      groups:
        - system:bootstrappers
        - system:nodes
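In case a step is missing here, these are the standard commands I understand are used to apply and then inspect this ConfigMap on the cluster:

# apply the node role mapping
kubectl apply -f aws-auth-cm.yaml

# confirm what is currently stored in the cluster
kubectl get configmap aws-auth -n kube-system -o yaml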
So, could anyone point out which configuration I have missed to establish the connection from Azure DevOps to EKS?