`kubectl version` on CodeBuild prints an error:
```
[Container] 2019/08/26 04:07:32 Running command kubectl version
Client Version: version.Info{Major:"1", Minor:"15", GitVersion:"v1.15.0", GitCommit:"e8462b5b5dc2584fdcd18e6bcfe9f1e4d970a529", GitTreeState:"clean", BuildDate:"2019-06-19T16:40:16Z", GoVersion:"go1.12.5", Compiler:"gc", Platform:"linux/amd64"}
error: You must be logged in to the server (the server has asked for the client to provide credentials)
error: You must be logged in to the server (the server has asked for the client to provide credentials)
```
I'm using an Amazon EKS cluster. It seems some authentication setup is missing...?
What I did:

1. Created a CodeBuild project (a service role `codebuild-hoge-service-role` is created).
2. Attached an `eks:DescribeCluster` policy to the role as an inline policy, because `aws eks update-kubeconfig` requires it (a sketch of that policy follows this list).
3. Edited `configmap/aws-auth` to bind the role to RBAC, running `kubectl edit -n kube-system configmap/aws-auth` on my local machine and adding a new entry to `mapRoles` like:

```yaml
mapRoles: |
  - rolearn: .....
  - rolearn: arn:aws:iam::999999999999:role/service-role/codebuild-hoge-service-role
    username: codebuild
    groups:
    - system:masters
```
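The inline policy attached in step 2 looks something like this (a minimal sketch; the region `ap-northeast-1` and the cluster ARN are assumptions, adjust them to your own cluster):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "eks:DescribeCluster",
      "Resource": "arn:aws:eks:ap-northeast-1:999999999999:cluster/mycluster"
    }
  ]
}
```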
That's all. Not enough? Is there anything I missed?
Also, I tried another approach to debug, and it worked successfully:

1. Edit `configmap/aws-auth` and add config for the role (same as the failing process above).
2. Run `kubectl version`. It worked!!
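To narrow down the difference between the two environments, the identity that `kubectl` authenticates as can be printed with STS (a sketch; inside CodeBuild this shows an assumed-role ARN for the service role, not the plain role ARN):

```sh
# Print the AWS identity the CLI (and therefore update-kubeconfig) resolves to.
# Inside CodeBuild this is an assumed-role ARN such as:
#   arn:aws:sts::999999999999:assumed-role/codebuild-hoge-service-role/<session>
aws sts get-caller-identity
```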
Here is my `buildspec.yml`:

```yaml
version: 0.2
phases:
  install:
    runtime-versions:
      docker: 18
    commands:
      - curl -LO https://storage.googleapis.com/kubernetes-release/release/v1.15.0/bin/linux/amd64/kubectl
      - chmod +x ./kubectl
      - mv -f ./kubectl /usr/local/bin/kubectl
  pre_build:
    commands:
      - aws eks update-kubeconfig --name mycluster
      - kubectl version
  build:
    commands:
      - kubectl get svc -A
```
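For what it's worth, `aws eks update-kubeconfig` writes a kubeconfig that fetches a token at request time through an exec plugin, so whatever IAM identity the build runs as is what the cluster sees. The generated user entry looks roughly like this (a sketch; the account, region, and exact command depend on your AWS CLI version, and older CLIs emit `aws-iam-authenticator token` instead of `aws eks get-token`):

```yaml
# Sketch of the user entry that update-kubeconfig generates (ARN values assumed).
users:
- name: arn:aws:eks:ap-northeast-1:999999999999:cluster/mycluster
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      command: aws
      args: ["eks", "get-token", "--cluster-name", "mycluster"]
```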