Receiving an Unauthorized error when trying to create a CodePipeline on an AWS Kubernetes (EKS) cluster (permission error)

3/15/2020

I am experiencing an issue while trying to create a pipeline that connects to my GitHub repository, using an AWS Kubernetes (EKS) cluster.

The creation process of my AWS cluster completes with no issues. Afterwards, I attach a role to it and create a stack using a .yml file. The build starts normally, but once it reaches the post-build section I receive an Unauthorized error from my cluster and the build fails (the full error output is shown below).
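
For context, the post-build phase of the CodeBuild buildspec runs the two commands that appear in the error log below. A minimal sketch of that phase, reconstructed from the log (the rest of my buildspec is omitted here):

    phases:
      post_build:
        commands:
          # Point kubectl at the EKS cluster named in the build environment
          - aws eks update-kubeconfig --name $EKS_CLUSTER_NAME
          # Apply the Service/Deployment manifest to the cluster
          - kubectl apply -f simple_jwt_api.yml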

I have tried reinstalling the awscli, adding more specific permissions to the user's role, and associating the user with the role that handles the build, but had no luck. I have also confirmed that the cluster is running in the correct region, matching my settings and code, to no avail.

The error output during the post-build phase:

    [Container] 2020/03/10 15:00:56 Running command aws eks update-kubeconfig --name $EKS_CLUSTER_NAME
    Added new context arn:aws:eks:eu-west-2:<acc-id>:cluster/simple-jwt-api to /root/.kube/config
    [Container] 2020/03/10 15:00:57 Running command kubectl apply -f simple_jwt_api.yml
    Error from server (Forbidden): error when retrieving current configuration of:
    Resource: "/v1, Resource=services", GroupVersionKind: "/v1, Kind=Service"
    Name: "simple-jwt-api", Namespace: "default"
    Object: &{map["apiVersion":"v1" "kind":"Service" "metadata":map["annotations":map["kubectl.kubernetes.io/last-applied-configuration":""] "name":"simple-jwt-api" "namespace":"default"] "spec":map["ports":[map["port":'P' "targetPort":'\u1f90']] "selector":map["app":"simple-jwt-api"] "type":"LoadBalancer"]]}
    from server for: "simple_jwt_api.yml": services "simple-jwt-api" is forbidden: User "build" cannot get resource "services" in API group "" in the namespace "default"
    Error from server (Forbidden): error when retrieving current configuration of:
    Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
    Name: "simple-jwt-api", Namespace: "default"
    Object: &{map["apiVersion":"apps/v1" "kind":"Deployment" "metadata":map["annotations":map["kubectl.kubernetes.io/last-applied-configuration":""] "name":"simple-jwt-api" "namespace":"default"] "spec":map["replicas":'\x03' "selector":map["matchLabels":map["app":"simple-jwt-api"]] "strategy":map["rollingUpdate":map["maxSurge":'\x02' "maxUnavailable":'\x02'] "type":"RollingUpdate"] "template":map["metadata":map["labels":map["app":"simple-jwt-api"]] "spec":map["containers":[map["image":"<acc-id>.dkr.ecr.eu-west-2.amazonaws.com/deplo-ecrdo-1m8nnevsfzyoj:Deploy-Flask-App-to-Kubernetes-Using-EKS.master..2020-03-10.14.59.13.fd6845ea" "name":"simple-jwt-api" "ports":[map["containerPort":'\u1f90']] "securityContext":map["allowPrivilegeEscalation":%!q(bool=false) "privileged":%!q(bool=false) "readOnlyRootFilesystem":%!q(bool=false)]]]]]]]}
    from server for: "simple_jwt_api.yml": deployments.apps "simple-jwt-api" is forbidden: User "build" cannot get resource "deployments" in API group "apps" in the namespace "default"
    [Container] 2020/03/10 15:00:58 Command did not exit successfully kubectl apply -f simple_jwt_api.yml exit status 1
    [Container] 2020/03/10 15:00:58 Phase complete: POST_BUILD State: FAILED
    [Container] 2020/03/10 15:00:58 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: kubectl apply -f simple_jwt_api.yml. Reason: exit status 1
    [Container] 2020/03/10 15:00:58 Expanding base directory path: .
    [Container] 2020/03/10 15:00:58 Assembling file list
    [Container] 2020/03/10 15:00:58 Expanding .
    [Container] 2020/03/10 15:00:58 Expanding file paths for base directory .
    [Container] 2020/03/10 15:00:58 Assembling file list
    [Container] 2020/03/10 15:00:58 Expanding build.json
    [Container] 2020/03/10 15:00:58 Skipping invalid file path build.json
    [Container] 2020/03/10 15:00:58 Phase complete: UPLOAD_ARTIFACTS State: FAILED
    [Container] 2020/03/10 15:00:58 Phase context status code: CLIENT_ERROR Message: no matching artifact paths found
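
For readability, the Go-style object dumps in the error above decode to roughly the following simple_jwt_api.yml (my reconstruction; the rune 'P' is decimal 80, '\u1f90' is 8080, '\x03' is 3, and '\x02' is 2):

    # Service exposing the app through a load balancer
    apiVersion: v1
    kind: Service
    metadata:
      name: simple-jwt-api
      namespace: default
    spec:
      type: LoadBalancer
      selector:
        app: simple-jwt-api
      ports:
        - port: 80          # 'P' in the dump
          targetPort: 8080  # '\u1f90' in the dump
    ---
    # Deployment running three replicas of the image from ECR
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: simple-jwt-api
      namespace: default
    spec:
      replicas: 3
      selector:
        matchLabels:
          app: simple-jwt-api
      strategy:
        type: RollingUpdate
        rollingUpdate:
          maxSurge: 2
          maxUnavailable: 2
      template:
        metadata:
          labels:
            app: simple-jwt-api
        spec:
          containers:
            - name: simple-jwt-api
              image: <acc-id>.dkr.ecr.eu-west-2.amazonaws.com/deplo-ecrdo-1m8nnevsfzyoj:Deploy-Flask-App-to-Kubernetes-Using-EKS.master..2020-03-10.14.59.13.fd6845ea
              ports:
                - containerPort: 8080
              securityContext:
                allowPrivilegeEscalation: false
                privileged: false
                readOnlyRootFilesystem: false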

I would also like to add that my .kube/config file had an env set to null for some reason. I do have an aws-auth file which should have authorized the build and handled the configuration; it associates the account, the user, the GitHub user, and the role for eksctl.
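
The Forbidden messages say the cluster maps the build's identity to the Kubernetes user "build", and that user has no RBAC rights in the cluster. If I understand it correctly, the aws-auth ConfigMap entry that should grant the CodeBuild role access would look roughly like this (a sketch; the role ARN is a placeholder for whatever role my CodeBuild project assumes):

    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: aws-auth
      namespace: kube-system
    data:
      mapRoles: |
        # Placeholder ARN: the IAM role the CodeBuild project assumes.
        # Mapping it into system:masters grants full cluster access.
        - rolearn: arn:aws:iam::<acc-id>:role/<codebuild-role>
          username: build
          groups:
            - system:masters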

I apologize for the vagueness; I am quite new to programming. Does anyone have an idea of what is causing my build to fail, so that I can look in the right direction to fix the issue?

Thank you for your time and patience.

-L

-- Omer Lewis
amazon-web-services
api
kubernetes
permissions

0 Answers