How to fix: pods "" is forbidden: User "system:anonymous" cannot watch resource "pods" in API group "" in the namespace "default"

5/24/2020

I am trying to run Spark on Kubernetes (k8s). I have set up RBAC using the commands below:

kubectl create serviceaccount spark

kubectl create clusterrolebinding spark-role --clusterrole=edit --serviceaccount=default:spark --namespace=default
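
As a sanity check, I believe the binding can be verified by impersonating the service account with kubectl auth can-i (assuming the default namespace):

kubectl auth can-i watch pods --as=system:serviceaccount:default:spark --namespace=default
# should print "yes" if the edit ClusterRole is bound correctly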

The spark-submit command, run from outside the k8s cluster:

bin/spark-submit --master k8s://https://<master_ip>:6443  --deploy-mode cluster  --conf spark.kubernetes.authenticate.submission.caCertFile=/usr/local/spark/spark-2.4.5-bin-hadoop2.7/ca.crt --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark   --conf spark.kubernetes.container.image=bitnami/spark:latest test.py

error:

   Exception in thread "main" io.fabric8.kubernetes.client.KubernetesClientException: pods "test-py-1590306482639-driver" is forbidden: User "system:anonymous" cannot watch resource "pods" in API group "" in the namespace "default"
    at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager$1.onFailure(WatchConnectionManager.java:206)
    at okhttp3.internal.ws.RealWebSocket.failWebSocket(RealWebSocket.java:571)
    at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:198)
    at okhttp3.RealCall$AsyncCall.execute(RealCall.java:206)
    at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
    Suppressed: java.lang.Throwable: waiting here
        at io.fabric8.kubernetes.client.utils.Utils.waitUntilReady(Utils.java:134)
        at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager.waitUntilReady(WatchConnectionManager.java:350)
        at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:759)
        at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:738)
        at io.fabric8.kubernetes.client.dsl.base.BaseOperation.watch(BaseOperation.java:69)
        at org.apache.spark.deploy.k8s.submit.Client$$anonfun$run$1.apply(KubernetesClientApplication.scala:140)
        at org.apache.spark.deploy.k8s.submit.Client$$anonfun$run$1.apply(KubernetesClientApplication.scala:140)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2542)
        at org.apache.spark.deploy.k8s.submit.Client.run(KubernetesClientApplication.scala:140)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication$$anonfun$run$5.apply(KubernetesClientApplication.scala:250)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication$$anonfun$run$5.apply(KubernetesClientApplication.scala:241)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2543)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:241)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:204)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
20/05/24 07:48:04 INFO ShutdownHookManager: Shutdown hook called
20/05/24 07:48:04 INFO ShutdownHookManager: Deleting directory /tmp/spark-f0eeb957-a02e-458f-8778-21fb2307cf42
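
My reading of the error is that the watch request from the submission client reached the API server without any credentials, so it was treated as system:anonymous. From the Spark on Kubernetes docs, the submission client authenticates separately from the driver pod's service account, and caCertFile only lets the client verify the server's certificate. Below is a sketch of what I am planning to try: passing a bearer token via spark.kubernetes.authenticate.submission.oauthToken (the token extraction assumes the cluster auto-creates a token secret for the service account, as clusters of this era do):

# grab the token from the spark service account's auto-created secret
SECRET=$(kubectl get serviceaccount spark -o jsonpath='{.secrets[0].name}')
TOKEN=$(kubectl get secret "$SECRET" -o jsonpath='{.data.token}' | base64 --decode)

# same submit command as above, plus the submission-side token
bin/spark-submit --master k8s://https://<master_ip>:6443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.authenticate.submission.caCertFile=/usr/local/spark/spark-2.4.5-bin-hadoop2.7/ca.crt \
  --conf spark.kubernetes.authenticate.submission.oauthToken="$TOKEN" \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
  --conf spark.kubernetes.container.image=bitnami/spark:latest \
  test.py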

Spark Docker image source: docker pull bitnami/spark

I am also passing the ca.crt file, which is present on the master node of the k8s cluster. I am running the spark-submit command from another GCP instance.
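
To check whether the token and certificate even work outside of Spark, I assume the same auth path can be exercised with plain kubectl from that GCP instance ($TOKEN as extracted above):

kubectl --server=https://<master_ip>:6443 \
  --certificate-authority=/usr/local/spark/spark-2.4.5-bin-hadoop2.7/ca.crt \
  --token="$TOKEN" \
  get pods --namespace=default
# if this also reports system:anonymous, the credentials never reach the API server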

Can someone please help me here? I have been stuck on this for the last couple of days.

Edit

I have created another ClusterRoleBinding with the cluster-admin ClusterRole, but it is still not working.
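
For reference, the binding I created is roughly equivalent to the command below (the spark-admin name is illustrative). My suspicion is that any such binding only grants permissions to the spark service account, while the forbidden request above comes in as system:anonymous, i.e. the submission client is not authenticating at all:

kubectl create clusterrolebinding spark-admin --clusterrole=cluster-admin --serviceaccount=default:spark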

-- user7422128
apache-spark
google-cloud-platform
kubernetes

0 Answers