Submit Spark job through "spark-on-k8s-operator" from inside a different pod

11/21/2019

Following this guide, I have deployed a "spark-on-k8s" operator inside my Kubernetes cluster.

The guide mentions that you can deploy Spark applications using kubectl commands.

Is it possible to deploy Spark applications from inside a different pod instead of via kubectl commands? For example, from a data pipeline application such as Apache NiFi or StreamSets.

-- toerq
apache-spark
kubernetes
kubernetes-pod

1 Answer

11/21/2019

Yes, you can create pods (or any other Kubernetes resource, such as the operator's SparkApplication objects) from inside another pod.

All you need is a ServiceAccount with an appropriate Role that allows creating the resources you need, assigned to the pod you are submitting from. The pod can then authenticate to the Kubernetes API server, using either the REST API or one of the Kubernetes client libraries, and create the resource.
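As a rough sketch, the RBAC setup could look something like this (the names spark-submitter and the default namespace are placeholders, not anything from your setup; the sparkoperator.k8s.io group is the operator's CRD group):

```yaml
# Sketch only -- adjust names, namespace, and verbs to your needs.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark-submitter
  namespace: default
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: spark-submitter-role
  namespace: default
rules:
- apiGroups: ["sparkoperator.k8s.io"]      # spark-on-k8s-operator custom resources
  resources: ["sparkapplications"]
  verbs: ["create", "get", "list", "watch", "delete"]
- apiGroups: [""]                          # core API group, for plain pods
  resources: ["pods"]
  verbs: ["create", "get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-submitter-binding
  namespace: default
subjects:
- kind: ServiceAccount
  name: spark-submitter
  namespace: default
roleRef:
  kind: Role
  name: spark-submitter-role
  apiGroup: rbac.authorization.k8s.io
```

Then set serviceAccountName: spark-submitter in the spec of the pod you submit from (for example your NiFi or StreamSets pod), so that its token is mounted into the container and used to authenticate against the API server.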

Read more about how to do this with the Kubernetes API in the Kubernetes documentation.

Also read here about how to create Roles.

And take a look here for a list of Kubernetes client libraries.
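For illustration, here is a minimal sketch using the official Python client from inside a pod. The SparkApplication manifest (image, jar path, resource sizes) is just the operator's spark-pi style example and is an assumption on my part; the group/version sparkoperator.k8s.io/v1beta2 is what the operator currently uses:

```python
from kubernetes import client, config

# Authenticate with the service account token mounted into this pod.
config.load_incluster_config()

api = client.CustomObjectsApi()

# Placeholder SparkApplication manifest -- adjust image, jar, and resources.
spark_app = {
    "apiVersion": "sparkoperator.k8s.io/v1beta2",
    "kind": "SparkApplication",
    "metadata": {"name": "spark-pi", "namespace": "default"},
    "spec": {
        "type": "Scala",
        "mode": "cluster",
        "image": "gcr.io/spark-operator/spark:v2.4.4",
        "mainClass": "org.apache.spark.examples.SparkPi",
        "mainApplicationFile": "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar",
        "sparkVersion": "2.4.4",
        "driver": {"cores": 1, "memory": "512m", "serviceAccount": "spark"},
        "executor": {"cores": 1, "instances": 1, "memory": "512m"},
    },
}

# Roughly equivalent to `kubectl apply -f spark-pi.yaml`, but done in-process.
api.create_namespaced_custom_object(
    group="sparkoperator.k8s.io",
    version="v1beta2",
    namespace="default",
    plural="sparkapplications",
    body=spark_app,
)
```

The same pattern works from any language with a Kubernetes client (Java, Go, etc.), which is how a NiFi processor or StreamSets stage could trigger the submission.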

Let me know if it was helpful.

-- HelloWorld
Source: StackOverflow