io.fabric8.kubernetes.client.KubernetesClientException: Operation: [create] for kind: [Pod] with name: [null] in namespace: [spark-dev] failed

4/29/2020

I ran into a strange issue while running Spark on Kubernetes from Scala. I am using the code below to create 2 executors. It works fine on bare metal, but when I run the same code from inside a Kubernetes container, it fails with the following error:

Code

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SQLContext, SparkSession}

// localIpAddress (defined elsewhere) is the IP the driver binds to
// and advertises to the executors
val conf = new SparkConf().setAppName("Spark_test")
  .setMaster("k8s://https://10.00.20.70:6443")
  .set("spark.kubernetes.container.image", "lumenore/spark:spark-2.4.4-bin-hadoop2.7")
  .set("spark.submit.deployMode", "cluster")
  .set("spark.kubernetes.authenticate.driver.serviceAccountName", "spark-dev")
  .set("spark.kubernetes.namespace", "spark-dev")
  .set("spark.driver.host", localIpAddress)
  .set("spark.driver.bindAddress", localIpAddress)
  .set("spark.kubernetes.node.selector.type", "spark")
  .set("spark.sql.caseSensitive", "true")
  .set("spark.sql.crossJoin.enabled", "true")
  .set("spark.driver.allowMultipleContexts", "true")
  .set("spark.driver.memory", "4g")
  .set("spark.executor.memory", "4g")
  .set("spark.executor.cores", "4")

val sc = new SparkContext(conf)
val spark = SparkSession.builder().appName("sparktest").config(conf).getOrCreate()
val sqlContext = new SQLContext(sc)
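
As a side note, unrelated to the error itself: in Spark 2.x the three handles above can come from a single SparkSession, which avoids the deprecated SQLContext constructor and the allowMultipleContexts workaround. A minimal sketch, reusing the conf built above:

// Sketch: one SparkSession instead of separate SparkContext/SQLContext;
// `conf` is the SparkConf constructed in the block above.
val spark = SparkSession.builder().config(conf).getOrCreate()
val sc = spark.sparkContext        // replaces new SparkContext(conf)
val sqlContext = spark.sqlContext  // replaces new SQLContext(sc)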

Error

**io.fabric8.kubernetes.client.KubernetesClientException: Operation: [create] for kind: [Pod] with name: [null] in namespace: [spark-dev] failed.**
at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:64)
at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:72)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.create(BaseOperation.java:337)
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.create(BaseOperation.java:330)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$onNewSnapshots$10(ExecutorPodsAllocator.scala:139)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:158)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.onNewSnapshots(ExecutorPodsAllocator.scala:126)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$start$1(ExecutorPodsAllocator.scala:68)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.$anonfun$start$1$adapted(ExecutorPodsAllocator.scala:68)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl.$anonfun$callSubscriber$1(ExecutorPodsSnapshotsStoreImpl.scala:102)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl.callSubscriber(ExecutorPodsSnapshotsStoreImpl.scala:99)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl.$anonfun$addSubscriber$1(ExecutorPodsSnapshotsStoreImpl.scala:71)
at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsSnapshotsStoreImpl$$anon$1.run(ExecutorPodsSnapshotsStoreImpl.scala:107)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
**Caused by: java.net.SocketException: Broken pipe (Write failed)**
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:894)
at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:865)
at sun.security.ssl.AppOutputStream.write(AppOutputStream.java:123)
at okio.Okio$1.write(Okio.java:79)
at okio.AsyncTimeout$1.write(AsyncTimeout.java:180)
at okio.RealBufferedSink.flush(RealBufferedSink.java:224)
at okhttp3.internal.http2.Http2Writer.windowUpdate(Http2Writer.java:262)

Library dependencies (build.sbt)

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5" 
libraryDependencies += "org.apache.spark" %% "spark-kubernetes" % "2.4.5" 
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"

Versions: Spark 2.4.4, Scala 2.12.11, SBT 1.3.10, Kubernetes 1.17.3, Docker 19.3.8

-- Shree Tiwari
apache-spark
kubernetes
scala

1 Answer

4/30/2020

The issue was with the Java version; after changing the Java version it worked well.

https://github.com/fabric8io/kubernetes-client/issues/2145

Try changing the base image to openjdk:8u252-jdk
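
The linked fabric8 issue ties this broken-pipe failure to the JDK build inside the container, so it can help to confirm which JDK the driver pod is actually running after the image change. A minimal check (plain Scala, no Spark required; the object name is illustrative):

// Print the JVM details of the container the driver runs in,
// to verify the base-image change actually took effect.
object PrintJavaVersion {
  def main(args: Array[String]): Unit = {
    println(s"java.version = ${System.getProperty("java.version")}")
    println(s"java.vm.name = ${System.getProperty("java.vm.name")}")
    println(s"java.vendor  = ${System.getProperty("java.vendor")}")
  }
}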

-- Shree Tiwari
Source: StackOverflow