UnknownHostException in spark cluster mode

4/24/2020

I am running a standalone Spark cluster (1 master and 5 workers) on Kubernetes, and I am submitting a job with spark-submit in cluster mode. But I am getting an UnknownHostException:

Caused by: java.io.IOException: Failed to connect to sw-care-alice-staging-f4c869cc-4tkpb:46105
Caused by: java.net.UnknownHostException: sw-care-alice-staging-f4c869cc-4tkpb

Here is my spark-submit command:

spark-2.4.3-bin-hadoop2.7/bin/spark-submit \
  --class LogParser.LogBundleConfigFetcher \
  --conf spark.submit.deployMode=cluster \
  --conf spark.network.timeout=300 \
  --conf spark.scheduler.mode=FAIR \
  --conf spark.master=spark://sm-care-alice-staging:7077 \
  --conf spark.executor.cores=5 \
  --conf spark.executor.memory=20g \
  --conf spark.dynamicAllocation.maxExecutors=3 \
  --conf spark.driver.memory=16g \
  --conf spark.dynamicAllocation.enabled=false \
  --conf spark.cores.max=15 \
  http://minio.platform.svc.cluster.local:9000/alice-care/staging/config/spark/aliceparser.jar

NOTE: It works perfectly in client mode. In cluster mode it also works if I manually add the hostnames of the Spark workers to /etc/hosts, but I want to avoid adding those entries by hand. Please suggest an approach for this.
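For reference, the manual workaround above amounts to pinning worker hostnames to IPs in /etc/hosts. In Kubernetes the same entries can be expressed declaratively via hostAliases in the pod spec (a sketch only; the pod name, image, and IP below are placeholders, not my actual values):

```yaml
# Hypothetical pod spec fragment: hostAliases injects these entries into the
# container's /etc/hosts, mirroring the manual edit described in the NOTE.
apiVersion: v1
kind: Pod
metadata:
  name: spark-driver            # placeholder pod name
spec:
  hostAliases:
    - ip: "10.0.0.12"           # placeholder worker pod IP
      hostnames:
        - "sw-care-alice-staging-f4c869cc-4tkpb"
  containers:
    - name: driver
      image: spark:2.4.3        # placeholder image
```

This still hard-codes IPs, so it only automates the same mapping rather than removing the need for it; worker pod IPs change on restart, which is why I am looking for a better approach.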

-- Sumit G
apache-spark
kubernetes

0 Answers