I have deployed my code in a Docker container. When I try to create a Spark session using PySpark (Spark is also installed on Kubernetes), I get the error below:
ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/py4j/java_gateway.py", line 1207, in send_command
    raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/py4j/java_gateway.py", line 1033, in send_command
    response = connection.send_command(command)
  File "/usr/local/lib/python3.7/site-packages/py4j/java_gateway.py", line 1212, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving
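Since the traceback suggests the Python side lost its connection to the Spark JVM, here is a quick sanity check I run from inside the container to see whether the standalone master port is reachable at all (xx.xxx.x.xxx stands in for my master address):

import socket

# Can the container open a plain TCP connection to the master port?
# xx.xxx.x.xxx is a placeholder for my master address.
try:
    socket.create_connection(("xx.xxx.x.xxx", 7077), timeout=5).close()
    print("Master port is reachable")
except OSError as e:
    print(f"Cannot reach master: {e}")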
My PySpark code:
from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .appName("Segments") \
    .master("spark://xx.xxx.x.xxx:7077") \
    .config("spark.executor.memory", "14g") \
    .config("spark.cores.max", 1) \
    .config("spark.driver.memory", "20g") \
    .config("spark.network.timeout", 100000000) \
    .config("spark.driver.extraClassPath", "/app/jars/mysql-connector-java-8.0.19.jar") \
    .config("spark.executor.extraClassPath", "/app/jars/mysql-connector-java-8.0.19.jar") \
    .getOrCreate()
Here xx.xxx.x.xxx is my Spark Kubernetes server. Does anyone have any idea why this error occurs, or is there another way to create a Spark session from a Docker container against Spark on Kubernetes?
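One alternative I am aware of, but have not confirmed for my setup, is Spark's native Kubernetes mode (Spark 2.4+ in client mode), where the master URL points at the Kubernetes API server instead of a standalone master. A rough sketch of what I think that would look like; the API server URL, namespace, image, service account, and driver host below are all placeholders:

from pyspark.sql import SparkSession

# Sketch of native Kubernetes mode; every value here is a placeholder.
spark = SparkSession \
    .builder \
    .appName("Segments") \
    .master("k8s://https://<k8s-api-server>:6443") \
    .config("spark.kubernetes.namespace", "default") \
    .config("spark.kubernetes.container.image", "<spark-image>") \
    .config("spark.kubernetes.authenticate.driver.serviceAccountName", "spark") \
    .config("spark.driver.host", "<hostname-of-my-driver-pod-or-service>") \
    .getOrCreate()

Would this be a better fit than pointing at spark://...:7077, given that my driver runs inside Docker?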