unable to mount volume to spark.kubernetes.executor

11/27/2019

I am trying to read a file from the server in Spark cluster mode on Kubernetes, so I put the file on all workers and mount the driver volume with:

val conf = new SparkConf().setAppName("sparksetuptest")
  .set("spark.kubernetes.driver.volumes.hostPath.host.mount.path", "/file-directory")

Everything works fine up to this point, but when I execute the job it reports that the file is not found at the specified location. So I also mounted the directory on the executors with .set("spark.kubernetes.executor.volumes.hostPath.host.mount.path", "/file-directory"). But now the program never completes; it hangs indefinitely while fetching data.
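For reference, here is a sketch of what I understand the full hostPath configuration should look like according to the "Running Spark on Kubernetes" documentation, which pairs each mount.path key with an options.path key naming the path on the node (the volume name "host" and the path "/file-directory" are from my setup; I am not sure whether the missing options.path keys are the cause of my problem):

```scala
import org.apache.spark.SparkConf

// Sketch: hostPath volume config for both driver and executors,
// with mount.path (path inside the pod) and options.path (path on the node).
val conf = new SparkConf()
  .setAppName("sparksetuptest")
  // Driver side
  .set("spark.kubernetes.driver.volumes.hostPath.host.mount.path", "/file-directory")
  .set("spark.kubernetes.driver.volumes.hostPath.host.options.path", "/file-directory")
  // Executor side: same pair of settings
  .set("spark.kubernetes.executor.volumes.hostPath.host.mount.path", "/file-directory")
  .set("spark.kubernetes.executor.volumes.hostPath.host.options.path", "/file-directory")
```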

Please suggest how I can mount the directory on the executors and read that file.

-- harshit saxena
apache-spark
data-science
kubernetes
scala
server

0 Answers