I am facing an issue while reading a file stored on my local system from a Spark program running in cluster mode. It fails with a "File not found" error even though the file is present at the defined location. Please suggest how I can read a local file in a Spark cluster running on Kubernetes.
You cannot refer to local files on your machine when you submit Spark on Kubernetes: the driver and executors run in pods, so they can only see files that exist inside their own containers or on volumes mounted into them.
The available solutions for your case might be:

Build the file into the Docker image your Spark pods run (or make sure it is already present there) and refer to it with the local:// scheme, which points to a path inside the container, for example:

local:///path/to/your-file.jar
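For illustration, a minimal sketch of a spark-submit invocation that relies on the file being baked into the image; the image name my-spark-image:latest, the app name, and the API server address are placeholders, not values from your setup:

    # Submit in cluster mode; local:// refers to a path inside the container image
    spark-submit \
      --master k8s://https://<k8s-apiserver-host>:<port> \
      --deploy-mode cluster \
      --name my-app \
      --conf spark.kubernetes.container.image=my-spark-image:latest \
      local:///path/to/your-file.jar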
If you are running a local Kubernetes cluster such as Minikube, you can also create a Kubernetes Volume with the files you are interested in and mount it to the Spark pods (refer to the Spark on Kubernetes docs). Be sure to mount that volume to both the driver and the executors, as in the sketch below.
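As a hedged sketch of the volume approach, the mount can be declared through Spark's built-in volume configuration properties on spark-submit; the volume name data, the node path /path/on/node, and the mount point /data are illustrative, and on Minikube the host directory typically has to be made visible to the node first (for example with minikube mount):

    # Mount a hostPath volume into both the driver and the executor pods
    spark-submit \
      --master k8s://https://<k8s-apiserver-host>:<port> \
      --deploy-mode cluster \
      --conf spark.kubernetes.container.image=my-spark-image:latest \
      --conf spark.kubernetes.driver.volumes.hostPath.data.mount.path=/data \
      --conf spark.kubernetes.driver.volumes.hostPath.data.options.path=/path/on/node \
      --conf spark.kubernetes.executor.volumes.hostPath.data.mount.path=/data \
      --conf spark.kubernetes.executor.volumes.hostPath.data.options.path=/path/on/node \
      local:///path/to/your-file.jar

Your application would then read the data through the mounted path (for example file:///data/your-input-file.txt, again a placeholder) instead of a path that only exists on your local machine.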