Kubernetes Spark Cores Allocation issue

11/13/2019

Spark allocates cores but does not fully utilize them. As far as Kubernetes is concerned, the AgentPool in the cluster has plenty of CPU resources to do its work; as far as Spark is concerned, there are 0 available cores to do work. If there are 0 cores available for work, Spark sends drivers and applications into a WAITING status. So Spark thinks it doesn't have the resources to do any work, while Kubernetes thinks there are plenty of resources to do the work. What would be the solution for this issue?
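For context, here is a minimal sketch of the spark-submit settings that tie Spark's internal core accounting to the CPU requests of the executor pods Kubernetes schedules. The API server host, container image, and jar path below are placeholders, not values from the question:

```shell
# Sketch: spark-submit settings relating Spark core accounting to
# Kubernetes executor pod CPU requests. Placeholders in angle brackets.
spark-submit \
  --master k8s://https://<api-server-host>:443 \
  --deploy-mode cluster \
  --name core-allocation-test \
  --conf spark.executor.instances=2 \
  --conf spark.executor.cores=2 \
  --conf spark.kubernetes.executor.request.cores=2 \
  --conf spark.kubernetes.container.image=<spark-image> \
  local:///opt/spark/examples/jars/spark-examples.jar
# spark.executor.cores is what Spark's scheduler counts as available;
# spark.kubernetes.executor.request.cores is the CPU request the executor
# pod makes to Kubernetes. If executor pods never start (or request more
# CPU than any single node can grant), Spark sees 0 usable cores and keeps
# the application in WAITING even though the cluster shows spare CPU overall.
```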

-- Kaavhrit G
apache-spark
kubernetes

0 Answers