I have the Spark Operator deployed to GKE using the Helm chart, and a spark-pi.yaml file that I've uploaded to Google Cloud Storage. When I run
kubectl apply -f https://storage.cloud.google.com/my-bucket/spark-pi.yaml
I get the following error:
error converting YAML to JSON: yaml: line 11: mapping values are not allowed in this context
I've checked the file contents with an online YAML validator and it appears to be valid. I took the configuration from the GoogleCloudPlatform/spark-on-k8s-operator examples, and the mainClass on line 11 is exactly the same as in the User guide.
What might be wrong with the file, and how can I fix it?
spark-pi.yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: "gcr.io/spark-operator/spark:v2.4.5"
  imagePullPolicy: Always
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar"
  sparkVersion: "2.4.5"
  restartPolicy:
    type: Never
  volumes:
    - name: "test-volume"
      hostPath:
        path: "/tmp"
        type: Directory
  driver:
    cores: 1
    coreLimit: "1200m"
    memory: "512m"
    labels:
      version: 2.4.5
    serviceAccount: default
    volumeMounts:
      - name: "test-volume"
        mountPath: "/tmp"
  executor:
    cores: 1
    instances: 1
    memory: "512m"
    labels:
      version: 2.4.5
    volumeMounts:
      - name: "test-volume"
        mountPath: "/tmp"
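In case it helps with diagnosing this, here is a sketch of how the file could be checked locally instead of applying it straight from the URL (assuming curl and the gsutil CLI are available; my-bucket stands for my actual bucket name):

# Print the first lines of whatever the URL actually returns
curl -sL https://storage.cloud.google.com/my-bucket/spark-pi.yaml | head -n 20

# Or download the object from the bucket and apply the local copy
gsutil cp gs://my-bucket/spark-pi.yaml .
kubectl apply -f spark-pi.yaml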