io.fabric8.kubernetes.client.KubernetesClientException Message: unable to parse requirement: invalid label value

4/16/2019

Not able to figure out why I get an invalid label value error when deploying my Spark job through Kubernetes using spark-submit. The error log below states that it sees my class name with a dollar sign appended, but there is nothing non-alphanumeric in my class name.

io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: http://<...>/labelSelector=spark-app-selector%3Dcom.ibm.ai.admin.SparkPiAdmin$,spark-role%3Dexecutor. Message: unable to parse requirement: invalid label value: "com.admin.SparkPiAdmin": a valid label must be an empty string or consist of alphanumeric characters, '-', '_' or '.', and must start and end with an alphanumeric character (e.g. 'MyValue', or 'my_value', or '12345', regex used for validation is '(([A-Za-z0-9][-A-Za-z0-9_.]*)?[A-Za-z0-9])?'). Received status: Status(apiVersion=v1, code=400, details=null, kind=Status, message=unable to parse requirement: invalid label value: "com.admin.SparkPi": a valid label must be an empty string or consist of alphanumeric characters, '-', '_' or '.', and must start and end with an alphanumeric character (e.g. 'MyValue', or 'my_value', or '12345', regex used for validation is '(([A-Za-z0-9][-A-Za-z0-9_.]*)?[A-Za-z0-9])?'), metadata=ListMeta(_continue=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=BadRequest, status=Failure, additionalProperties={}).
-- horatio1701d
apache-spark
kubernetes

1 Answer

4/16/2019

Because there is a `$` sign at the end of your label value, `com.admin.SparkPi$`, and `$` is not an allowed character in a Kubernetes label value. You need to get rid of it.

Your app is probably generating that label from a class name, variable, or some other parameter. Note that a Scala `object` compiles to a JVM class whose name ends in `$`, so deriving a name from `getClass.getName` on an object produces that trailing dollar sign even though you never typed it. In other contexts this may not be a problem, but in Kubernetes it is: label values must match the regex quoted in the error message.
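As a quick sanity check, you can run a candidate label value through the same regex the error message quotes. This is a standalone illustrative sketch, not Spark or fabric8 API; the `LabelCheck` class, its method names, and the cleanup step are assumptions for demonstration:

```java
import java.util.regex.Pattern;

public class LabelCheck {
    // The validation regex quoted verbatim in the Kubernetes error message.
    private static final Pattern LABEL_VALUE =
            Pattern.compile("(([A-Za-z0-9][-A-Za-z0-9_.]*)?[A-Za-z0-9])?");

    static boolean isValidLabelValue(String v) {
        return LABEL_VALUE.matcher(v).matches();
    }

    public static void main(String[] args) {
        // A Scala `object` compiles to a JVM class whose name ends in `$`
        // (Java nested/anonymous classes also get `$` in their names), so
        // deriving a label from getClass().getName() can produce e.g.:
        String fromObject = "com.admin.SparkPiAdmin$";
        System.out.println(isValidLabelValue(fromObject)); // trailing '$' fails validation

        // Stripping the trailing dollar sign(s) leaves only letters and
        // dots, which the regex accepts.
        String cleaned = fromObject.replaceAll("\\$+$", "");
        System.out.println(isValidLabelValue(cleaned));
    }
}
```

If the value really is coming from a class name, either strip the trailing `$` before it reaches the label, or configure an explicit, label-safe name for whatever setting feeds that label.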

-- suren
Source: StackOverflow