Currently, when I launch a task in Spring Cloud Data Flow, it starts a pod inside which the task and its associated jobs run. That pod follows a naming convention of the task name followed by a random ID. I wanted to know whether I can map it to the task execution ID or job execution ID instead, so that it's easier for me to locate the pod and look at its logs when a job fails.
In Spring Cloud Data Flow, this is the expected behavior.
The task execution ID and job execution ID are generated only after the task launch request has been sent to the target deployment environment (local, Cloud Foundry, Kubernetes), which means the pod name has to be assigned before those IDs are available, so the deployer cannot embed them in the name.
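In the meantime, a workaround is to look the pod up rather than rename it. A minimal sketch follows; note that the exact label key applied by the Kubernetes deployer (`task-name` below) and whether `externalExecutionId` is populated can vary by SCDF version, so treat those names as assumptions to verify against your deployment:

```
# List the pods for a given task by label selector. The SCDF Kubernetes
# deployer labels task pods; 'task-name' is an assumed label key here --
# verify it with 'kubectl get pod <pod> --show-labels' on your cluster.
kubectl get pods -l task-name=mytask

# Look up a task execution record via the SCDF REST API. If the
# externalExecutionId field is populated in your version, it holds the
# platform-specific identifier, i.e. the pod name on Kubernetes.
# 'localhost:9393' and execution ID '42' are placeholder values.
curl -s http://localhost:9393/tasks/executions/42

# Once you have the pod name, fetch its logs:
kubectl logs <pod-name>
```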
Feel free to open a feature request in the SCDF GitHub repository (any other ideas would be great too), and we can track it from there.