I have SCDF (Spring Cloud Data Flow 2.5.1) deployed locally and have created seven tasks, named composed-task-runner, task2, task3, and so on. I have created a composed task that calls these tasks one after another based on the preferences set. SCDF writes its logs to a temp directory, and each task writes its logs to a separate folder under the same temp directory, but:
- I would like to aggregate all the logs from each task, together with the SCDF logs, into a single folder/file. How can we do this? I tried setting LOG_FILE and LOG_LOCATION as arguments while starting SCDF, but that didn't help. (Sketch 1 below shows one approach I am considering.)
- Can we generate a traceId from SCDF that gets attached to every log message, from SCDF itself as well as from all the tasks that are composed into a job (via the composed task runner)? (Sketch 2 below shows what I have in mind.)
- How can we read, inside each task, the values that SCDF sets in the TASK_EXECUTION table, such as PARENT_EXECUTION_ID and EXTERNAL_EXECUTION_ID? (Sketch 3 below is how I imagine accessing them.)
- Can we pass some data from one task (task1) to the next task (task2) once task1 completes its execution and task2 is allowed to start? (Sketch 4 below.)
- Can we ship all the logs from the tasks, as well as from SCDF, to Kafka in the cloud using a KafkaAppender? (Sketch 5 below.)
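
Sketch 1: for the log aggregation question, a minimal sketch of what I am considering: each task programmatically attaches a shared Logback FileAppender at startup so every app appends to one file. The path `/tmp/scdf-logs/all.log` is a hypothetical placeholder, and this assumes each task is a Spring Boot app with Logback on the classpath:

```java
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.encoder.PatternLayoutEncoder;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.FileAppender;
import org.slf4j.LoggerFactory;

public class SharedFileLogging {

    // Call once at startup in each task (and in the SCDF-launched apps).
    public static void attachSharedFileAppender() {
        LoggerContext ctx = (LoggerContext) LoggerFactory.getILoggerFactory();

        PatternLayoutEncoder encoder = new PatternLayoutEncoder();
        encoder.setContext(ctx);
        encoder.setPattern("%d %-5level [%logger{36}] %msg%n");
        encoder.start();

        FileAppender<ILoggingEvent> appender = new FileAppender<>();
        appender.setContext(ctx);
        appender.setName("sharedFile");
        appender.setFile("/tmp/scdf-logs/all.log"); // hypothetical shared path
        appender.setAppend(true);                   // all apps append to the same file
        appender.setEncoder(encoder);
        appender.start();

        Logger root = ctx.getLogger(Logger.ROOT_LOGGER_NAME);
        root.addAppender(appender);
    }
}
```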
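Sketch 2: for the traceId question, a sketch assuming I pass a custom argument (here `--trace.id`, a name I made up) from the composed task definition to each child task and put it in the SLF4J MDC, so a `%X{traceId}` token in the Logback pattern picks it up on every line. Spring Cloud Sleuth would presumably do this more completely:

```java
import java.util.UUID;

import org.slf4j.MDC;

public class TraceIdSetup {

    // Call early in each task's main(), before significant logging happens.
    // Note that MDC is thread-local; worker threads would need the id copied over.
    public static void applyTraceId(String[] args) {
        String traceId = UUID.randomUUID().toString(); // fallback when nothing is passed
        for (String arg : args) {
            if (arg.startsWith("--trace.id=")) {       // hypothetical argument name
                traceId = arg.substring("--trace.id=".length());
            }
        }
        // With a pattern such as "%d %X{traceId} %-5level %logger - %msg%n",
        // every log line from this thread will carry the id.
        MDC.put("traceId", traceId);
    }
}
```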
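Sketch 3: for reading the TASK_EXECUTION values inside a task, a sketch using Spring Cloud Task's TaskExecutionListener, assuming spring-cloud-starter-task is on the classpath and the task class is annotated with @EnableTask:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.cloud.task.listener.TaskExecutionListener;
import org.springframework.cloud.task.repository.TaskExecution;
import org.springframework.stereotype.Component;

@Component
public class ExecutionIdLogger implements TaskExecutionListener {

    private static final Logger log = LoggerFactory.getLogger(ExecutionIdLogger.class);

    @Override
    public void onTaskStartup(TaskExecution execution) {
        // These getters expose the TASK_EXECUTION columns for the current run.
        log.info("executionId={} parentExecutionId={} externalExecutionId={}",
                execution.getExecutionId(),
                execution.getParentExecutionId(),
                execution.getExternalExecutionId());
    }

    @Override
    public void onTaskEnd(TaskExecution execution) { }

    @Override
    public void onTaskFailed(TaskExecution execution, Throwable throwable) { }
}
```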
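Sketch 4: for passing data between tasks, a sketch of the pattern I am considering: task1 stashes a small payload in its exit message, and task2 reads it back through TaskExplorer, since both tasks share the task repository database. The JSON payload is hypothetical, and the assumption that the first page entry is the latest execution is mine to verify:

```java
import java.util.Optional;

import org.springframework.cloud.task.listener.TaskExecutionListener;
import org.springframework.cloud.task.repository.TaskExecution;
import org.springframework.cloud.task.repository.TaskExplorer;
import org.springframework.data.domain.PageRequest;

// task1 side: stash a small payload in the exit message before the run is recorded.
public class HandoffWriter implements TaskExecutionListener {

    @Override
    public void onTaskStartup(TaskExecution execution) { }

    @Override
    public void onTaskEnd(TaskExecution execution) {
        execution.setExitMessage("{\"recordsProcessed\":1200}"); // hypothetical payload
    }

    @Override
    public void onTaskFailed(TaskExecution execution, Throwable throwable) { }
}

// task2 side: read task1's exit message back from the shared task repository.
class HandoffReader {

    private final TaskExplorer explorer;

    HandoffReader(TaskExplorer explorer) {
        this.explorer = explorer;
    }

    Optional<String> latestHandoffFromTask1() {
        // Assumes the page is ordered newest-first; worth verifying.
        return explorer.findTaskExecutionsByName("task1", PageRequest.of(0, 1))
                .getContent().stream()
                .findFirst()
                .map(TaskExecution::getExitMessage);
    }
}
```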
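Sketch 5: for shipping logs to Kafka, a sketch of a hand-rolled Logback appender built on the plain kafka-clients producer; the topic and broker values are placeholders, and an existing library such as logback-kafka-appender would be the off-the-shelf alternative:

```java
import java.util.Properties;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaLogAppender extends AppenderBase<ILoggingEvent> {

    private String topic = "task-logs";                  // hypothetical topic
    private String bootstrapServers = "localhost:9092";  // hypothetical broker
    private Producer<String, String> producer;

    @Override
    public void start() {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
        super.start();
    }

    @Override
    protected void append(ILoggingEvent event) {
        // Fire-and-forget; a production appender would handle send failures.
        producer.send(new ProducerRecord<>(topic, event.getFormattedMessage()));
    }

    @Override
    public void stop() {
        if (producer != null) {
            producer.close();
        }
        super.stop();
    }

    // Setters allow <topic> and <bootstrapServers> elements in logback.xml.
    public void setTopic(String topic) { this.topic = topic; }

    public void setBootstrapServers(String bootstrapServers) {
        this.bootstrapServers = bootstrapServers;
    }
}
```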
Please advise.