Consider that I have SCDF (Spring Cloud Data Flow 2.5.1) deployed on the local platform, and I've created 7 tasks named composed-task-runner, task2, task3 and so on. I've created a composed task which calls the tasks one after another based on the preferences set.
I get the SCDF logs in a temp directory, and all the other tasks' logs in the same temp directory but in different folders. However:
- I want to aggregate all the logs from each task, along with the SCDF logs, into a single folder/file. How can we do that? I tried setting LOG_FILE and LOG_LOCATION as args while starting SCDF, but that didn't help.
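For context, here is roughly what I tried. The first property is Spring Boot's standard logging property; the second is my (possibly wrong) understanding of a local-deployer option that routes task output into the server's own log, so treat both as an attempt, not a working config:

```properties
# Passed when starting the SCDF server: send the server log to one file
logging.file.name=/tmp/scdf/all.log

# Task deployment property (my assumption of the local deployer option):
# let the launched task inherit the server's logging instead of its own folder
deployer.*.local.inherit-logging=true
```

Neither attempt got the per-task folders merged into one place.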
- Can we have a way to generate a traceId from SCDF that can be associated with each log message, generated from SCDF as well as from all the tasks that compose a job (using a composed task)?
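Since I haven't found a built-in traceId in SCDF, my fallback idea is to have the composed-task-runner generate one id per run and hand it to every child task as a launch argument, so each task can prefix its log lines with it. A minimal sketch of that workaround (plain Java, nothing SCDF-specific):

```java
import java.util.UUID;

public class TraceId {

    // One id per composed-task run; 32 hex characters, no dashes
    static String newTraceId() {
        return UUID.randomUUID().toString().replace("-", "");
    }

    public static void main(String[] args) {
        String traceId = newTraceId();
        // Each task would prefix its log messages with the shared id
        System.out.println("[traceId=" + traceId + "] task2 started");
    }
}
```

Is there a supported way to do this instead of wiring it through by hand?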
- How can we get the values of the fields set by SCDF in the TASK_EXECUTION table, like parent_task_id and external_task_id, in all the tasks?
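What I'm doing right now is scraping the task's own command line, since SCDF appends some properties as arguments when it launches a task (the exact property names below, e.g. `spring.cloud.task.executionid`, are my assumption of the format I see in the logs):

```java
import java.util.Optional;

public class ScdfArgs {

    // Look for "--<property>=<value>" among the task's launch arguments
    static Optional<String> argValue(String[] args, String property) {
        String prefix = "--" + property + "=";
        for (String arg : args) {
            if (arg.startsWith(prefix)) {
                return Optional.of(arg.substring(prefix.length()));
            }
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println("executionid=" +
                argValue(args, "spring.cloud.task.executionid").orElse("unknown"));
    }
}
```

This feels brittle; I'd prefer an official way to read those TASK_EXECUTION fields from inside each task.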
- Can we pass some data from one task (task1) to another task (task2), once the first task (task1) completes its execution and the next task (task2) starts?
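The workaround I'm considering (not an SCDF feature as far as I know) is a file handoff: task1 writes its result to a shared location before it exits, and task2, which the composed task only starts after task1 completes, reads it back at startup. A sketch with a hypothetical shared path:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class Handoff {

    // Called by task1 just before it finishes
    static void writeResult(Path file, String data) throws IOException {
        Files.createDirectories(file.getParent());
        Files.writeString(file, data);
    }

    // Called by task2 at startup
    static String readResult(Path file) throws IOException {
        return Files.readString(file);
    }

    public static void main(String[] args) throws IOException {
        Path shared = Path.of(System.getProperty("java.io.tmpdir"),
                "scdf-handoff", "task1.out");
        writeResult(shared, "rows=1500");
        System.out.println(readResult(shared));
    }
}
```

Is there a built-in mechanism for this, so I don't have to manage a shared directory myself?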
- Can we pass all the logs from the tasks, as well as from SCDF, to Kafka Cloud using a KafkaAppender?
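For that last point, I was looking at the third-party logback-kafka-appender project. The fragment below is my reading of its README, so the class names and settings are unverified, and the bootstrap server is a placeholder:

```xml
<appender name="kafka" class="com.github.danielwegener.logback.kafka.KafkaAppender">
    <encoder>
        <pattern>%d %-5level [%thread] %logger: %msg%n</pattern>
    </encoder>
    <topic>scdf-logs</topic>
    <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.NoKeyKeyingStrategy"/>
    <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
    <producerConfig>bootstrap.servers=my-kafka-broker:9092</producerConfig>
</appender>

<root level="INFO">
    <appender-ref ref="kafka"/>
</root>
```

If something like this is the right approach, would I add it to the logback config of the SCDF server and of every task app individually?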