I'm wondering if someone can help - or can point out that I'm trying to do something impossible!
We currently have a Process Flow that contains around 130 individual jobs with multiple interdependencies. The flow has become incredibly slow to edit and difficult to maintain.
I was looking at splitting it into several sub-flows to make it easier to handle, but I'm unable to do this because of the dependencies between jobs.
The best solution I can come up with is to use trigger files: when a job completes successfully, it places a file on the server, and the dependent jobs in the sub-flows can then kick off when the file arrives. However, I don't want to have to update and deploy 130+ jobs.
Is it possible to create some SAS code that picks up the name of the flow job and creates a trigger file with that name, wrapped up as a reusable job?
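For illustration, here is a minimal sketch of that reusable trigger-file job in Python. It assumes the job name is available in the `LSB_JOBNAME` environment variable (which LSF sets for running jobs); the trigger directory path is hypothetical and would need to be a location every sub-flow can see. In SAS, the same idea could be expressed with `%sysget(LSB_JOBNAME)` and a `DATA _NULL_` step that writes the file.

```python
import os
from pathlib import Path

def write_trigger(job_name: str, trigger_dir: Path) -> Path:
    """Create an empty <job_name>.trg file signalling successful completion."""
    trigger_dir.mkdir(parents=True, exist_ok=True)
    trigger = trigger_dir / f"{job_name}.trg"
    trigger.touch()
    return trigger

if __name__ == "__main__":
    # LSF exports the current job's name as LSB_JOBNAME, so this one
    # generic script can be dropped in after every job with no per-job edits.
    job = os.environ.get("LSB_JOBNAME", "unknown_job")
    # Hypothetical shared location; adjust to your environment.
    print(write_trigger(job, Path("/shared/triggers")))
```

Because the job name comes from the environment rather than being hard-coded, the same script (or its SAS equivalent) can be scheduled once and reused wherever a dependency crosses a sub-flow boundary.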
So instead of:
I would have
I'd then be able to split the flow (which will be painful enough) into multiple sub-flows and drop the new code in wherever there is a dependency, so it creates the trigger file and the dependent job just has a file-trigger dependency.
Hopefully that makes sense! Thanks for any help offered.
Hello rc7782, I would recommend opening a Technical Support track so I can work directly with you on this issue. I would like to dig a little deeper into this question. You can mention my name when you open the track. Thanks, Bob Maggio
Just had a thought about this - there is a file that sits behind LSF called lsb.events that contains details about each job, including its exit status.
You could write some code that periodically reads this file and writes out .ok files for any or all jobs that complete. These .ok files could then be used as file dependencies in the other flows. Once all flows have completed, you would need one more job to clean up the .ok files that were created, so they don't cause problems when the flow runs again the following day.
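A sketch of that sweeper in Python, under loud assumptions: lsb.events is a plain-text event log whose record layout varies by LSF version, so the regular expression below (matching a quoted job name on a JOB_STATUS record) is a placeholder you would replace after inspecting your own file. The paths are hypothetical too.

```python
import re
from pathlib import Path

# Assumed pattern: a JOB_STATUS record carrying a quoted job name.
# Verify against your LSF version's actual lsb.events layout.
DONE_PATTERN = re.compile(r'"JOB_STATUS".*?"(?P<name>[^"]+)"')

def completed_jobs(lines):
    """Yield job names from lines matching the (assumed) completion pattern."""
    for line in lines:
        m = DONE_PATTERN.search(line)
        if m:
            yield m.group("name")

def sweep(events_path: Path, ok_dir: Path) -> None:
    """Read lsb.events and touch an <name>.ok file for each completed job."""
    ok_dir.mkdir(parents=True, exist_ok=True)
    with events_path.open() as fh:
        for name in completed_jobs(fh):
            (ok_dir / f"{name}.ok").touch()

def cleanup(ok_dir: Path) -> None:
    """Remove all .ok files so the next day's run starts clean."""
    for f in ok_dir.glob("*.ok"):
        f.unlink()
```

`sweep` would run as the periodic job, and `cleanup` as the final job once all flows have finished, matching the two-job arrangement described above.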