🔒 This topic is solved and locked.
gyambqt
Obsidian | Level 7
I have a single flow with many DI jobs inside, e.g. flow A contains job1, job2, ... up to job100. I want to add an extra DI job at the start of the flow that checks for the existence of some flat files. The flow is scheduled to run on a daily basis. If the flat source file exists, I want to kick off the flow; otherwise I want to abort the entire process without running any of the jobs in the flow. Should I use ABORT after checking the file existence? Will ABORT generate an error message? I don't want to see an error message, because it is normal that we don't get the source file every day.
1 ACCEPTED SOLUTION

Accepted Solutions
Patrick
Opal | Level 21

If you use OS scheduling, then what and how you need to implement depends on your environment and the OS scheduler, such as cron or the Windows Task Scheduler.

[screenshot: Patrick_1-1626649100292.png]

 

If you follow the link from here for your OS, you'll see that in the end it's a script that invokes your .sas programs. I guess you could manually modify this script to check for the existence of this external file and only execute the other batch calls if the file exists (wrapped in some if...then...else logic). The check for the file would use the scripting language, like VBS or sh (whatever your OS is).
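A minimal sketch of that idea, assuming a Unix sh wrapper script; the path, function name, and commented-out sas calls are hypothetical placeholders, not the actual generated script:

```shell
#!/bin/sh
# Guard the flow's batch calls behind a file-existence check,
# so that a missing source file is treated as normal, not as an error.

run_flow_if_source_exists() {
    srcfile="$1"
    if [ -f "$srcfile" ]; then
        echo "source file found - running flow"
        # a real wrapper would issue the deployed jobs here, e.g.:
        # sas /sasjobs/job1.sas -log /logs/job1.log
        # sas /sasjobs/job2.sas -log /logs/job2.log
        return 0
    fi
    # "no file today" is expected, so return success without running any job
    echo "source file not found - skipping flow"
    return 0
}

run_flow_if_source_exists /data/incoming/source.dat
```

Because the function returns 0 in both branches, the scheduler sees a clean run on days when the file never arrives.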

 

Your initial idea of having a DIS job check for the existence of the file and then skip the other jobs wouldn't work, because the batch script calls each .sas file (DIS job) individually as a child process. The batch script is the parent, so the .sas jobs later down the track would only be skipped if a .sas child execution ends with an error passed back to the parent process. But that would make your master script end in error, which is not really the right thing to do.
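That parent/child behaviour can be illustrated with a small sh sketch; `run_job` is a hypothetical stand-in for the real `sas job.sas` batch call:

```shell
#!/bin/sh
# Each deployed job runs as a child process; the only signal the parent
# wrapper gets back is the child's exit code.

run_job() {
    echo "running $1"
    # a real call would be: sas "/sasjobs/$1.sas"; this mock always succeeds
    return 0
}

run_job job1 || exit 1   # a failing child stops the chain here...
run_job job2 || exit 1   # ...but then the wrapper itself exits non-zero,
                         # which the scheduler reports as an error
```

So aborting a child job does skip the later jobs, but only at the cost of the whole flow ending in an error state.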

 

 

View solution in original post

8 REPLIES
Patrick
Opal | Level 21

If I understand this correctly, you've got a master DIS job and have dragged a lot of other DIS jobs into this job. If so, that's something I'd consider o.k. during development for "unit" testing, but not production worthy.

With such a master DIS job all your code will execute in a single SAS session (with the risk of "overspill"), and you also can't execute jobs in parallel unless you additionally wrap a loop transform around such jobs.

 

What you describe should be done via a scheduler which can execute job flows conditionally.

 

If everything really is in a single DIS job, then you can always use the conditional execution transformation for what you describe.

 

gyambqt
Obsidian | Level 7
Hi, what do you mean by DIS? The flow is created in SAS Management Console for scheduling. The flow can contain multiple DI jobs.
gyambqt
Obsidian | Level 7

I use the operating system for scheduling.

Patrick
Opal | Level 21

(Accepted solution, quoted in full above.)
LinusH
Tourmaline | Level 20

At our site we have actually done this.

As @Patrick suggests, we have used the scheduling plug-in in SAS Management Console (SMC) to build job flows of individual SAS DI Studio deployed jobs.

At the beginning of some flows we have a SAS job that checks input file availability. If the file(s) is/are not there, we abort the job (set an RC > 1). If you have set up the dependencies correctly in SMC, the job flow script will abort as well, preventing subsequent jobs from being triggered.

Data never sleeps
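A minimal sh sketch of this pattern, where `check_job` is a hypothetical stand-in for the deployed SAS checker job (in the real setup that job would test the file with FILEEXIST and abort with a return code greater than 1); the path is hypothetical:

```shell
#!/bin/sh
# The first job in the flow checks the input file and signals the flow
# script with its return code; RC > 1 means "abort the flow".

check_job() {
    [ -f "$1" ] && return 0
    return 2    # RC > 1 tells the flow script to stop before any other job
}

if check_job /data/incoming/source.dat; then
    echo "dependencies met - downstream jobs run"
else
    echo "input missing - flow aborted, downstream jobs skipped"
fi
```

With the SMC dependencies set up this way, the skip is handled inside the flow rather than by each scheduler's own error handling.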
gyambqt
Obsidian | Level 7

The scheduler we have runs via the operating system. It does not let you set up a file-existence check that conditionally triggers subsequent jobs.

gyambqt
Obsidian | Level 7

That is what I have done. Thanks!


Discussion stats
  • 8 replies
  • 1598 views
  • 0 likes
  • 3 in conversation