Good morning!
I'm running a job (J_MAIN) in DI Studio that contains 3 jobs run sequentially (J1 -> J2 -> J3).
I noticed that when J1 finishes and J2 is running, J1's WORK files are not erased.
Does anyone know if there is some kind of setup so that, after J1 completes successfully, the WORK files it used are cleaned up automatically before J2 starts?
Hello,
You can use PROC DATASETS with the KILL option.
This will delete every dataset in the WORK library.
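A minimal sketch of that cleanup step (NOLIST just suppresses the directory listing in the log):

   /* delete every member of the WORK library */
   proc datasets library=work kill nolist;
   quit;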
Wayne
Is there any reason you need the J1 files to be deleted? If you don't do anything, they'll be scratched when the session ends.
Yes.
In fact, my job J_MAIN contains 100 jobs, so it fills up the WORK space before it finishes completely.
Fair enough...I've had situations like that too.
In that case, Wayne's suggestion should work fine.
If I understand this right, you've created 3 jobs (J1 to J3) and then a 4th job (J_MAIN) into which you just dropped the 3 job objects. This might look to you like a master job executing 3 child jobs, but on a code level it is simply one big job containing all the code from J1, J2 and J3. And because it is one big single job, it also runs that way: in a single SAS session with a single WORK library.
So what @WayneBell suggests will work, as it simply cleans out the WORK library - but it will also clean out the workspace of "J_MAIN", because it is the same one.
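If you only need J1's intermediate tables gone (and want to keep anything J_MAIN itself still uses), an alternative is a user-written code step between J1 and J2 that deletes those tables by name. A minimal sketch, where j1_stage1 and j1_stage2 are hypothetical names of J1's work tables:

   /* drop only J1's intermediate tables, not the whole WORK library */
   proc datasets library=work nolist;
      delete j1_stage1 j1_stage2;
   quit;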
The clean way of running your 3 jobs in sequence would be to use a scheduler like LSF.
Just to link separate jobs together and check for successful completion between them, you don't even need a fancy scheduler. crontab or Windows Task Scheduler would be enough. Just build your flows in Management Console.
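As a rough sketch, assuming the three jobs have been deployed as batch scripts at hypothetical paths, a single crontab entry could chain them so that each job runs in its own SAS session, whose WORK library is deleted automatically when that session ends:

   # hypothetical paths; each script runs one deployed job, and && stops the chain on failure
   0 2 * * * /sas/deployed/J1.sh && /sas/deployed/J2.sh && /sas/deployed/J3.sh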