
Work Files - Sequential JOB

Contributor
Posts: 58

Work Files - Sequential JOB

Good morning!

I'm running a job (J_MAIN) in DI Studio that contains 3 jobs executed sequentially (J1 -> J2 -> J3).

I noticed that when J1 has finished and J2 is running, the work files from J1 are not deleted.

Does anyone know if there is some kind of setup so that, after J1 completes successfully, the work files used by J1 are cleaned up before J2 starts automatically?

Occasional Contributor
Posts: 6

Re: Work Files - Sequential JOB

Posted in reply to DavidCaliman

Hello,

You can use PROC DATASETS with the KILL option:

proc datasets library=work kill nolist;
quit;

This will delete everything in the WORK library.
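
If you only want to remove the tables that J1 created, rather than everything in WORK, PROC DATASETS also has a DELETE statement. A minimal sketch, where w_j1_stage and w_j1_sorted are just hypothetical names standing in for J1's intermediate tables:

/* Remove only selected tables from WORK; NOLIST suppresses the directory listing in the log */
proc datasets library=work nolist;
   delete w_j1_stage w_j1_sorted; /* hypothetical names - replace with J1's actual work tables */
quit;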

Wayne

PROC Star
Posts: 1,167

Re: Work Files - Sequential JOB

Posted in reply to DavidCaliman

Is there any reason you need the J1 files to be deleted? If you don't do anything, they'll be scratched when the session ends.

Contributor
Posts: 58

Re: Work Files - Sequential JOB

Yes.

In fact, my job J_MAIN contains 100 jobs, so it fills up the WORK space before it has finished completely.

PROC Star
Posts: 1,167

Re: Work Files - Sequential JOB

Posted in reply to DavidCaliman

Fair enough...I've had situations like that too.
In that case, Wayne's suggestion should work fine.

Respected Advisor
Posts: 4,173

Re: Work Files - Sequential JOB

Posted in reply to DavidCaliman

If I understand this right, you've created 3 jobs (J1 to J3) and then a 4th job (J_MAIN) into which you just dropped the 3 job objects. This might look to you like a master job executing 3 child jobs, but on a code level it is simply one big job containing all the code from J1, J2 and J3, and as it is one big single job, it also runs that way, in a single SAS session.

So what @WayneBell suggests will work, as it simply cleans out the WORK library - but it will also clean out the WORK library of "J_MAIN", because it is the same one.
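
If some WORK tables created by the earlier jobs are still needed further downstream, an alternative to KILL is the SAVE statement of PROC DATASETS, which deletes every member of the library except the ones you list. A minimal sketch, with w_keep1 and w_keep2 as hypothetical names of tables that J2 or J3 still read:

/* Delete all WORK members except the ones later jobs still need */
proc datasets library=work nolist;
   save w_keep1 w_keep2; /* hypothetical names - list the tables to preserve */
quit;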

The clean way of running your 3 jobs in sequence would be to use a scheduler like LSF.


Super User
Posts: 5,441

Re: Work Files - Sequential JOB

Just to link separate jobs together and check for successful completion between them, you don't even need a fancy scheduler; crontab or Windows Task Scheduler would be enough. Just build your flows in Management Console.

Data never sleeps