
Send output of one job as input to another

Frequent Contributor
Posts: 85

Send output of one job as input to another

Hi all,

I am creating two DI ETL jobs.

My first job is an audit job that creates process_date as a macro variable. I want to pass that macro variable as input to the next job, an extract job, which would extract data based on the process date.

Has anyone encountered a situation like this? I need guidance on passing a macro variable from one job as input to another.

Super User
Posts: 17,963

Re: Send output of one job as input to another

If your process_date is a global macro variable, I don't see an issue. I'd just be careful with the naming and make sure it holds a proper SAS date value.

%include 'audit1.sas'; /* generates macro variable process_date, which has global scope */

%second_audit(run_date=&process_date);
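
For illustration, audit1.sas could create that global macro variable along these lines. This is only a sketch; the library, table, and column names (audit.control_table, last_run_date) are placeholders, not anything from your environment:

/* sketch only - replace the placeholder table/column with your own */
data _null_;
   set audit.control_table (obs=1);
   /* store the date as a numeric SAS date value with global scope */
   call symputx('process_date', last_run_date, 'G');
run;

%put NOTE: process_date=&process_date (%sysfunc(putn(&process_date, date9.)));

The %put at the end is just a sanity check that the value resolves and formats as a real SAS date before the second job uses it.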

Super User
Posts: 5,260

Re: Send output of one job as input to another

It's not common for DI jobs to be tied together by a self-written SAS "script" using %includes.

A design question about the audit job: is creating the macro variable its only task? Will it be reused as input to other extract jobs?

If not, I suggest that you merge the two jobs into one.

Another, more DI-like way is to have the audit job as an outer job and pass the macro variable as a parameter to the extract (inner) job, along the lines of the sketch below.
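
Conceptually (this is only a rough sketch, not the exact code DI Studio generates), the outer job sets the parameter and the inner extract job picks it up as a macro variable. The source table dw.transactions and its trans_date column are made-up names:

/* outer (audit) job: derive the date and set the parameter */
%let process_date = %sysfunc(today());

/* inner (extract) job: the parameter arrives as &process_date */
/* and drives the extraction, e.g. in a WHERE clause           */
proc sql;
   create table work.extract as
      select *
      from dw.transactions
      where trans_date = &process_date;
quit;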

Data never sleeps
Respected Advisor
Posts: 3,908

Re: Send output of one job as input to another

You could, as Linus suggests, have your audit job as the outer job calling your second job as the inner job. That way you can pass a macro variable.

I assume your "audit job" will maintain some kind of control table. If so, the way I've seen this done is to implement it as a user-written transformation. You then simply use this transformation in your jobs, passing the job name as a parameter, which is the key to your control table.

If this is what you intend to do: to avoid table-locking issues with your control table, make sure that you either run your jobs in sequence or keep the control table under SAS/SHARE or in a database where parallel read/write is possible.
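
Just to illustrate the idea, the lookup half of such a user-written transformation might look roughly like this. The control table (ctrl.job_control), its columns (job_name, proc_date), and the &job_name parameter are all invented names:

proc sql noprint;
   /* &job_name would be the parameter passed into the transformation */
   select proc_date
      into :process_date trimmed
      from ctrl.job_control
      where job_name = "&job_name";
quit;

%put NOTE: process_date=&process_date;

The companion half of the transformation would update proc_date for that job_name at the end of a successful run, which is what makes running the jobs in sequence (or using a locking-safe store) important.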
