Attyslogin
Obsidian | Level 7

I have the following things to do in my project:

1] Import large CSV files

2] Two extractions

3] Data processing to generate a report

So, for a better arrangement, I have created a separate job for each of the activities listed above.

 

1] Import large CSV files [Import Job]

2] Two extractions [Extraction Job]

3] Data processing to generate a report [Data Processing Job]

Finally, I have created a master job comprising these three jobs connected one after the other for execution.

My challenges:

1] I can't access macro variables created in a DATA step (using CALL SYMPUT) from one job in another, although they are global in nature.

2] When I run the master job, the three comprised jobs run one after another perfectly with no errors, but running them separately shows errors.

So please let me know: 1] how to share macro variables created in one job with another, and 2] why my master job executes successfully even though the jobs comprised in it show errors when run individually.

Kindly help. Thanks!

1 ACCEPTED SOLUTION

Patrick
Opal | Level 21

I like your design approach to keep things simple and create multiple jobs which only do 1 thing each. That makes job maintenance and debugging so much easier.

 

Now for your questions

Each DIS job runs in its own session with its own WORK space, and macro variables only exist within the scope of a session. "Global" scope for a macro variable is only relevant within the same session (global means the macro variable exists outside of the macro where it was created, so its scope is global within that SAS session).

 

So if you run your DIS jobs one by one, each job gets its own session, and therefore job 2 doesn't have access to a macro variable created in job 1.

 

For your master DIS job: because you drag all the other job metadata objects onto the canvas of the master job, what you're actually doing is creating one big job (the master) put together from all the "sub" jobs. All the code of the master then runs in a single session as a single job, and therefore global macro variables are available to all parts.

 

If you want to exchange data between jobs then yes, store it in a permanent table which job 1 populates and job 2 then queries.
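A minimal sketch of that table-based handover (the library path, table name, and variable names here are just placeholders):

libname perm '/shared/params';  /* permanent library - path is an assumption */

/* Job 1: write the values to a permanent table */
data perm.job_params;
  length name $32 value $200;
  name='cutoff_date'; value='01JAN2024';      output;
  name='input_path';  value='/data/incoming'; output;
run;

/* Job 2: read the table and recreate the macro variables */
data _null_;
  set perm.job_params;
  call symputx(name, value, 'g');
run;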

 

In your specific case, what you could do instead, to allow testing of the individual "sub" jobs, is to put the following code in the pre-code of each job:

%macro init_mac_var(macvar,macvalue);
  /* create and populate the macro variable only if it doesn't exist yet */
  %if not %symexist(&macvar) %then
    %do;
      %global &macvar;
      %let &macvar=&macvalue;
    %end;
%mend init_mac_var;

%init_mac_var(myvar,value for unit testing);
%init_mac_var(myvar2,value for unit testing);

The code above allows you to unit test your jobs individually; when you run everything together via the master job, %SYMEXIST finds the macro variables already created by upstream jobs, so those values won't get overwritten by the unit-testing defaults in the downstream sub-jobs' pre-code.
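To illustrate how this behaves inside the master job (the variable name and values below are hypothetical): an upstream step creates the real value, so the pre-code default is skipped.

/* Job 1 (runs first in the master job): creates the real value */
data _null_;
  call symputx('myvar', 'real production value', 'g');
run;

/* Job 2 pre-code: the default is NOT applied, because myvar already exists */
%init_mac_var(myvar,value for unit testing);
%put &=myvar;  /* resolves to the value set by job 1 */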


3 REPLIES
Astounding
PROC Star

If you don't find a suitable solution for passing macro variables, it's easy enough to work around the problem.  Instead of trying to save macro variables, save a SAS data set with two variables:

 

name $32 = name of the macro variable

value $2000 = value assigned to the macro variable

 

Add an observation for each macro variable.
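For instance, a minimal sketch of building that data set (the names and values are placeholders):

data have;
  length name $32 value $2000;
  name='cutoff_date'; value='01JAN2024';      output;
  name='input_path';  value='/data/incoming'; output;
run;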

 

Assuming that 2000 characters is enough to hold your macro variable values, you can add a step to all programs to regenerate the macro variables:

 

data _null_;
  set have;
  call symputx(name, value);
run;

 

Depending on the circumstances, you might even consider saving the SAS data set permanently to serve as documentation for parameters used in the analysis.


Attyslogin
Obsidian | Level 7

Perfect, Patrick... Cheers, man.

The solution I ended up with: I created a permanent data set of the macro variables along with their values, and I simply used them wherever I needed them for testing jobs individually.

As for the question of how my master job was running perfectly even though the individual jobs had no access to each other's macros, your information about job sessions and workspaces was really helpful. Thanks!

