Babloo
Rhodochrosite | Level 12

Could you please help me understand how to reference, via a SAS macro variable, the last dataset that was created in a DI Job?

 

I've added post code to the DI Job which calls a .sas file via a %INCLUDE statement. In that .sas file I'm trying to query the last dataset by using &SYSLAST., but it does not query the dataset I want: after the DI Job with the post code executes, the log says the WORK.ETLS_CONTENTS dataset does not exist. I also tried &etls_lastTable instead of &SYSLAST., but it doesn't query the right dataset either.

 

E.g. assume I have a DI Job with Table_A -> user-defined Custom Transformation -> Table Loader Transformation -> Table_B, and I want to query Table_B in the post code. How can I reference Table_B in the post code?

 

It seems &SYSLAST and &etls_lastTable do not help here. Is there any other macro variable that would suit?

14 REPLIES
Patrick
Opal | Level 21

Two things to address here:

 

1. Path names can change between environments (e.g. different Levs). For this reason, using %include is imho not a good idea unless you use a macro variable for the path and maintain and populate this macro variable in a single central place (e.g. the autoexec).

What I often do: wrap a macro around my code and store the code in a folder that is part of the SAS Macro Autocall Facility. I then just call the macro instead of using %include.
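A minimal sketch of that pattern - the folder path, macro name, and table name below are only illustrative, not from this thread:

/* Stored as post_process.sas in a folder on the autocall search path,
   e.g. added centrally via: options insert=(sasautos="/sas/common/macros"); */
%macro post_process(intable=);
  proc contents data=&intable;
  run;
%mend post_process;

/* In the DIS post code, call the macro instead of using %include: */
%post_process(intable=ifr.ins_cash)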

 

2. Quite a few transformations let you choose whether to populate &syslast or not. So it's not hard to set up a transformation which populates &syslast, but it's not something you can rely on when you add post code to many DIS jobs and expect the correct value to be in &syslast.

SAS maintains an automatic variable _LAST_ which contains the name of the last table created. You could use this one, BUT: some DIS transformations also generate code that creates intermediary work tables, so you can't rely on every transformation having the target table as the last table created - _LAST_ could point to the wrong table.

Here's an example of how _LAST_ works:

data test;                   /* WORK.TEST becomes the most recently created dataset */
  set sashelp.class;
run;

proc contents data=_last_;   /* _LAST_ resolves to WORK.TEST here */
run;

I don't really know what problem you're trying to solve, but it feels like the approach you're taking here will work for defined cases but won't be suitable as a generic design approach.

Babloo
Rhodochrosite | Level 12
Well, what would be a generic way to find the last dataset name via post code in any DIS job?
Patrick
Opal | Level 21

 


@Babloo wrote:
Well, what would be a generic way to find the last dataset name via post code in any DIS job?

The last dataset will be accessible via _LAST_. My question is: are you 100% sure that the last dataset is always what you need?

If you can't answer this question with yes, then you need to fully re-think your design approach.

Patrick
Opal | Level 21

@Babloo 

BTW, many transformations also create macro variables &_input and &_output, and because the transformations don't clean up after themselves you could use &_output.

Problem is: a transformation like the Splitter can have multiple target tables, so which &_output<n> macro variable are you going to pick?

Also: a transformation can have no target table at all, in which case &_output will still hold the value from the last transformation that did have an output table.
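If you want to check what these macro variables actually resolve to at the point where your post code runs, a simple diagnostic %put in the post code will show it (this assumes upstream transformations have set them; otherwise you just get unresolved-reference warnings):

%put NOTE: _input=&_input _output=&_output syslast=&syslast;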

 

As I don't know what problem you want to solve and why you need this last table, I can't give you further guidance here on how to address the problem.

Babloo
Rhodochrosite | Level 12
Assume I'm creating several datasets (both WORK and permanent, but mostly permanent) in DIS jobs, and the final dataset of every DIS job has a variable called ID which holds different values.

I have to get the unique ID values from those tables, apply a user-defined format to the ID variable, and then create one WORK or permanent dataset which holds only the unique IDs with their formatted values.
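Roughly something like this, per job's final table - the table names and the format name ID_FMT. below are just placeholders:

proc sql;
  create table work.unique_ids as
  select distinct ID,
         put(ID, ID_FMT.) as ID_formatted
  from ifr.ins_cash;
quit;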
Patrick
Opal | Level 21

"and in all the final dataset of DIS jobs I have a variable called ID which holds different values"

Does this table always have the same structure in all DIS jobs? Is it a table in WORK?

Babloo
Rhodochrosite | Level 12

Table structure differs but the variable ID is available in all the tables.

Patrick
Opal | Level 21

Do you have a metadata object for these tables in your DIS jobs (can be a green or yellow table object)?

Babloo
Rhodochrosite | Level 12

It's a registered SAS table.

 

BTW, when I replace &SYSLAST with _LAST_ in my .sas file it still says

ERROR: File WORK.ETLS_CONTENTS.DATA does not exist.

However, the final table created in the DIS Job is ifr.ins_cash.

Babloo
Rhodochrosite | Level 12

I see the following code in the DIS Job. It seems the Loader Transformation is generating the code below.

 

proc contents data = IFR.INS_CASH out = work.etls_contents(keep = nobs) noprint; 
run; 
LinusH
Tourmaline | Level 20

Not sure about your constraints - how many jobs will have this structure?

(source(s) - transformation - target load - unique records + format)

 

I mean, what is the use of the target table compared with your unique records table?

 

What is stopping you from defining this as extra steps using standard transformations?

Data never sleeps
Patrick
Opal | Level 21

@Babloo 

If you've got a table metadata object for the source table where you want to read the ID from, then you can implement something clean and metadata-driven.

 

Soo...

Instead of a %include, create a Custom Transformation.


 

Have the code (the one you use currently in your %include) in the code tab of the transformation.

 

In the code, use &_input as the name of your source table and &_output as the name of your target table.
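As a trivial sketch of what the code tab could contain - your real logic replaces the data step, and the two macro variables are resolved by DIS to whatever tables you connect in the job:

/* &_input and &_output resolve to the tables connected to the
   transformation in the job flow */
data &_output;
  set &_input;
run;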

 

Define the transformation as having one input and one output.


The "Generate column mapping macros" would allow you to write code in the code section which actually processes what you set up in the mapping tab when using the custom transformation.

 

Once you've clicked through the wizard, your custom transformation is ready.

 

Drag it onto a DIS job as the last transformation in the flow, connect the source table as input and the target table as output, then look at the generated code.

 

....and as a tip during development: keep your code outside of the transformation. Just put a %include of your code in the code section of the custom transformation, and only copy/paste the code into the transformation once you're happy.

...if you haven't created custom transformations before then you'll understand during development why I'm proposing such an approach.

Babloo
Rhodochrosite | Level 12

Thank you for the explanation. Do you want me to do this for all jobs?

Patrick
Opal | Level 21

You said that all jobs will have a target table object which you can use as input for the custom transformation. If so then yes, you need to use the custom transformation in all jobs and you need to connect the existing target table to the custom transformation as input and the target table object as output. This fully replaces your current %include.

 

Make sure you get this fully working and tested with a single job before you start adding it to other jobs.

 

....and just as a thought: make sure that the libname used for your common target table uses the filelockwait=<some value> option to allow for concurrency. Also make sure that the code you write to add/update rows in the target table performs well and doesn't lock the table for long (if it's a SAS table; if it's a DB table which allows for concurrency, this is less important).
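For a Base SAS library that could look roughly like this - the path and the wait time are illustrative only:

/* wait up to 60 seconds to acquire a file lock instead of failing immediately */
libname ifr "/data/ifr" filelockwait=60;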


Discussion stats
  • 14 replies
  • 2490 views
  • 2 likes
  • 3 in conversation