LinusH
Tourmaline | Level 20

Let me see if I get this right.

In the code for DI Studio jobs, you'll find some table names in the header comment.

For those tables, you wish to extract the table creation code further down in the job code?

It's probably possible to do almost anything with advanced parsing of the code.

What is the use case?

Data never sleeps
Pili1100
Obsidian | Level 7

I would like to extract the source tables so I can use them further in a macro for getting production data into test.
This will eliminate the need for manual copy/paste into Notepad++ 😊

LinusH
Tourmaline | Level 20

The question is how you wish to call this macro: per DI job, or for a whole environment?

There is a metadata API you could investigate (data step function based).

If whole environment, or per schema, I would use DICTIONARY.TABLES as a base - assuming your tables are created and accessible in your production environment.
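As a minimal sketch of the DICTIONARY.TABLES approach: the query below lists the datasets in one production library. The libref PRODLIB is a placeholder — substitute whatever libref points at your production schema, and note that librefs are stored upper-case in the dictionary tables.

```sas
/* Sketch: list candidate source tables from one production libref. */
/* PRODLIB is a placeholder - point it at your production library.  */
proc sql;
  create table work.source_tables as
  select libname, memname, nobs, crdate
  from dictionary.tables
  where libname = 'PRODLIB'   /* librefs are upper-case here */
    and memtype = 'DATA';
quit;
```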

 

Data never sleeps
Pili1100
Obsidian | Level 7

I would like to run the macro per job and get the list of source tables shown in the job header.

It would be preferable to have the IDs joined further. In my example this means A000000.YYYYYYY1, Y0000000.YYYYYYY2 and Y0000000.YYYYYYY3

 

LinusH
Tourmaline | Level 20

If that's the way you wish to do it, you're in for some (advanced) text parsing techniques.
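For what it's worth, here is a hedged sketch of what such parsing could look like: scan an exported job's .sas file for libref.table references with a Perl regular expression. The file path and the naming pattern (a libref of up to 8 characters, then a member name) are assumptions — adjust them to how your DI Studio headers actually look.

```sas
/* Sketch: harvest libref.table references from a job's source code. */
/* The infile path is a placeholder; the pattern assumes names like  */
/* A000000.YYYYYYY1 as shown in the thread.                          */
data work.found_tables;
  infile '/path/to/job_code.sas' truncover;   /* placeholder path */
  input line $char256.;
  length table $ 41;
  retain re;
  if _n_ = 1 then re = prxparse('/([A-Z]\w{0,7}\.\w{1,32})/io');
  start = 1;
  stop  = length(line);
  call prxnext(re, start, stop, line, pos, len);
  do while (pos > 0);
    table = substr(line, pos, len);
    output;
    call prxnext(re, start, stop, line, pos, len);
  end;
  keep table;
run;

proc sort data=work.found_tables nodupkey;
  by table;   /* de-duplicate references found on multiple lines */
run;
```

Expect false positives (format references, macro calls with dots, etc.), which is why parsing generated code tends to be the fragile option.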

Good luck!

Data never sleeps
Pili1100
Obsidian | Level 7

That has been explored, but I was dreaming of finding an easier way 😊

ErikLund_Jensen
Rhodochrosite | Level 12

Hi @Pili1100 

If I got this right, you want to automate the process of copying tables from Production to Test, so you have updated source data available for testing a given job.

 

How do you want the solution to work? I would recommend a solution that works within DI Studio in the Test environment: a utility job containing a user-defined transformation, where the user can copy-paste one or more Job IDs from the Basic Properties pane below the Folder Tree into the transformation's Options tab. The transformation should do the following:

  1. extract a list of all source tables with physical paths for the given Job IDs, then loop over the tables and, for each table,
  2. allocate a libname to the Test path,
  3. allocate another libname to the Production path, and
  4. copy the table using Proc Copy.
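Steps 2–4 for a single table could be sketched roughly as below. The path layout (deriving the Production path from the Test path by swapping a folder name) and the macro name are assumptions for illustration; in the real transformation the paths would come from the metadata lookup in step 1, and the macro would be called in a loop over the table list.

```sas
/* Sketch: copy one sas7bdat table from Production to Test.         */
/* Assumes the Production path can be derived from the Test path    */
/* by replacing /test/ with /prod/ - adjust to your folder layout.  */
%macro copy_one(testpath, table);
  libname testlib "&testpath";
  libname prodlib "%sysfunc(tranwrd(&testpath, /test/, /prod/))";
  proc copy in=prodlib out=testlib memtype=data;
    select &table;
  run;
  libname testlib clear;
  libname prodlib clear;
%mend copy_one;

/* Example call - path and table name are placeholders */
%copy_one(/sasdata/test/sales, customers)
```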

This will work if:

  • the tables are SAS Base datasets (sas7bdat-files),
  • the physical folder structures are identical in Test and Production, so the Production path can be derived from the Test path,
  • the physical Production folder is accessible from the Test environment, and
  • the user has permission to read the physical dataset.

Getting the list of all source tables with physical paths from metadata is a bit complicated, but not impossible, and much easier than trying to get the information out of the generated source code. I can help with some code if you are interested, only give me a couple of days because I don't have anything ready that does exactly that. The metadata part will give you the necessary information, so you could implement a solution as a DI Studio job or whatever you prefer.
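As a rough starting point, the metadata DATA step functions can walk from a Job object to its associated tables. The Job ID below is a placeholder, and the association name is an assumption — the real path through the metadata model (Job → JobActivities → TransformationSteps → ClassifierSources, or similar) varies by model version, so treat this only as a sketch of the function calls involved, not a working solution.

```sas
/* Hedged sketch: look up a DI Studio Job in metadata and list an    */
/* associated object's names. 'A5XXXXXX.BX000001' is a placeholder   */
/* Job ID; 'Source' is an assumed association name - the real chain  */
/* of associations is longer and site/version dependent.             */
data work.job_tables;
  length juri turi name $ 256;
  nobj = metadata_getnobj("omsobj:Job?@Id='A5XXXXXX.BX000001'", 1, juri);
  if nobj > 0 then do;
    n = 1;
    do while (metadata_getnasn(juri, 'Source', n, turi) > 0);
      rc = metadata_getattr(turi, 'Name', name);
      output;
      n + 1;
    end;
  end;
  keep name;
run;
```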

