I have created a dummy text to anonymize.
So, let me check that I get this right.
In the code for DI Studio jobs, you'll find some table names in the header comment.
For those tables, you wish to extract the table creation code further down in the job code?
It's probably possible to do anything with advanced parsing of the code.
What is the use case?
I would like to extract the source tables so I can use them further in a macro that copies production data to test.
This will eliminate the need for manual copy/paste into Notepad++ 😊
The question is how you wish to call this macro: per DI job, or for a whole environment?
There is a metadata API you could investigate (DATA step function based).
If it's the whole environment, or per schema, I would use DICTIONARY.TABLES as a base, assuming your tables are created and accessible in your production environment.
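For the DICTIONARY.TABLES route, something like the following sketch would list the candidate tables; the libref name here is an assumption, not from your environment:

```sas
/* Minimal sketch: list all datasets in a given libref via        */
/* DICTIONARY.TABLES. 'PRODLIB' is a hypothetical production      */
/* libref - substitute your own. Values in DICTIONARY.TABLES      */
/* are stored uppercase.                                          */
proc sql;
   create table work.source_tables as
   select libname, memname
   from dictionary.tables
   where libname = 'PRODLIB'
     and memtype = 'DATA';
quit;
```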
I would like to run the macro per job and get the list of source tables shown in the job header.
It would be preferable to have the IDs joined further. In my example this means A000000.YYYYYYY1, Y0000000.YYYYYYY2 and Y0000000.YYYYYYY3.
If that's the way you wish to do it, you're up for some (advanced) text parsing techniques.
Good luck!
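As a starting point, the parsing could look something like this sketch: it scans an exported job file line by line and pulls out LIBREF.TABLE tokens with a Perl regular expression. The file path and the assumption that the header lists tables as libref.member pairs are both hypothetical:

```sas
/* Rough sketch of the text-parsing approach: read an exported      */
/* DI Studio job file and extract LIBREF.TABLE tokens.              */
/* '/path/to/job_code.sas' is a placeholder path.                   */
data work.header_tables;
   infile '/path/to/job_code.sas' truncover;
   input line $char256.;
   length table $ 41;          /* 8-char libref + '.' + 32-char member */
   retain rx;
   if _n_ = 1 then
      rx = prxparse('/([A-Z_][A-Z0-9_]{0,7}\.[A-Z_][A-Z0-9_]{0,31})/i');
   start = 1;
   stop  = length(line);
   call prxnext(rx, start, stop, line, pos, len);
   do while (pos > 0);
      table = substr(line, pos, len);
      output;
      call prxnext(rx, start, stop, line, pos, len);
   end;
   keep table;
run;
```

You would still need to restrict the scan to the header comment (and deduplicate the hits), since the same pattern also matches dataset references in the generated code body.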
That has been explored, but I was dreaming of finding an easier way 😊
Hi @Pili1100
If I got this right, you want to automate the process of copying tables from Production to Test, so you have updated source data available for testing a given job.
How do you want the solution to work? I would recommend a solution that works within DI Studio in the Test environment: a utility job containing a user-defined transformation, where the user can copy-paste one or more Job IDs from the Basic Properties pane below the Folder Tree into the transformation's Options tab. The transformation should do the following:
- Extract a list of all source tables with physical paths for the given Job IDs, then loop over the tables and, for each table:
- allocate a libname to the Test path,
- allocate another to the Production path, and
- copy the table using PROC COPY.
This will work if:
- the tables are SAS Base datasets (sas7bdat-files),
- the physical folder structures are identical in Test and Production, so the Production path can be derived from the Test path,
- the physical Production folder is accessible from the Test environment, and
- the user has permission to read the physical dataset.
Getting the list of all source tables with physical paths from metadata is a bit complicated, but not impossible, and much easier than trying to get the information out of the generated source code. I can help with some code if you are interested; just give me a couple of days, because I don't have anything ready that does exactly that. The metadata part will give you the necessary information, so you could implement the solution as a DI Studio job or whatever you prefer.
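The per-table copy step described above could be sketched roughly like this; the macro name, the path layout, and the assumption that the Production path is derived by swapping /test/ for /prod/ are all hypothetical:

```sas
/* Sketch of the copy step for one table, assuming sas7bdat files,  */
/* identical folder structures in Test and Production, and read     */
/* access to the Production folder. All names are placeholders.     */
%macro copy_table(testpath=, table=);
   /* Derive the Production path from the Test path (assumption) */
   %let prodpath = %sysfunc(tranwrd(&testpath, /test/, /prod/));

   libname tstlib "&testpath";
   libname prdlib "&prodpath" access=readonly;

   proc copy in=prdlib out=tstlib memtype=data;
      select &table;
   run;

   libname tstlib clear;
   libname prdlib clear;
%mend copy_table;

/* Example call with hypothetical values */
%copy_table(testpath=/data/test/sales, table=YYYYYYY1)
```

The transformation would call this once per table from the extracted list, e.g. via CALL EXECUTE over the metadata-derived dataset.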