rgreen33
Pyrite | Level 9

I have DI Studio jobs with user-written code that I need to promote from DEV to PROD.  Obviously, I can export/import them as a SAS Package...and that's what I have been doing.  However, once I complete the import, I have to open each job, go through the code, and change the lines that reference the server by name.  Is there a better way to do this...maybe something a bit more automated?  I am fairly new to SAS, so I may be missing the obvious. :)

Thanks,

Ricky


6 REPLIES
LinusH
Tourmaline | Level 20
Yes: avoid user-written code. Since I don't know what your code is doing, I can't tell you how at this moment.
Data never sleeps
rgreen33
Pyrite | Level 9

The main thing that user-written code is being used for is pushing data to LASR from Hadoop.  My data in Hadoop is stored as SASHDAT, while in LASR it is stored as SASIOLA.  I have not found any other way to do this...other than with user-written code.  I have even tried using Visual Data Builder to create the job, but it simply creates a job that uses user-written code.  Any ideas?

 

Thanks,

Ricky

LinusH
Tourmaline | Level 20
When the data is already in a LASR-ready format, I think loading it to memory is a DBA operation rather than ETL. So I think it's OK to use VA features like autoload or PROC IMSTAT (which wouldn't be part of any DI Studio job, nor be subject to promotion).
Data never sleeps
rgreen33
Pyrite | Level 9

I'm not sure that I am following what you are saying.  Essentially, I have the following:

 

     Source (Oracle, SQL, etc.)  -->  HDFS  -->  LASR

 

I have scheduled jobs setup as follows:

 

     Source (Oracle, SQL, etc.)  -->  HDFS

     HDFS  -->  LASR

 

As previously mentioned, the real issue is with the HDFS --> LASR job.  My data in HDFS is in SASHDAT format, and I need to push it from HDFS to LASR, which will be in SASIOLA format.  The only way that I have found to do this is via user-written code using PROC LASR ADD.  And, with this being the case, promoting from DEV to PROD causes an issue, as I have to manually change the code, which can be tedious when I have lots to move.
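For reference, the hard-coded server name typically appears in a step like the sketch below. This is only an illustration of the PROC LASR ADD pattern described above; the port, HDFS path, and host name are hypothetical placeholders, not values from this thread:

```sas
/* Load a SASHDAT table from HDFS into the LASR Analytic Server.     */
/* The hard-coded host name is exactly what has to be edited by hand */
/* after every DEV-to-PROD promotion.                                */
proc lasr add port=10010
   hdfs(path="/sasfolders/data/mytable" direct);
   performance host="devserver01.example.com" nodes=all;
run;
```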

 

Ideas?

 

Thanks,

Ricky

CHandberg
Obsidian | Level 7

So basically you just want your server-name as a variable?

 

You can use %sysget(computername)

 

Try this example:

%let path=\\%sysget(computername)\sasfolders\data;

%put &path.;

 

I hope this was what you needed...

rgreen33
Pyrite | Level 9

@CHandberg,

 

THANK YOU!!!  That is exactly what I was looking for...just a minor tweak since I am in a Linux environment.  I added the following line:

 

     %let env_HOSTNAME = %sysget(HOSTNAME);

 

Then, I referenced this variable in my LIBNAME statements (and other places as needed).  Now, this will allow me to move these jobs from one environment to another without having to make any changes (since my necessary folder structure is the same between my two environments).
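Putting the pieces together, the pattern looks something like the following sketch. The library name, port, and tag here are hypothetical placeholders, assuming (as above) that the folder structure and ports match between environments:

```sas
/* Resolve the current server name from the environment (Linux: HOSTNAME) */
%let env_HOSTNAME = %sysget(HOSTNAME);

/* Reference the macro variable instead of a hard-coded server name, */
/* so the same job runs unchanged in DEV and PROD.                   */
libname mylasr sasiola host="&env_HOSTNAME." port=10010 tag=hps;
```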

 

Thanks again,

Ricky
