SAS Data Integration Studio, DataFlux Data Management Studio, SAS/ACCESS, SAS Data Loader for Hadoop and others

DI Studio Deploy Job as STP, storing code in metadata

PROC Star
Posts: 1,322

Using DI Studio 4.901, I just noticed the ability to deploy a job as a stored process via right-click job -> Stored Process -> New (9.3).  It brings up the familiar New Stored Process wizard, just as in SMC.

 

I can get this to work if I accept the default: Store source code on application server.  The job code is written to the .sas file specified.

 

But if I change it to store the source code in metadata, it doesn't work.  When I click Next, it errors with "You must click the 'Edit Source Code' button and provide source code for the stored process."  But of course I don't have the source code; it's meant to be generated from the DI job.

 

Is it possible to deploy a job as a stored process, and have the STP source code stored in the metadata?

Accepted Solutions
06-29-2017 11:29 AM
SAS Super FREQ
Posts: 97

Re: DI Studio Deploy Job as STP, storing code in metadata

It looks like you have run into a situation where the reuse of a common module (the SMC wizard for creating stored processes) inside DI Studio forces you to add source code as if DI Studio were not in the picture at all.

 

If you wanted to, you could open the job, generate the code, copy it, and paste it into the wizard window that lets you enter the text for the metadata object, though we wouldn't really recommend this. The reason is that if you later redeploy the job from DI Studio, there is no code path that regenerates the SAS code and writes it to metadata.  So even if you redeploy, the code in the metadata object will remain unchanged. And we provide no interface to the metadata object that the code is stored in.

 

The short answer is that what you are asking for is not possible in DI Studio today. We will look at this further and see if it makes sense to add a new feature in a later release to address this behavior.

 

Ron



All Replies

PROC Star
Posts: 1,322

Re: DI Studio Deploy Job as STP, storing code in metadata

Posted in reply to RonAgresta

Thanks @RonAgresta, that makes sense.  It's not a big problem for my use case, as most of my DI jobs just %include a MakeData.sas program.  So I can just as well make my stored process %include MakeData.sas.  I was mostly curious whether I had missed something.
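For anyone finding this later, a minimal sketch of what such a stored-process wrapper might look like — the path to MakeData.sas is hypothetical, and the _webout step assumes the stored process is registered with streaming output:

```sas
/* Stored process source: refresh the ETL data on demand.              */
/* The path below is hypothetical; point it at your own program.       */
%include "/sasdata/programs/MakeData.sas";

/* Optionally stream a confirmation back to the SPWA user.             */
/* _webout is only available when the STP uses streaming output.       */
data _null_;
   file _webout;
   put "Data refresh completed at %sysfunc(datetime(), datetime20.).";
run;
```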

 

While we're on this point, my use case is that I have a scheduled DI job / LSF flow which runs an ETL job nightly. But I wanted to give users a way to launch the job ad hoc to refresh data in the middle of the day.  These users will have metadata identities because they are running stored processes via the SAS Stored Process Web Application (SPWA).  So my thought was to make an STP they could run to invoke MakeData.sas.  This seems to work fine, despite the fact that the STP runs on a different server than the ETL job.

 

Is there a better way you would think of for surfacing ETL jobs to naive users?  That is, giving them a way to start a job or flow?

 

Even as a developer I don't have access to LSF Flow Manager, so that is not an option.  I create the DI jobs and deploy them, but an admin account creates the flows and schedules them.

 

Just curious if I'm missing something more obvious.
