JohnJPS
Quartz | Level 8

We are trying to facilitate automated archival of artifacts during a "move to production." Our process runs in our production environment and may include MAS components, the building of which involves work in Decision Manager. We don't really care about DM beyond supporting MAS, but we wanted to allow users to specify a DM folder; we would then automatically call the BRM_EXPORT* macros and archive it all.

 

So, the question is: is there an easy way to invoke the BRM macros in one environment when calling from a separate environment? (About all I can think of is a remote procedure call to sasbatch, with a .sas file that specifies what to run, but that just seems a bit complex.) Can we call the BRM macros from any API accessible from C#?

 

Thanks - John

1 ACCEPTED SOLUTION

Accepted Solutions
JuanS_OCS
Amethyst | Level 16

Hello @JohnJPS,

 

here are my ideas for you:

 

1- Easiest, but it might require an additional SAS license: make use of SAS/CONNECT, if you have it on both environments.

2- Create a scheduled task (local, in the source environment) that runs every X minutes and checks for an empty trigger file. If the file is present, the task executes the export and places the results in a folder shared with the destination environment, and of course deletes the trigger file once the process is finished.

3- Use remote commands, such as the Sysinternals tools if you are on Windows, or ssh if you are on Linux.

4- A fun but interesting option: create a stored process and expose it as a web service, which would allow you to call it with a normal HTTP request (SOAP, JSON) or a REST API.
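Option 2 above can be sketched in a few lines. This is a minimal polling sketch, not SAS-specific: the paths, the trigger-file name, and the batch command that would actually run the %BRM_EXPORT* macros are all assumptions you would replace with your own.

```python
# Minimal trigger-file poller (run it from a scheduled task / cron job).
# All paths and the export command are hypothetical placeholders.
import subprocess
from pathlib import Path

def run_if_triggered(trigger, outdir, export_cmd):
    """If the trigger file exists: run the export, capture a log in outdir,
    and delete the trigger so each request is processed exactly once."""
    trigger, outdir = Path(trigger), Path(outdir)
    if not trigger.exists():
        return False
    outdir.mkdir(parents=True, exist_ok=True)
    with open(outdir / "brm_export.log", "wb") as log:
        subprocess.run(export_cmd, stdout=log, stderr=subprocess.STDOUT, check=True)
    trigger.unlink()  # consume the trigger file once the process is finished
    return True
```

Scheduled every few minutes with something like `run_if_triggered("/shared/requests/run.flag", "/shared/export_output", ["sas", "-sysin", "/sas/code/brm_export.sas"])`, where the .sas file and flag path are again assumptions for your environment.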

 

The trick is to create as small and simple components as possible, with easy communications/interfaces too. 🙂


3 REPLIES 3

JohnJPS
Quartz | Level 8
I actually went with #4... what I'm doing is pretty trivial, and it seemed like overkill to do it any other way. The stored process is just a few lines of Base SAS, and calling it via SASBIWS/JSON is a great, simple solution. Thanks!
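For reference, a call to a stored process exposed through SAS BI Web Services (SASBIWS) is just an HTTP POST, so it can come from C# or anything else. The sketch below only builds the request (no network call); the server name, port, stored-process path, parameter names, and the exact endpoint shape are assumptions -- check your deployment's BI Web Services documentation, and note that a real call also needs authentication.

```python
# Build a JSON POST request for a stored process behind SASBIWS.
# Host, port, path, and parameters below are illustrative placeholders.
import json
from urllib.parse import quote
from urllib.request import Request

def build_stp_request(server, stp_path, params):
    """Return a urllib Request for the (illustrative) SASBIWS REST endpoint."""
    url = f"http://{server}/SASBIWS/rest/storedProcesses{quote(stp_path)}"
    body = json.dumps({"parameters": params}).encode("utf-8")
    return Request(url, data=body,
                   headers={"Content-Type": "application/json"},
                   method="POST")

req = build_stp_request("sasmid.example.com:7980",
                        "/Shared Data/STP/brm_export",
                        {"dm_folder": "/Decision Manager/MyRules"})
print(req.full_url)
```

Sending it would then be a one-liner with `urllib.request.urlopen(req)` (or `HttpClient.PostAsync` on the C# side), once credentials are added.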
JuanS_OCS
Amethyst | Level 16

Good choice! I would have gone for that one as well 🙂 Glad to hear it helped

