Quentin
Super User

Hi,

I have a macro, and when it runs, I want it to determine which client is calling it.  So for a stored process, the client might be the Stored Process Web App, or WRS, or the Office add-in, etc.  If this code is running as part of a DI job, I want the client to be "DI Studio".  Ideally, I want the client to be DI Studio if it is actually running in DI Studio, and also if a DI Studio job has been deployed and scheduled (I know in the latter case DIS isn't really a client; it's running as a batch job).

I had looked at the numerous global macro variables available to my deployed DIS jobs and thought, aha, I'll just use &applName, which I see created in the log by some of the code generated by DIS:

%global applName;
data _null_;
applName="SAS Data Integration Studio";
call symput('applName',%nrstr(applName));
run;

But it turns out for some of my deployed jobs, this macro variable is not being created.

So I guess a few questions:

1.  What is governing whether or not &applName is created when a DI job is created/deployed?  I'm assuming it's some level of logging the admin has turned on, as it seems to be created in the same section that does some performance monitoring.

2.  Since I can't trust that &applName will always exist for a DI job, is there a reliable way (macro variable or some such) for code to dynamically determine whether or not it is executing as part of a DI job?  I could switch to &etls_jobname, for example, but I don't know if that is always populated.

3.  Obviously, for flexibility of logging etc., it will always be possible for an admin or some routine to turn on something that creates additional global macro variables.  But I'm wondering, in the BI framework, if there is a list in the documentation somewhere that says "This application (SPWA; WRS; EG; etc.) will always create these global macro variables ..." and defines them.  Without that, it feels like all those global macro variables have a lot of useful information in them, and are certainly being used by SAS itself, but are sort of "use at your own risk" for SAS developers, in terms of backwards compatibility etc.

Thanks,

--Q.

BASUG is hosting free webinars Next up: Jane Eslinger presenting PROC REPORT and the ODS EXCEL destination on Mar 27 at noon ET. Register now at the Boston Area SAS Users Group event page: https://www.basug.org/events.
14 Replies
BrunoMueller
SAS Super FREQ

It has to do with whether you have "collect runtime statistics" set to on or off.

Also be aware that, by default, the code for deployed jobs does not include the code to collect runtime statistics.

Check Tools > Options > General; there is a checkbox "Override and disable performance statistics when a job is deployed".

Bruno

Quentin
Super User

Thanks Bruno,

Can I assume that &etls_jobname will always be created in a DI job, regardless of the options checked?

[ I'm a big fan of your %binaryFileCopy() macro!  Copy a file using a SAS program: another method - The SAS Dummy ]

--Q

jakarman
Barite | Level 11

Quentin,
What do you want to achieve? I know from former posts that your relationship with IT staff and the "platform admin" task is not one of being one cooperative team.

Some technicals:

- SAS did open up the documentation for building your own clients using .NET or Java, and some are building those kinds of applications.
  As this implies, there is no fixed list of possible clients, so that list cannot be defined.

- I would also like SAS to do something better about documenting some interfaces with add-ons.

  For example, the building of plugins with EGuide and the EGP file structure is not published.

- For logging there is a complete technical framework (log4j/ARM based) that can be activated (APM, Audit Performance Measurement).

  The event manager is growing and also has its focus on the technical support staff.
- SIEM (Security Information Event Monitoring) and Performance & Tuning are special technical areas with requirements set as a "standard of good practice".
  Tuning & monitoring are skills that have many relations with the other middleware involved and the underlying OS.

  The high-level requirements are things like ISO 27k, COBIT, HIPAA, SOX-404, and Basel. It is a pity SAS Institute is missing too much of those backgrounds.
- The SAS or other tool used is not really important. It is the business processes, risk/impact, cost/profit (and more) that are leading.

When you are getting a question that goes into that technical area...
Why are you not going to invest in getting a better relationship with IT-staff and your platform-admin? 

---->-- ja karman --<-----
Quentin
Super User

Hi Jaap,

First, I should clarify that I actually have a very good / cooperative relationship with our SAS Admin.  In former threads, you and I have discussed different approaches to balancing features/options/etc that should be set at the enterprise level, server context level, group level, developer level, or job level.  And the benefits of maximizing individual developer autonomy and benefits of maximizing enterprise standardization/efficiencies.  I see these as healthy discussions (with you and similar with our admin).  Perhaps I have gone too far in those threads in painting a "straw man" admin as "Mordac The Preventer", but that is certainly not representative of my own SAS admin.

In this case, what I had assumed to be a server setting (a change in logging) turned out to be a client setting which I am responsible for, as Bruno pointed out.  I think this is a reasonable question for the forums.

Finally, what is it I want to achieve:

SAS has the construct of a DI Studio job.  When a macro executes, it is often helpful to know what "environment/context/client" (used loosely, I do not mean which server context) invoked it.  For example, is this macro being invoked by interactive Display Manager SAS?  A batch job?  Enterprise Guide?  Stored Process called by Excel?    So in this case, I am looking for a reliable way for a macro to determine if it is executing as part of a DI Studio job (including a deployed job).  So if &etls_jobname is always generated by a DI Studio job, that is sufficient for my current need.  (Yes, I realize deployed DI jobs are also batch jobs.)
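To make that concrete, here is a minimal sketch of the kind of check I have in mind. It assumes that &etls_jobname is reliably created for every DI Studio job, which is exactly the open question in this thread, so treat it as illustrative rather than definitive:

```sas
/* Sketch: detect whether code is executing as part of a DI Studio job. */
/* ASSUMPTION (unconfirmed in this thread): DIS-generated code always   */
/* creates the global macro variable ETLS_JOBNAME.                      */
%macro isDISJob;
  %if %symexist(etls_jobname) %then 1;
  %else 0;
%mend isDISJob;

/* Usage: branch on the result inside any macro */
%put NOTE: Running inside a DI Studio job? %isDISJob;
```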

Thanks

LinusH
Tourmaline | Level 20

And Stored Process. And Enterprise Guide. And DMS SAS.

Data never sleeps
LinusH
Tourmaline | Level 20

As Jaap, I don't see what you are trying to achieve. It seems that you are confusing the current client with the tool that generated the program/job.

Yes, &etls_jobname is always generated for DI Studio created jobs. But there are tons of clients that can execute deployed jobs, such as the SAS Batch Server (via external schedulers), the SAS Stored Process Server, SAS Foundation (DMS) sessions, SAS/CONNECT sessions, etc. So, what is it that your macro is intended to do?

&SYSPROCESSMODE gives a little more ("SAS Workspace Server" for a DI Client), but not the full story.
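A quick way to see what it reports in any given session (the example values in the comment are typical, but what you actually get depends on your release and session type):

```sas
/* Inspect the automatic macro variable that names the session type */
%put NOTE: SYSPROCESSMODE=&sysprocessmode;
/* Typical values include "SAS Workspace Server", "SAS Batch Mode", */
/* "SAS DMS Session", "SAS Stored Process Server"                   */
```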

Data never sleeps
boemskats
Lapis Lazuli | Level 10

Quentin,

I appreciate this isn't the answer you're looking for, but here is the way I've approached this problem when I've wanted to know which runtime (logical appserver context), rather than client, was running some code: I'd set a global macro or environment variable in the various runtime autoexecs / sourced shell scripts (e.g. %let myservertype=workspace; in autoexec_usermods.sas under SASApp/WorkspaceServer/), read it back in whatever code was being called, and then log that alongside some useful macro variables, depending on what I know is available for that given runtime. So, for example, if my global variable resolved to 'STPApp' I'd know I can log &_METAPERSON and some of the other vars that tell me which machine the web app was called from, etc.; if my variable resolved to 'workspace' I'd know I needed to run a SYSGET to get the same information.

You'll find it difficult to find a standard way of identifying the client program calling a piece of code, as not all of them report this information in a standardised way, and some not at all. However, it is possible to build a macro with logic that starts with the method I describe above, and then uses the system-set macro variables for each type of session to drive some further logic (testing which macro vars are available, what their values are, and what certain system options are set to) to deduce the information you want to know from a given execution. It's not difficult, but you'd need to do something like run a proc options; run; %put _all_; for each runtime to find out what's available and build your logic.
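A rough sketch of that two-step approach, with hypothetical names (myservertype is the flag I invented for this illustration, and the %sysget(USER) call assumes a Unix host):

```sas
/* Step 1: in each server-context autoexec (e.g. autoexec_usermods.sas */
/* under SASApp/WorkspaceServer/), stamp the runtime with a flag:      */
%let myservertype=workspace;

/* Step 2: in shared code, branch on the flag and on which system-set  */
/* macro variables exist in this kind of session.                      */
%macro logCaller;
  %if not %symexist(myservertype) %then %let myservertype=unknown;
  %if &myservertype = STPApp %then %do;
    %if %symexist(_metaperson) %then %put NOTE: caller=&_metaperson;
  %end;
  %else %if &myservertype = workspace %then %do;
    %put NOTE: host user is %sysget(USER);  /* Unix; USERNAME on Windows */
  %end;
  %else %put NOTE: runtime is &myservertype;
%mend logCaller;
%logCaller
```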

hope this helps

Nik

Quentin
Super User

Thanks Nik,

What you suggest is very much what I have been doing (looking at system-set macro variables and environment variables to deduce the type of session).  And indeed, my initial approach was to run %put _all_ ; in each type of session, and make lists of the global macro variables in each, to develop my logic.

I was just unlucky that in the DI jobs I checked (both deployed and non-deployed), they all had &applName defined, so I assumed I could trust it would always be generated for DIS jobs.  This assumption turned out to be false.  So now I am hoping that &etls_jobname will always be defined for a DI job.

This sort of work would be much easier if the documentation defined which macro variables were created by which clients.  As you say, without that it becomes a "look and see" process to discover, for example, that the Office add-in sets &_Client=SAS Add-In for Microsoft Office, and EG does not set &_Client, but does set &_ClientApp='SAS Enterprise Guide'.  And my fear is that because these macro vars are not documented, SAS may feel they can change them from version to version without worrying as much about backwards compatibility.

All the more reason to develop utility macros %GetClient() %GetServerContext() %GetOperatingSystem() etc, so that logic which determines this environmental information can be encapsulated, and code can be made conditional upon environment when necessary.
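As a sketch of what such a utility might look like, here is a hypothetical %GetClient(). The variables it tests (&etls_jobname, &_ClientApp, &_Client) are the undocumented, "look and see" ones discussed in this thread, not a guaranteed contract:

```sas
/* Hypothetical %GetClient() utility encapsulating the deduction logic. */
/* ASSUMPTIONS from this thread: DIS jobs set ETLS_JOBNAME; EG sets     */
/* _CLIENTAPP (quoted, e.g. 'SAS Enterprise Guide'); the Office add-in  */
/* sets _CLIENT.  None of these is formally documented as guaranteed.   */
%macro GetClient;
  %if %symexist(etls_jobname) %then DI Studio;
  %else %if %symexist(_clientapp) %then %qsysfunc(dequote(&_clientapp));
  %else %if %symexist(_client) %then &_client;
  %else Unknown;
%mend GetClient;

/* Usage: log the deduced client from any macro */
%put NOTE: client=%GetClient;
```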

I like your idea of expanding this to provide server context information via macro vars defined in the server autoexec.

Thanks,

--Q.

jakarman
Barite | Level 11

Quentin, I'm still not getting your original question. You say what you assumed to be a server setting (a change in logging) turned out to be a client setting. A lot has been done before this question.

- What is the requirement the why's, when's and what with that change in logging?
- How did you come to the conclusion it must be a client setting?

Yes, you are right, it is very reasonable for discussion. But please tell a little more about what it is about.  A lot of people are interested.

What I know of DI is:

- It is meant for ETL development, generating code for batch processing or to be executed as a stored process.

- The generated code should go into an LCM approach with release management, as developers are normally not allowed to develop within a production environment.
   By that you will get several segregated environments, and you can define different log levels for each environment.

   This solution was already mentioned (Nikola Markovic).
   I could add checking some SYSGET OS variables when they have meaningful information (the set at the prompt level).
- An interactive environment (DI, EGuide) has slightly different default settings than a batch run (error recognition/recovery).

- You can define pre/post processing code but that will get included everywhere as part of the code.

- When you are running jobs with DI it is possible to have statistics maintained in the metadata.
  That is not possible when using a scheduler, as the scheduler will handle that.

If you need some audit and performance information, you have APM to be added at the server settings.

I have seen the clients (EGuide) embedding some additional code. It is part of the desktop settings. It is possibly easier, when needed, to exclude specific clients (not EGuide, not ...) than to look for a generic setting. I expect it has not been designed at SAS and is constantly changing: SP old style / SP new style, other new clients.


---->-- ja karman --<-----
Quentin
Super User

Hi Jaap,

I think we're not communicating well.

When I noticed that one of my older deployed DI jobs did not have &applName defined, I assumed (guessed) it was because of a change in server logging at some point since that job had been deployed (a couple years ago).  There was little basis for this assumption, as I had not actually investigated what was the source of the &applName variable.

Bruno helpfully pointed out that there are client settings in DI Studio ("collect runtime statistics" and "override .... when deployed") which can control whether or not the SAS code generated by DI Studio creates this macro variable &applName.  It is likely that when I deployed that job, I had these options off, and I only noticed that &applName is missing in this one job because I've just started using &applName.

So now, knowing that &applName is not always created for a DI Studio job, and &etls_jobname is always created for a DIS job, I have enough information for my need.  Even without Nik's helpful suggestion to consider adding more global macro vars to identify server context.

Thanks,

--Q.

jakarman
Barite | Level 11

That is some other information than your original header. I know some settings are generated as DI client settings.
These are used as default settings that end up as job properties.

With the latest version of DI there are some enhancements to mass-change those settings so you can redeploy those jobs with correct settings.
I had those issues, as some of them were causing hard-coded port numbers and server names to be included in the code. Very recognizable as being DI code, but making it difficult to get those into a release management process. These settings/variables are not necessary and I believe they are going to be eliminated.

---->-- ja karman --<-----
jakarman
Barite | Level 11

Quentin,
your list of macro vars is to be found at: SAS(R) 9.3 Stored Processes: Developer's Guide

The operating system info/type is part of the standard SAS macro variables: SAS(R) 9.3 Macro Language: Reference. The value of the OS is changing all the time.
The server context I have seen being part of OS scripts.
I provided that approach of using that information as a service for programmers when I was a SAS admin.

---->-- ja karman --<-----
Quentin
Super User

Thanks Jaap, indeed that list is useful.  But note the text at the top:

"Some reserved macro variables are created automatically for all stored processes that are running on a particular server. Some are created by specific stored process client or middle-tier interfaces and are not created or available when other clients call the stored process."

Thus my suggestion that it would be useful to document which clients, and which mid-tier interfaces, create which global macro variables.

As pointed out in the sentence before that, the possibility of global macro variable collisions is already there.  Given that there are so many different clients and interfaces that can create global macro vars, one would assume that they all have unique names.  In which case it wouldn't be too hard to make a list documenting all of the global macro variables that these clients/interfaces create, and document which clients/interfaces create which global macro variables in which circumstances....

jakarman
Barite | Level 11

Quentin, the list is of course not complete, but as you see from the columns where each variable is used, it is already a combination.

I like documented items as they are easier to follow with release changes.

There are a lot of web-based ones (_session, _debug); I recognize them as the same as SAS/IntrNet inherited from a web server. You see some client-based settings, as with MS Office and the related Office product. When we take this blog, SAS logs in Enterprise Guide: Where's the beef? - The SAS Dummy, you see the common EGuide macro var settings.
You do not need to test for content; just the existence (%SYMEXIST) should do for first recognition.

There are a lot more clients; E-Miner adds a lot more. And every solution can have other ones.
It is so much that I do not believe SAS is able to make that kind of docs.
For the SAS catalog types I cannot find a complete list either; there are also many, many catalog types.

   

---->-- ja karman --<-----


Discussion stats
  • 14 replies
  • 1973 views
  • 5 likes
  • 5 in conversation