Dear fellow admins,
We are running SAS 9.4M4 with SAS Grid Manager on Linux (RHEL 7.3). This works fine as it is, but we are facing the next step in the scale of our implementation. We are planning for what we call a multi-tenant environment, where several organisational units will work in blissful isolation from each other. This separation used to be achieved with separate server contexts, until grid options sets came along. The way grid options sets take identity and group membership into account is a perfect match for our multi-tenancy. They allow us to control which queues are used and which SAS options are in effect. Very powerful stuff, and it helps with the separation as far as workspace servers and stored process servers are concerned (I think these are all IOM servers). The last piece of the puzzle is the LSF batch processing. The batch DATA step server is not an IOM server, so a batch job cannot be controlled with a grid options set.
I have read this article on integrating the Schedule Manager with Grid Manager. It explains how to use sasgsub to run the batch jobs. This command would allow us to attach grid options sets. The sasgsub command wants metadata credentials via the METAUSER/METAPASS options. The article says:
METAPASS should be changed from _PROMPT_ to the encrypted value of the password corresponding to the METAUSER value
But how can we obtain a METAPASS value from the sasbatch.sh script in a secure fashion? We don't know the users' passwords, so this sounds a bit too easy. And no, we do not want to run all batch jobs under a single administrative user; it must work for individual users. Has anyone cracked this, or found a better way? I am looking forward to your input.
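For reference, the kind of invocation the article describes would look roughly like the sketch below. This is an illustration only: the server name, port, user, and program path are placeholders, and the METAPASS value stands for an encrypted string generated beforehand with PROC PWENCODE, which is exactly the part that is hard to produce securely per individual user.

```
# Hypothetical sketch of what sasbatch.sh would need to run.
# All values here are placeholders, not from the article.
sasgsub -GRIDSUBMITPGM /path/to/job.sas \
        -METASERVER meta.example.com -METAPORT 8561 \
        -METAUSER someuser \
        -METAPASS '{SAS002}0123456789ABCDEF'   # pre-encrypted password
```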
Regards,
-- Jan
Hello @jklaverstijn,
regarding the METAPASS: have you already thought about compiled (secured) macros and/or restricting file permissions on some files? That is probably not the way to go, but it's a quick idea.
Today, at 20:00 NL time, there will be a webinar "Scheduling SAS Programs and Jobs for a Grid Environment". Will you join?
Sorry to hear it did not help. Could you follow up with the presenter by mail afterwards?
Also, perhaps @AndrewHowell can tell you better; he has presented many papers on this kind of topic.
Something else: would it be interesting to create several batch servers (on the same SASApps), and then assign priorities at scheduling time? Then each batch server can have its own queue. Another idea that just came to mind.
@JuanS_OCS wrote:
Something else: would it be interesting to create several batch servers (on the same SASApps), and then assign priorities at scheduling time? Then each batch server can have its own queue. Another idea that just came to mind.
Unfortunately, there can only be one DATA step batch server per context. So currently multiple server contexts would be the closest thing to this. It's just that the whole idea of grid options sets is to eliminate the need for multiple contexts.
Hello @jklaverstijn,
do you mean technically speaking, or according to agreements with your users?
I say this because, technically speaking, it is true that only one IOM server of each type (WKS, STP, PWKS) is allowed per SAS Application Server (or context, in your terms), and I agree with your statement regarding the options sets. But, as you said, that works only when the server connects to metadata. On the other hand, it is also possible to create several batch servers under the same context. If you do this, you can tell your SAS DI jobs to use specific batch servers.
Hi Juan,
@JuanS_OCS wrote:
..., it is also possible to create several batch servers under the same context. If you do this, you can tell your SAS DI jobs to use specific batch servers.
Can you explain how this can be done? When I try to do that in SASMC I get this list of servers that can be added to the context.
A DATA STEP Batch server is not one of them. Is there some other way?
Regards,
- Jan.
Hi Jan, @jklaverstijn,
preferably with the SAS Deployment Wizard, to ensure all the metadata ID mappings are correct. You can do it with SMC as well, but keep in mind you will need to re-create the OS folder structure and adapt the paths in the files (config and executables). Also, lifecycle-wise, the SMC option most probably won't be included in migration packages.
Does this help?
Best, Juan
Hi Juan,
Thanks for bearing with me. No matter how I do it I cannot get a second data step batch server added to a server context. If a server context already contains one it is no longer offered in the list of new server components in SMC. Likewise, when using SDW, when I add a batch server and get prompted for a server context the drop-down selection list is either empty or only shows contexts currently without a data step batch server.
We still feel this would be an excellent solution but for now it does not appear to be a configurable option.
We have talked to SAS Support and they are looking into new possibilities that seem to be new in M5. I will update this thread if anything useful comes up, or if it's a definite no.
Cheers,
-- Jan.
So after much deliberation, our SAS Technical Support track gave a final suggestion. LSF offers a mechanism called esub that I was unaware of. It allows for overruling certain options to the bsub command that ultimately calls SAS for batch and grid jobs. An esub is a script (shell, Perl, Python, ...) that can write to a predetermined output file containing new settings for a list of parameters, including the name of the queue. This allows one to use userid, group membership, and what have you to assign the job to a queue in a very clean and transparent fashion. See the IBM Knowledge Center article on esub for details.
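To make the mechanism concrete, here is a minimal esub sketch under the assumptions that jobs should be routed by the submitter's primary UNIX group and that tenant queues named `tenant_finance`, `tenant_marketing`, etc. exist; the group-to-queue mapping is purely illustrative. LSF invokes the esub at submission time, and any parameters to override are written as `VARIABLE="value"` lines to the file named by `$LSB_SUB_MODIFY_FILE`:

```shell
#!/bin/sh
# Hypothetical esub sketch: route a grid job to a per-tenant LSF queue
# based on the submitting user's primary UNIX group. Queue names and the
# group mapping are assumptions, not from the thread.

# Fall back to a temp file when testing outside LSF.
: "${LSB_SUB_MODIFY_FILE:=/tmp/esub_modify.$$}"

group=$(id -gn)

case "$group" in
    finance)   queue="tenant_finance"   ;;
    marketing) queue="tenant_marketing" ;;
    *)         queue="normal"           ;;
esac

# Overrule the queue for this submission. In a real esub, exiting 0
# tells LSF to accept the (modified) job; non-zero rejects it.
echo "LSB_SUB_QUEUE=\"$queue\"" >> "$LSB_SUB_MODIFY_FILE"
```

In a real deployment the script would be installed in `$LSF_SERVERDIR` under an `esub.<name>` filename so LSF picks it up at bsub time, which is what makes this transparent to the submitting user.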
We will further explore this approach but I feel confident that this does what needs to be done.
Regards,
- Jan.