Hi,
I have the following task; it's very easy to reproduce, so I'll describe it briefly:
From the UI side I asynchronously send two Ajax requests that create two STP sessions. The first STP session runs a macro that executes for a long time (a few hours), and during all that time the macro writes log messages into a log file in a folder on the server.
The second STP session observes this log file regularly (let's say every 10 seconds) and checks whether the log file is in use. If it is in use, it sleeps for 10 seconds; if not, that means the first macro has finished, so this second STP session parses the log file, returns the needed results, and sends them by mail to the right person.
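To make the watcher idea concrete, here is a minimal sketch of that polling loop (the log path is just a placeholder, and it assumes FOPEN in update mode fails while the first session still holds the file open; that behaviour is worth verifying on your server's OS):

/* Minimal sketch of the watcher STP logic.
   Assumptions: the log path is hypothetical, and FOPEN in update mode
   fails while the first session still has the log file open.           */
%macro watch_log(logpath=/sasdata/logs/long_macro.log);
  %local fref fid rc done;
  %let fref = ;                                      /* let SAS pick a fileref */
  %let rc   = %sysfunc(filename(fref, &logpath));
  %let done = 0;
  %do %while (&done = 0);
    %let fid = %sysfunc(fopen(&fref, U));            /* try to open for update */
    %if &fid > 0 %then %do;                          /* open worked: writer is done */
      %let rc   = %sysfunc(fclose(&fid));
      %let done = 1;
    %end;
    %else %let rc = %sysfunc(sleep(10, 1));          /* still locked: wait 10 seconds */
  %end;
  %let rc = %sysfunc(filename(fref));                /* clear the fileref */
  /* ...parse &logpath and mail the results here...                      */
%mend watch_log;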
It is very important that this process cannot be scheduled as a usual job: there is a UI that should be able to run these macros at any time.
This functionality works perfectly, except for one detail: these two STP sessions run for a long time, so they lock (book) two of the processes that manage STPs for themselves for a long time. If another user runs an STP at the same time, the response time will be longer, or the new STP can end up queued behind one of these two locked STP sessions and therefore wait a long time before it finishes, which is not acceptable...
So what I need is to execute these two macros (one that executes for a long time, and a second that observes the log file from the first) in a different, non-STP SAS session that will not impact other STP executions. It should probably be some kind of LSF session (because, as far as I know, LSF also executes deployed SAS code, which is in fact just SAS files stored on the server).
So, is it possible to create some kind of LSF session on the fly from an STP session?
I tried to create a new session (from the STP) using the CALL SYSTEM routine, which executes a .bat file with a few parameters (sas.exe, config file, SAS file to execute). The session is created, but it looks like the session it creates is also an STP session...
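For reference, this is roughly what that attempt looked like (all paths are placeholders, and it only works if XCMD is allowed on the STP server):

/* Sketch of the batch-launch attempt. Assumptions: XCMD (external commands)
   is allowed on the STP server, and all paths below are hypothetical.       */
options noxwait noxsync;        /* return control without waiting for the .bat */
data _null_;
  call system('"D:\batch\run_long_macro.bat" "C:\SAS94\sasv9.cfg" "D:\sascode\long_macro.sas"');
run;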
Maybe some RSUBMIT/ENDRSUBMIT statements should be used?
I would appreciate any help (tips, links, etc.).
If I'm reading this correctly, your first STP starts, runs until finished, and then stops. Therefore, it's not a problem, and needs no changes.
Your concern is with your second STP, which is just sitting there in a "do nothing" state until the log file indicates not in use.
My first question is to ask whether you're sure that it is a resource drain? As far as I know, on any modern capable server a process sitting there mostly in a sleep state won't impede other processes. Have you done any testing to measure this effect? If it's very minor (which is what I would predict), just keep your code the way it is.
Tom
Hi Tom,
So you wrote:
"Your first STP starts, runs until finished, and then stops. Therefore, it's not a problem, and needs no changes."
Initially I thought so too, but it looks like the STP server response time increases, because this one STP locks one (of the few) processes that manage all STP requests.
I'm not familiar with the STP server configuration; I can only guess how it works from my rather short experience.
So, let's suppose you have a UI application with one button :), and when a user clicks this button, 10 Ajax requests are sent asynchronously to the STP server. The STP server should process all of them and return some JSON or XML to the UI, but that's not so important right now.
Let's say 5 "sas" processes serve these 10 STP sessions that should be processed at the same time. Using Firebug and Task Manager on the server machine, it's easy to observe how these processes serve them: all 5 "sas" processes serve the first 5 STP sessions, and then, after a session finishes, each process picks up another session that was presumably waiting in the queue.
So, let's go back to my problem. When I don't run my application (with the two long-running STPs) but only run the one-button application, all 10 Ajax requests finish sequentially and successfully, just as I described above.
Now, when I run my problematic application that creates the 2 STP sessions that run for a few hours, 2 of the 5 processes become locked by them.
Then, when I run the second, one-button application, logically only 3 processes should serve those 10 STP requests. In fact, what I saw is that 3 sas processes serve the requests normally (finish one, then take the next, etc.), but two of the requests (the ones allocated to the 2 locked sas processes mentioned above) cannot be finished and stay pending for a long time. Those two sessions will end only after the older, long-running STPs have finished.
So, given all of this, here is what I'm thinking of doing: the UI still sends two Ajax requests to the STP server, but the STP sessions will not execute these macros themselves; they will just run some X (or CALL SYSTEM) command that creates a non-STP session, which will execute the two macros without impacting other STP users.
You wrote:
My first question is to ask whether you're sure that it is a resource drain? As far as I know, on any modern capable server a process sitting there mostly in a sleep state won't impede other processes.
You are absolutely right: my second STP, which observes the logs from the first STP, uses the SLEEP function, which consumes almost no CPU time. But the STP process is still active; it is not released while the SLEEP function executes. It locks the sas process and keeps it serving only this one particular session, as I understand it.
Have you done any testing to measure this effect? If it's very minor (which is what I would predict), just keep your code the way it is.
Yes, I ran the tests; I described the results above (the UI with one button and 10 STPs, and the other UI with the 2 long-running STPs).
Tom, as I wrote above, I am not a specialist in the admin side of STP server configuration, and I'm not 100% sure how STP sessions are distributed among the "sas" processes that serve them. But even from general logic, it seems inelegant to lock two (out of, say, 5) sas processes that manage hundreds of users' STP requests (the majority of which execute quickly, in a second or two)...
I don't think the STP server's purpose is to manage long-running processes; that is rather something the LSF batch machinery should do...
I'm not sure, but I know that the Platform Process Manager tool makes it possible to run a flow on the fly via some Process Manager command (jrun, for example).
So I'll try, from the STP session, to issue some X command that sends the right command to Platform Process Manager, which will run (on demand...) a flow with two jobs containing user-written code with the needed macros (one runs the long-executing macro, the second observes its logs)...
The paradox of this task is that it's very similar to a usual task, with only one difference: the macro executes for a long time, and that is where all the problems come from. Anyway, it's an interesting problem and I hope I'll finish it this year :).
Thanks!
Hi, Yura
Thanks for the expanded description. I'm afraid this is outside my experience; I haven't dispatched stored processes via Ajax.
I think your best bet might be to run this by SAS Technical Support, and see what they think about it.
Tom
Hi All,
I've solved the problem without creating a non-STP session from the STP.
I decided to combine “batch processing” with “STP possibilities” to implement the task.
From the STP I just create one table (or file, it doesn't matter which) on some server path.
Then I created a small flow via SAS Management Console. This flow has one file event: it waits until the file (or table) created by the STP arrives at a specific path on the server.
After this file arrives, the flow runs.
The flow has two jobs. The first job runs the long-executing macro and, at the end, deletes the table (file) created by the STP.
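Here is a minimal sketch of both ends of this trigger mechanism (the path is a placeholder; the flow's file event must watch the same path):

/* --- Stored process side: drop the trigger file and return immediately --- */
/* The path is hypothetical; the flow's file event must watch the same path. */
data _null_;
  file "/sasdata/triggers/run_longjob.trg";
  put "requested at %sysfunc(datetime(), datetime20.)";
run;

/* --- End of the flow's first job: remove the trigger file ---------------- */
filename trg "/sasdata/triggers/run_longjob.trg";
data _null_;
  rc = fdelete('trg');
  if rc ne 0 then putlog 'WARNING: could not delete trigger file, rc=' rc;
run;
filename trg clear;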
After this job finishes successfully or fails, the second job starts. This job parses the logs generated by the first job and sends the needed information to the right person via SMS/mail, etc.
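A sketch of the notification step (assuming e-mail is configured for the batch session via EMAILSYS/EMAILHOST; the address and attachment path are placeholders):

/* Sketch of the notification step. Assumptions: e-mail is configured for
   the batch session, and the address and attachment path are placeholders. */
filename report email
  to      = "someone@example.com"
  subject = "Long-running macro finished"
  attach  = "/sasdata/logs/long_macro_summary.txt";

data _null_;
  file report;
  put "The long-running macro has finished; the parsed summary is attached.";
run;

filename report clear;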
Of course there are a lot of validations and other things that I didn't mention in this post, but in short the idea is: from the STP, create only a table (file) in a specific place on the server, and create a small, simple flow that waits for this file to appear (arrive) in that place.
As I said above, the flow can easily be created and scheduled via SMC, and it can be tested from Flow Manager.
The benefits of such a solution are obvious: no need to change any config files (session macros, etc.), and it is easy to maintain and support.
I hope this will help someone with similar tasks.