07-22-2013 01:39 PM
Does anyone know the recommendation for the server RAM?
Say about 200 users, about 5-7 heavy power users on a daily basis, working with several billion records each.
Our current server has 6GB allocated and I want to find an argument to have them increase that. We currently have about 400GB in the work library, that is shared.
07-22-2013 02:41 PM
The minimum memory requirements for SAS are documented for each SAS release in the system requirements documents available at SAS System Requirements. Those minimum requirements vary with how the system is used, the products installed, and the SAS release.
In my experience, for SAS 9.1.3 through 9.3, 8GB has been a common amount of memory for installations that include the SAS Enterprise BI capabilities. The System Requirements documents for 9.4 recommend 16GB.
Note also that you have tagged this question as "IT Resource Management", which is a SAS solution. This question might get better answers if it were sent to a SAS installation or administration audience.
07-22-2013 03:55 PM
Typically, 4GB RAM per CPU or core is what I have seen recommended by SAS's Enterprise Excellence Center (EEC) sizing documentation as a starting minimum for a workload similar to what you have described in your original post.
To cut to the chase and avoid the guessing game, I would strongly suggest contacting your SAS rep and asking about the free sizing service SAS Institute used to offer its clients.
One other useful resource would be reading the following SAS Global Forum 2012 paper
"Guidelines for Preparing your Computer Systems for SAS®" (http://support.sas.com/resources/papers/proceedings12/363-2012.pdf)
Hope this helps,
07-22-2013 05:26 PM
What operating system is on your server and what version of SAS? These have a large bearing on memory requirements.
If you are running 64-bit Windows then the recommendation is at least 4GB per CPU core. So if you are running a small server with two dual-core CPUs (4 cores total), then at least 16GB would be recommended. Please note most modern CPUs have multiple cores.
Here is a very useful reference: http://support.sas.com/resources/papers/WindowsServer2008ConfigurationandTuning.pdf
07-22-2013 05:29 PM
No, it's not Windows; it's an AIX server, I believe.
I don't actually know much about the server side, just that it seems slow as heck to me and I want to ask them to fix it somehow, beyond just saying "make it faster". I think the RAM is an issue, because it's 6GB shared across so many users. At home on my Windows laptop (8GB RAM) I run programs on millions of obs in a few seconds, while here at work the same thing on the server takes minutes.
07-22-2013 05:41 PM
Keep in mind, having the resources (CPU, RAM) on the server doesn't mean SAS is fully utilizing them! That depends on the sasv9.cfg / sasv9_local.cfg configuration.
The best way to see what your SAS server session has allocated to it is to issue the following statements:
OPTIONS FULLSTIMER MSGLEVEL=I;
PROC OPTIONS GROUP=PERFORMANCE; RUN;
The PROC OPTIONS output will tell you how many resources are allocated to your SAS server session.
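If you only care about the memory-related values, a narrower variation is possible (a sketch; the OPTION= list form requires SAS 9.2 or later):

```sas
/* Report only the options that cap a session's memory use.    */
/* MEMSIZE, SORTSIZE and REALMEMSIZE are standard SAS options. */
proc options option=(memsize sortsize realmemsize) value;
run;
```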
07-22-2013 07:08 PM
In that case this paper is of more relevance:
The recommended minimum RAM for a SAS Compute Server is 16GB for 4 cores, and more if there are more cores. I suggest you find out how many cores your AIX server contains, then multiply that by 4GB to come up with a recommended total RAM.
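As a back-of-the-envelope check, that cores-times-4GB rule can be scripted. On AIX, `lparstat -i` is the usual way to see the online virtual CPU count; the values below are illustrative placeholders, not your actual server:

```shell
# Sketch of the cores x 4GB sizing guideline (placeholder values).
CORES=4            # replace with the core count from: lparstat -i
GB_PER_CORE=4      # guideline from the SAS sizing papers
echo "Recommended minimum RAM: $(( CORES * GB_PER_CORE )) GB"
```

With these placeholder values it prints "Recommended minimum RAM: 16 GB".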
07-23-2013 01:54 AM
The mentioned paper (Margaret Crevar / Tony Brown) is a good one. An important point is its "monitor, monitor, monitor" advice (understand your system). This is often ignored because it costs effort with no obvious immediate result.
Monitoring tools like NMON give you a lot (interactive mode plus collectors for daily/monthly trends). Monitoring has to start at the OS level; tuning the SAS installation comes second, as it depends on the OS.
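For the collector side, a minimal sketch of an nmon recording run on AIX (the -f/-s/-c flags are standard nmon options for file output, snapshot interval, and snapshot count; adjust to taste):

```shell
# Collect one day of data: one snapshot per minute, 1440 snapshots.
# -f writes a hostname_date_time.nmon file for later trend analysis.
INTERVAL=60
COUNT=1440
# nmon -f -s "$INTERVAL" -c "$COUNT"   # uncomment on the AIX server
echo "$(( INTERVAL * COUNT / 3600 )) hours of data per collection run"
```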
The SAS automatic macro variables include some that identify the OS version in use; you could print a list of those, just to be sure. Other settings (options like maximum memory) set limits below what the OS resources allow. On Unix (AIX) they are usually not set very high, whereas on a Windows desktop they are left more open.
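A quick way to print those identifiers from a session; SYSSCP, SYSSCPL and SYSVLONG are standard SAS automatic macro variables:

```sas
/* Identify the host OS and SAS release of the current session */
%put Operating system: &sysscp (&sysscpl);
%put SAS release:      &sysvlong;
```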
Processor clock speeds have not really changed (they have been stable for the last six years), so additional capacity must come from the number of cores.
Running a BI/DI deployment (metadata server) and a web server together with all the compute servers in 6GB of memory would be challenging. The OS has to run, plus the metadata server, plus the web server with its Java containers, plus the SAS system itself, and on top of all that the SAS compute/user processes. The first four parts alone easily need around 8GB of RAM. A RAM shortage shows up as thrashing: the OS memory manager becomes very busy, and that can end up being all it is capable of doing. So starting at 16GB is a very sensible choice.
The next thing, once you have enough memory: check your I/O (in particular SASWORK). I/O is cached by default and therefore consumes RAM; accessing big files needs big caches to fit them, otherwise it causes unnecessary overhead. Segregating SASWORK and optimizing I/O throughput is the next challenge.
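As a sketch of what that segregation can look like in sasv9_local.cfg, assuming a dedicated fast filesystem mounted at /saswork (the path and sizes here are illustrative assumptions, not recommendations for your server):

```sas
/* sasv9_local.cfg fragment -- path and sizes are illustrative */
/* Point WORK at its own dedicated, fast filesystem            */
-WORK /saswork
/* Per-session memory ceiling                                  */
-MEMSIZE 2G
/* Memory available to sorting                                 */
-SORTSIZE 1G
```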
There is a lot to review and to do to get an optimal system.