The paper you mention (Margaret Crevar / Tony Brown) is a good one. An important point in it is the "monitor, monitor, monitor" advice: understand your system. This is often ignored because it costs effort with no immediately visible result. Tools like NMON give you a lot, both an interactive mode and collectors for daily and monthly trends. Monitoring has to start at the OS level.

Tuning the SAS installation itself comes second, as it depends on the OS. The SAS automatic macro variables include some that indicate the OS in use (e.g. &SYSSCP and &SYSSCPL); it is worth listing those, just to be sure. Other settings (options such as MEMSIZE, which caps memory) put limits on OS resources. On Unix (AIX) they are mostly set rather conservatively; on a Windows desktop they are set, or left, more open.

Processor speeds are not really different any more (they have been stable for the last six years), so capacity must come from the number of cores. Running a BI/DI deployment (metadata server) and a web server together with all compute servers in 6 GB of memory would be challenging. The OS has to run, then the metadata server, then the web server with its Java containers, then the SAS system, and on top of all that the SAS compute (user) processes. The first four parts alone easily need around 8 GB of RAM. A shortage of RAM shows up as thrashing: the OS memory manager becomes very busy, and that can end up being all the machine is capable of. So starting at 16 GB is a very sensible choice.

The next thing, once you have enough memory, is to check your I/O, and in more detail SASWORK. I/O is cached by default and so consumes RAM; accessing big files needs big caches for them to fit, otherwise it causes unnecessary overhead. Segregating SASWORK and optimizing I/O throughput is the next challenge.

There is a lot to review and to do to get an optimal system.
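The OS-level monitoring and the thrashing check described above can be sketched in shell. This is a minimal sketch, assuming Linux-style vmstat columns and the common nmon collector flags; the check_swap helper is a hypothetical name introduced here for illustration, not part of any SAS or nmon tooling.

```shell
#!/bin/sh
# Sketch: start an NMON collector for daily trend data.
# -f = write to a file, -s = sample interval in seconds, -c = sample count;
# 300 s * 288 samples = 24 hours. Commented out so the script runs anywhere.
# nmon -f -s 300 -c 288

# Minimal thrashing check: sustained nonzero swap-in/swap-out rates mean the
# OS memory manager is busy paging instead of running your SAS processes.
check_swap() {
    si=$1; so=$2   # pages swapped in / out per second (vmstat si/so columns)
    if [ "$si" -gt 0 ] || [ "$so" -gt 0 ]; then
        echo "WARN: swapping detected (si=$si so=$so) - add RAM or lower MEMSIZE"
    else
        echo "OK: no swap activity"
    fi
}
```

On Linux you could feed it live numbers from the last vmstat sample, where si and so are columns 7 and 8: `check_swap $(vmstat 5 2 | tail -1 | awk '{print $7, $8}')`. The collector output from nmon covers the longer daily/monthly trends the paper asks for; the swap check is the quick "am I thrashing right now" test.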