06-16-2015 01:59 AM
I'm running SAS 9.4 on Windows 7 64-bit as a VM under Parallels on a Mac Pro (12 cores, 64GB RAM). I've assigned 8 cores and 48GB to the Windows 7 VM.
I'm using a tool for linking data sets, which uses SAS as its database. The data are stored on an external 1TB SSD connected to the Mac Pro via Thunderbolt 2. The tool generates a new table from two tables using SQL scripts and starts 4 sas.exe processes, each of which uses only about 100MB of RAM. The data sets are around 40-60MB, with about 400,000 records each. The tool splits the process into chunks and creates temporary SAS files of 300-500MB, one per running process. These files could easily be held in RAM; instead, SAS writes them to the hard drive, and I see a lot of I/O in the \temp\SAS Temporary Files\ folders.
How can I force SAS to use the existing RAM?
I've been experimenting with
- MEMSIZE (between 4G and 40G)
- MEMMAXSZ 4G
- SORTSIZE 2G-12G
- BUFSIZE 8k-32k
- BUFNO 100-500
None of these settings has improved overall performance, and none of them persuaded SAS to use the available RAM.
Any ideas how I could increase performance by using the existing RAM?
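For reference, here's how I'm setting the options, as invocation options in sasv9.cfg (the values below are just examples of what I've tried, not a recommendation):

```
/* sasv9.cfg excerpt -- values illustrative */
-MEMSIZE 40G      /* upper bound on memory SAS may use      */
-SORTSIZE 8G      /* memory available for sorting           */
-BUFSIZE 32K      /* page size for SAS data sets            */
-BUFNO 200        /* number of buffers for data set I/O     */
```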
Thanks for any help.
09-13-2015 10:17 AM
A pity that you don't use a Unix-based OS, because there you could easily define a RAM file system.
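On Linux, for example, a RAM-backed file system can be mounted and SAS's WORK and utility-file locations pointed at it. A sketch, for completeness (mount point and size are illustrative, and mounting requires root):

```
# create a 16 GB RAM-backed file system (tmpfs; size illustrative)
mkdir -p /ramwork
mount -t tmpfs -o size=16g tmpfs /ramwork

# point SAS WORK and utility files at it when starting SAS
sas -work /ramwork -utilloc /ramwork
```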
Anyway, SAS still provides functionality that can improve your performance considerably. You may want to read the following links:
You can focus on using MEMLIB and MEMCACHE. For these, I love this paper: "Why Aren't You Using MEMLIB?"
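On Windows, MEMLIB can be applied either to the Work library at startup or to a specific library via a LIBNAME statement. A minimal sketch, assuming the memory pool is sized with MEMMAXSZ at invocation (the path and sizes below are illustrative):

```
/* in sasv9.cfg or at SAS invocation: put Work in memory and
   size the pool used by MEMLIB/MEMCACHE (sizes illustrative) */
-MEMLIB
-MEMMAXSZ 8G

/* alternatively, inside a session: hold one library's data
   sets in RAM (path illustrative)                            */
libname fast "d:\linkage\work" memlib;
```

With 48GB assigned to the VM and temporary files of only 300-500MB per process, a memory library should comfortably absorb that I/O.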
Furthermore, if you would like more advanced features, you can look into the LASR server licenses.
Let us know how it goes.