06-09-2017 07:27 PM
Using a new stand-alone computer that has 32 GB of RAM and a 6-core processor. I'm trying to tell SAS to go ahead and use as much of those resources as possible.
The trouble is, with the settings below, a job runs SLOWER than on my previous, average computer.
Please advise on which of these options likely needs to be changed:
-DMSLOGSIZE = 999999
-LRECL = MAX
-MEMSIZE = MAX
-REALMEMSIZE = MAX
-SET SAS_NO_RANDOM_ACCESS = "0"
-SORTSIZE = MAX
-SUMSIZE = MAX
Thanks very much!
06-09-2017 08:17 PM - edited 06-09-2017 08:19 PM
I'm not an expert on this stuff, but if I recall correctly you shouldn't set REALMEMSIZE bigger than the amount of physical RAM your system has. There are diminishing returns on the various buffer-related options, and what works best is somewhat filesystem-dependent. I think I just wrote a quick shell script that iterated over a range of values to find the sweet spot.
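That kind of sweep can be sketched in a few lines. Everything here is illustrative: test_job.sas is a made-up file name, and the loop only prints the command it would time, so replace echo with the real (timed) invocation on a machine where SAS is installed.

```shell
#!/bin/sh
# Hypothetical sweep sketch: print the SAS batch command that would be timed
# for each candidate BUFSIZE. "test_job.sas" is a placeholder name.
sweep_bufsize() {
  for bufsize in 4k 16k 64k 128k 1m; do
    echo "time sas -sysin test_job.sas -bufsize $bufsize -fullstimer -noterminal"
  done
}

sweep_bufsize
```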
06-09-2017 08:18 PM
Setting MAX for so many options creates a lot of system overhead.
Try changing the options to:
-BUFNO 3
-BUFSIZE MAX
-CPUCOUNT 6                      /* is your PC a 6-CPU machine? */
-DMSLOGSIZE 999999
-IBUFNO 3
-IBUFSIZE MAX
-LRECL MAX
-MEMSIZE 0                       /* 0 means use MAX, spilling to disk if needed */
/* omit -REALMEMSIZE to use its default value */
-SET SAS_NO_RANDOM_ACCESS "0"
-SORTSIZE MAX
-SUMSIZE MAX
-THREADS
-UBUFNO MAX
With so many options, you should change one option at a time and check performance.
06-09-2017 11:01 PM - edited 06-09-2017 11:02 PM
Performance tuning can be quite intense. What's going to work best for you really depends on your environment and on how you're going to use SAS, e.g. whether your processes are mainly I/O-bound or mainly CPU-bound, and whether they are often multithreaded or not.
Access to SAS tables, for example, is single-threaded up until SAS Viya, so how many CPUs you've got won't make any difference there.
Your first step will be to implement valid test cases for your actual SAS usage. You will then also have to make sure that there aren't any other processes running while you're testing.
Changing one option at a time, through all possibilities for each option, would take a year or so to complete.
You could write a script which batch submits your test cases. And you could implement a loop which passes in different parameter values to the SAS batch command and then just collect the stats. That would allow you to test a lot of cases in a fully automated fashion.
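A minimal sketch of that loop, with placeholder names throughout (testcase.sas, the option grid) and the actual sas call commented out so the scaffold runs even where SAS isn't installed; extract_real_time pulls the FULLSTIMER "real time" line out of a log for stats collection:

```shell
#!/bin/sh
# Sketch of automating the test runs over a grid of option values.
run_sweep() {
  for memsize in 2G 8G 16G; do
    for sortsize in 1G 4G; do
      log="test_${memsize}_${sortsize}.log"
      # On the real machine, uncomment the batch submission:
      # sas -sysin testcase.sas -memsize "$memsize" -sortsize "$sortsize" \
      #     -fullstimer -log "$log" -noterminal
      echo "would run: memsize=$memsize sortsize=$sortsize -> $log"
    done
  done
}

# Pull the first FULLSTIMER "real time" line out of a SAS log and keep
# just the value and its unit.
extract_real_time() {
  grep -m1 "real time" "$1" | awk '{print $(NF-1), $NF}'
}

run_sweep
```

Collecting the extracted timings into one file per sweep gives you the comparison table with no manual work.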
06-09-2017 11:26 PM
If I were you I would start with the SAS defaults for the options you listed. How does performance compare to your previous computer just with the defaults? You need a benchmark before you start tweaking, and the defaults will provide that. There is a good reason for the default values: they are generally pretty efficient for most types of usage.
06-11-2017 07:46 PM - edited 06-11-2017 07:51 PM
libname XX spde 'c:\temp\' partsize=200g compress=binary;
should work better:
- normally no need to split data files when there is only one path, I/O is usually the bottleneck, and you might as well keep it sequential
- much reduced I/O with SPDE's fantastic compression algorithm
06-10-2017 05:25 AM
Before you start changing settings, get a clear picture of your performance issues. This means using tools to watch memory and CPU usage, I/O throughput, and the "busy" state of the disks. Misconfigured storage can slow a supercomputer down to a crawl.
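For reference, a sketch of what that watching might look like, assuming a Linux host with the procps/sysstat tools installed (on Windows, Performance Monitor covers the same counters); the snippet just prints the commands one would run in a second terminal while the SAS job executes:

```shell
#!/bin/sh
# Illustrative only: typical Linux monitoring commands to run alongside
# the SAS job (vmstat from procps, iostat from sysstat -- an assumption
# about the host; not part of SAS itself).
monitor_cmds='vmstat 5          # memory, swap in/out, CPU and I/O-wait
iostat -dxm 5     # per-disk throughput and %util (how busy each disk is)'

echo "$monitor_cmds"
```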
Allowing SAS to use all resources might deprive the operating system of the chance to optimise resource use itself. SAS provides whitepapers for all supported platforms that deal with optimising the settings for a given platform. In my experience, they have been very helpful (SAS on AIX/pSeries).