Hi. We have just received our new Dell machine and have SAS 9.2 running on it. It has Windows 7, 8 hyper-threaded processors, and 24GB of memory. On a daily basis we are pulling data and running models from a table with around 170MM records. At this point the new server isn't running any faster than any of our other old machines when it comes to SAS. When something I run finishes, the CPU time and real time are off by more than 50%, so obviously something isn't set right. We do a lot of sorting, indexing, and loops. Can someone help me out on getting the correct configuration set up? I'm thinking it may be the .cfg file?? Thanks in advance for any help!!
You mentioned you are processing reasonable quantities of data but you haven't mentioned how your storage is configured. Where is the data you are reading coming from, where is it being written to, where is your work library pointing? Are they on SAN, network file systems, local single disks, local RAID0/1/5/10 arrays etc. Are they all on the same device or on separate devices?
I don't know exactly what type of processing and models you are running, but in my experience some of the best improvements in elapsed time, when processing non-trivial amounts of data, usually come from optimizing I/O subsystems. Do you know if your processes are I/O bound, CPU bound or memory bound? If you have increased CPU and RAM with no apparent change and have made no changes to I/O then it suggests perhaps I/O bound.
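One quick way to gather that evidence is the FULLSTIMER system option, which makes each step write extended resource statistics to the log. A sketch (the dataset and library names below are placeholders, not from your setup):

```sas
/* Turn on detailed per-step resource statistics in the log */
options fullstimer;

/* Example step to profile; mydata.big is a placeholder name */
proc sort data=mydata.big out=work.sorted;
    by zip9;
run;
/* In the log, compare "real time" against "user cpu time":
   real time much larger than cpu time, with low CPU utilization
   in Task Manager, usually means the step is waiting on I/O */
```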
Once again, I don't know your environment that well, but based on what you have said so far that is where my thoughts would be leading. I would usually recommend running tests to determine where bottlenecks are before deciding on what type of hardware to invest in.
Hey Paul. I'm running local RAID0. The data is on a separate drive on the system called D:, it's writing to that drive, and the library is pointing there as well. The biggest tests I have run are sorting the 170MM record table by Zip9, building an index on Zip9, and making a brand new table off of the 170MM records. Not very impressed with the results. Can you talk about optimizing I/O a little? I really appreciate your help!!
I am only suggesting that I/O is one of the additional areas to look at. I would recommend watching the process whilst it is running and getting an idea of CPU, memory and I/O utilization for the process.
I assume your D: drive is on the RAID0 array? Do you know how many disks you have in the array, what type of disks they are and what type of controller they sit on? Do you know where your SAS WORK library is configured to go? Is it also targeting the same RAID0 array or a different storage device?
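If WORK does turn out to share the same array as your data, you can redirect it, and raise the sort/memory ceilings, in sasv9.cfg. A sketch only: the path is a placeholder and the sizes are illustrative starting points, not tuned recommendations for your workload:

```sas
/* sasv9.cfg fragment (illustrative values only) */
-WORK "E:\saswork"   /* point WORK at a different physical device than D: */
-MEMSIZE 4G          /* per-session memory ceiling */
-SORTSIZE 1G         /* memory PROC SORT may use before spilling to disk */
-CPUCOUNT 8          /* CPUs available to threaded procedures */
-THREADS             /* enable threading for procs such as SORT */
```

Separating WORK from the data drive helps because a big sort reads from one location and writes utility files to the other, instead of making one array do both at once.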
What sort of results are you getting? Do you have a SAS log fragment you would want to post that includes the code that was run, and elapsed times obtained? Can you tell us how many gigabytes your SAS datasets are in the file system and how much elapsed time it takes to create them from something like a proc copy? This should give us a very rough idea of average I/O throughput for the process.
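A rough throughput benchmark along those lines might look like this (library paths and the member name are placeholders); dividing the dataset's size on disk by the step's real time gives an approximate MB/s figure:

```sas
options fullstimer;

libname src "D:\data";    /* source library - placeholder path */
libname tgt "D:\bench";   /* target library - placeholder path */

/* Copy one large dataset and note the real time in the log */
proc copy in=src out=tgt;
    select bigtable;      /* placeholder member name */
run;
```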
Is there any other concurrent processing occurring on your machine that might be competing for resources with your SAS program? When you run your benchmark tests are you confident that there is nothing else running on the machine?
BTW out of interest I assume that with that much RAM you are also using 64-bit SAS on 64-bit Windows? I also assume your SAS processing occurs on the physical machine and not in a virtual machine?
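If you want to confirm from inside a session, the automatic macro variable SYSSCPL reports the host platform; on 64-bit Windows SAS the value typically starts with X64 (the exact string varies by release), and the log header at session start also names the platform:

```sas
%put Platform: &sysscpl;   /* values starting with X64 indicate 64-bit Windows SAS */
%put SAS version: &sysver;
```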