08-29-2013 02:50 PM
We recently had our 32-bit Dell laptops swapped out for 64-bit ones (Windows 7, 2.6 GHz, 8 GB RAM). Supposedly this should make my SAS programs run faster, but they aren't any faster, and they seem to tie up all the memory so I can't do anything else while they run. This wasn't happening with the older laptops. What can be done to improve things?
08-29-2013 04:13 PM
When you run a SAS program with Task Manager open, does the memory usage spike to 100%? Also, is this happening on specific tasks?
Windows 7 is notoriously bad at handling memory when you are using a lot of it. It is MUCH better at using available memory efficiently, but it also likes to gobble up a ton of extra memory.
Example: open a 5 MB Excel file and hit the save button over and over again. You'll notice that Windows 7 does not let go of the memory needed to save the file. That behaves like a memory leak, and Windows 7 is full of them. This can be troublesome with SAS.
Also, do you have 64-bit SAS and 64-bit Office as well?
There are also settings you can set globally that tell SAS "only use up to X amount of memory." That way, if SAS wants to use 10 GB but you only want it to use 4 GB, the program will still run and you will be able to multitask again.
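If it helps, that global cap is normally the MEMSIZE system option, which can only be set when SAS starts (in the sasv9.cfg file or on the sas.exe command line). A minimal sketch, with 4G as an example value to adjust for your machine:

```sas
/* In sasv9.cfg (or on the sas.exe command line):
   cap total SAS memory at 4 GB - example value, tune for your machine */
-MEMSIZE 4G

/* Inside a running SAS session you can check the current setting;
   the value is written to the SAS log */
proc options option=memsize;
run;
```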
08-29-2013 05:09 PM
Yes, when the SAS program is running the memory used gets up to 98%. We do not have 64-bit SAS. I will do some googling on the global settings to see if that helps. Thanks!
08-29-2013 06:45 PM
You're welcome. Also, I'm not sure whether you're having compatibility issues with 32-bit SAS running on a 64-bit system (you might be).
That would be a good thing to look into; unfortunately I am not a software engineer, so I can't say for sure if that is what is going on!
Keep me posted on if the global memory settings help!
08-30-2013 05:13 AM
Nice headline, it pokes some fun at age or gender. I'd say it was the women who claimed to be good at multitasking, not the men. :smileylaugh:
You are asking about tuning and performance, which is a dedicated area. It calls for a systems programmer's background, in the sense of the old IBM mainframe definition. Let me explain.
Conceptually, a computer has several types of resources:
- CPU (and everything related to it, like the GPU)
- Internal memory (the short-term memory, i.e. RAM)
- External memory (DASD, SSD, and other peripherals)
- The connections (network, cables, bus) needed to transfer data between them
Performance and tuning is all about setting the OS parameters (Windows), the middleware (e.g. the SAS system), and your application (your SAS coding approach) so that the wall-clock time is acceptable relative to your effort (human time).
Some actions will shift load from one resource to another, so knowing which resource is the most critical is important.
A common failure is thinking the CPU is the bottleneck and spending money replacing it. Wrong, wrong...
CPU: that was sometimes true in the early days of PCs, but clock speed has not improved in the last 5 years.
The CPU is usually the fastest part, and the chip has many optimizations such as cache memory (check the type/description). Running at 2 GHz, you can expect the most elementary instructions to be handled in nanoseconds.
Further speed improvements must now come from parallel processing: multi-threading, GPUs, grid processing.
External storage (IO): this is the slowest part. A typical access time of 10 ms for head movement sets the order of magnitude.
That is many orders of magnitude slower than the CPU executes. As classical DASD drives get bigger and everything is put on a single device, the storage becomes relatively slower. The capacity of this kind of storage is still growing fast; a single 1 TB drive is becoming usual.
Internal memory: this is needed for your program code as it executes and for the data being processed. It fills the gap between IO and CPU.
For processing it would be very nice to have all data in memory, bypassing the IO limits, but very commonly it doesn't fit.
You probably have 8 GB of internal memory and a 500 GB hard disk; a 250 GB SSD would be nicer.
On servers you could think of 1 TB of internal memory, with petabytes of data in a SAN.
Connections: an often forgotten limit, but everything has to be connected. The limitations range from slow (external network) to fast (internal computer bus).
Your case: a 64-bit laptop with 32-bit SAS and, connected to it, 32-bit Office.
Your SAS system and Office will run fine as 32-bit versions on that machine, but as 32-bit processes each is limited to 2 GB of addressable memory.
The 64-bit OS allows multiple processes on the machine and can hand out all of the 8 GB across your applications.
A 32-bit OS would be limited to using just 3 GB (1 GB is reserved), wasting the other 5 GB as useless.
Any speed limits in your code/tool caused by a single-threading requirement do not change; expect no gains there.
Opening Task Manager will show you the number of logical processors. How many have you got?
Windows 7 also has a performance index (the Windows Experience Index) that gives some information. What do you see there?
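You can also ask SAS itself what it sees. A quick sketch (the results go to the SAS log):

```sas
/* Report the number of logical CPUs SAS will use and the
   current memory-related limits; output appears in the log */
proc options option=(cpucount memsize sortsize);
run;
```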
The connections: this is where it gets nasty.
You are asking the question like a user in a bigger organization:
- SAS could be centrally installed, which would put your SASWORK on a remote connection.
- BitLocker could be active (it slows down IO).
- Are you accessing big datasets locally? You can run into Windows' cache behavior there, overloading memory and slowing everything down.
The description of the problem looks like a memory problem (thrashing) caused by something.
It could help to lower the memory limits somewhere, freeing memory up so that total throughput increases.
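As a sketch of that idea: some limits can be lowered inside a running session, while MEMSIZE itself is fixed at start-up. The dataset and variable names below are placeholders:

```sas
/* Lower the sort workspace so one big PROC SORT cannot
   grab most of the machine's memory (example value) */
options sortsize=512M;

/* 'work.big' and 'somekey' are placeholder names */
proc sort data=work.big out=work.big_sorted;
   by somekey;
run;
```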
Tuning and performance is hard work and complex analysis. It is thinking in multitasking.