I have a problem with hard disk space for SAS temp files. I have to run data of around 1.5 million lines every month. While SAS is processing, it creates temp files that take a lot of time and disk space and reduce processing performance. Sometimes it shows the message
"out of resource" press R to retry .....
How can I solve this problem?
Is it possible to delete temp files while SAS is running? I mean delete them manually myself, but I'm afraid that if I delete them, it will interrupt the process.
Is it possible to use some option to tell SAS not to create temp files while running, or to delete them after each step finishes?
Manually deleting work datasets while SAS is still processing is dangerous; if SAS still needs one of them, your job will fail.
You may want to look at some of the Books By Users (BBU). "Professional SAS Programming Secrets" comes to mind as useful here. You could also search support.sas.com under "efficiency" or "disk space".
Some initial ideas that affect the SAS temp file storage:
--You could add PROC DATASETS in your job stream to delete datasets you no longer need.
--If you are running out of space during sorts, you could change the sort parameters to do a TAGSORT.
--You could point your work directory to a disk drive with more disk space, hardware is cheap these days and adding a local disk drive may be better than using a network drive (this is changed in sasv8.cfg or sasv9.cfg).
--You could create datasets as VIEWs instead of work files.
--Judicious use of KEEP (for variables) and DELETE (for observations) can save lots of space.
--LENGTH statements are important in large datasets.
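For the relocated work directory, the change is a one-line edit to the SAS configuration file. A sketch, assuming the drive letter and path are hypothetical placeholders for your own local disk:

```
/* in sasv8.cfg or sasv9.cfg -- point WORK at a drive with more space */
-WORK "D:\saswork"
```

SAS reads this file at startup, so the new location takes effect the next time you launch a session.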
Except for the new disk drive, these solutions are going to require that you spend some 'quality time' examining and revising your code.
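To make the code-level suggestions above concrete, here is a rough sketch; the dataset and variable names (old1, big, id, year, amount, flag) are made up for illustration:

```
/* Delete work datasets you no longer need as soon as you are done with them */
proc datasets library=work nolist;
   delete old1 old2;
quit;

/* TAGSORT trades extra CPU time for much less temporary sort space */
proc sort data=work.big out=work.big_sorted tagsort;
   by id;
run;

/* A view is evaluated when it is read, so no intermediate file is written */
data work.subset / view=work.subset;
   set work.big;
   where year = 2005;
   keep id year amount;   /* KEEP limits the variables stored */
   length flag 3;         /* shorter numeric LENGTHs shrink each observation */
run;
```

The PROC DATASETS step is the safe replacement for deleting work files by hand: SAS itself releases the space between steps, so nothing running is disturbed.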
In addition to darryliova's comment, if you are using Windows NTFS or Solaris 10, you can get almost an 80% reduction in disk space usage by using the OS compression. SAS's compression and the OS compression are not additive; you don't make any gains by adding the SAS option to the OS compression.
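If you are not on an OS-compressed volume, SAS's own compression is still worth turning on. A minimal sketch (the dataset names are hypothetical):

```
/* Compress every dataset created from this point on */
options compress=yes;

/* Or per-dataset; BINARY often compresses numeric-heavy data better */
data work.big2 (compress=binary);
   set work.big;
run;
```

The log will report the percentage reduction for each compressed dataset, so you can check whether it pays off for your data.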