500 GB of what? Memory, disk space, I/O throughput?
I have processes that can occupy more than 500 GB of my disk while executing. My question is whether there is any way, from SAS, to detect when a process starts to occupy a lot of space and kill it.
Would it be of value to see if the process could be broken into smaller steps?
Maybe process 1/4 of the data, then cycle back and do the next 1/4, until it finishes.
Or does the end result require the full 500 GB?
The data can't be divided into smaller parts. I want to control the processes that are executed by placing a cap, the idea being that no single process occupies all the space, since there are more processes running at the same time.
Are you talking about SAS WORK space here? If so, and your jobs can legitimately take up 500 GB, then you should simply size the WORK space on your SAS server to ensure you don't run out.
If, on the other hand, only badly-behaved jobs are causing this problem, then you could consider a WORK disk quota per user. This is done with OS settings, so how you do it depends on which OS your SAS server runs on. Note that disk quotas will cause any SAS job going over the limit to fail with out-of-disk-space errors.
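If OS quotas aren't an option, a cron-driven watchdog outside SAS can approximate a cap. A minimal sketch, assuming a Linux/UNIX server, the default SAS_work* directory naming, and a hypothetical /saswork/SAS WORK root (both are assumptions; adjust to your site). What to do with an over-limit directory, such as killing the owning SAS session, is deliberately left to the site:

```shell
#!/bin/sh
# Hypothetical watchdog sketch: flag SAS WORK directories over a cap.
# WORK_ROOT and CAP_GB are assumptions -- set them to your site's
# -WORK path and policy. Acting on "OVER" dirs (e.g. killing the
# owning SAS process) is left to the administrator.

WORK_ROOT="${WORK_ROOT:-/saswork/SAS}"   # parent of the SAS_work* dirs
CAP_GB="${CAP_GB:-500}"                  # per-directory cap, in GB

# Print the size of one directory in whole GB (du -sk is portable;
# 1 GB = 1048576 KB).
dir_gb() {
    du -sk "$1" | awk '{ printf "%d\n", $1 / 1048576 }'
}

# Report OVER/OK for every SAS_work* directory under WORK_ROOT.
check_work() {
    for d in "$WORK_ROOT"/SAS_work*; do
        [ -d "$d" ] || continue
        gb=$(dir_gb "$d")
        if [ "$gb" -ge "$CAP_GB" ]; then
            echo "OVER $gb GB: $d"   # candidate for kill/cleanup
        else
            echo "OK $gb GB: $d"
        fi
    done
}

check_work
```

Run it from cron every few minutes; a quota remains the more robust option because it is enforced by the OS at write time rather than sampled after the fact.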
I've raised an issue here: https://github.com/Boemska/worktop/issues/6
I'll let you know / update this thread once it's implemented. Or someone else can have a go and send a PR 🙂
Nik
Go to the path and execute this command manually:
./cleanwork /saswork/SAS/
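For context: cleanwork removes leftover SAS_work* directories whose owning SAS session is no longer running. If you'd rather review candidates before deleting anything, a cautious pre-check is sketched below; it assumes Linux/UNIX `find` semantics, the default SAS_work* naming, and uses age as a rough heuristic for orphaned directories (an assumption, not how cleanwork itself decides):

```shell
#!/bin/sh
# Sketch: list WORK directories under a given root that have not
# been touched for more than a day -- likely orphans from crashed
# sessions. Review the output before running cleanwork.
list_stale_work() {
    find "$1" -maxdepth 1 -type d -name 'SAS_work*' -mtime +1
}

# Example (path is the one from the command above):
# list_stale_work /saswork/SAS
```

Note that age alone can flag a long-running but healthy job, so treat this as a review list, not a delete list.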