06-06-2016 02:41 AM
We are using SAS 9.3 on a Unix platform. The total space available in the home directory is 86 GB, of which 83 GB is occupied. We need to remove files from the /home directory.
After checking the file sizes, we found three users under the /home/ACC directory who use 24 GB each. On further investigation, we found that these large files are specific to particular workspaces.
Could you help me decide what communication I should send to these three users? I am not sure what they need to check. These users mostly use SAS EG and Miner.
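To see where the space is actually going before contacting anyone, a quick per-directory summary is useful. A minimal sketch, assuming /home/ACC from the post above (adjust the path for your site):

```shell
#!/bin/sh
# List the subdirectories of /home/ACC by disk usage, largest first,
# so the heaviest users are obvious at a glance.
du -sk /home/ACC/* 2>/dev/null | sort -rn | head -10
```

The same command one level deeper (e.g. `du -sk /home/ACC/user1/*`) shows which workspace directories inside a user's home are responsible.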
06-06-2016 03:21 AM
Where would you like the users to put their large files? Often, there is an answer to that which doesn't involve the home directory. The users should be told that SAS data sets do not belong in the home directory, but belong in the directory that you tell them. It wouldn't hurt to mention that the home directory is almost full and what the consequences are once it fills up.
06-06-2016 04:21 AM
Why is SAS involved in removing a file? The users are logging in to Unix first, before starting up SAS, aren't they? Within Unix they can remove a file.
06-06-2016 04:32 AM
We have asked the users about these files. According to them, the files can be removed, but they are not able to remove them from SAS EG ---> Servers ---> Libraries ---> Files.
They are getting the error message below:
The requested operation could not be performed by SAS server.
It's probably time to dig deeper into this.
If you are responsible for the server, log in as superuser and inspect the files (and directories) in question on the OS level. This will reveal ownership and access permissions, as that might be responsible for the problems in removing files.
If someone else is responsible for administration of the SAS server, work with them to identify and remove the problem files. Your personal userid will most probably not be sufficient to administer other users' files.
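To make the ownership/permission check concrete, here is a minimal inspection sketch; the path is illustrative, so substitute the real library location:

```shell
#!/bin/sh
# Inspect ownership and permissions of a suspect library directory.
# DIR is a placeholder path, not from the original thread.
DIR=/home/ACC/user1/saslib
ls -ld "$DIR"                        # the directory itself: owner, group, mode
ls -l "$DIR"/*.sas7bdat 2>/dev/null  # the data set files inside it
# Note: deleting a file requires write permission on the *directory*,
# regardless of the file's own mode bits. A read-only directory owned
# by another user would explain the failure seen in EG.
```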
06-06-2016 04:28 AM
First of all, if these are actually WORK libraries that have been filled:
- move WORK locations from $HOME into a common directory on the fastest storage you can provide, and size that location to satisfy reasonable usage (this depends on the size of the source data sets people work with)
- define a quota for users so that runaway code cannot fill the WORK location and affect other users
- run the cleanwork utility (provided by SAS) against this directory regularly (you will have to use the superuser's cron for that); at least daily, hourly is better. This takes care of WORK directories left behind by crashed workspace servers.
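As a sketch, a root crontab entry for the hourly cleanwork run could look like this; both paths are assumptions and must be replaced with your actual SAS installation and WORK locations:

```shell
# Root crontab entry (config fragment, not an executable script):
# run cleanwork at the top of every hour against the shared WORK area.
# /sas/SASFoundation/9.3/... and /saswork are illustrative paths only.
0 * * * * /sas/SASFoundation/9.3/utilities/bin/cleanwork /saswork
```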
If these are not WORK library remnants, look for query_for* or similar files; these are left behind by EG query tasks and can usually be removed safely if they are considerably old.
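A cleanup of those remnants can be sketched with find; the directory and the 30-day age threshold are assumptions, so review the listing before deleting anything:

```shell
#!/bin/sh
# List EG query remnants (query_for*) older than 30 days under /home/ACC.
# Both the path and the age cutoff are illustrative choices.
WORKDIR=/home/ACC
find "$WORKDIR" -name 'query_for*' -type f -mtime +30 -print
# Once the listing looks right, rerun with -delete appended to remove them.
```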
Notify users of the compress=yes option in case the datasets in question contain large character fields that are mostly empty. This will reduce storage use and I/O load.
Example from real life:
I manage ~150 users with an 80 GB home directory; the standard user quota in /home is 1 GB.
WORK is on a fast disk with ~140 GB; users have a 9 GB standard quota there, and cleanwork runs regularly.
Storage for session-to-session preservation of data has also been provided, with larger quotas; user data sets there are automatically cleaned after 5 days.
No backup is done on any temporary locations, but RAID 1 is in place for business continuity.