Obsidian | Level 7

Have you ever wondered who is hogging all the storage? Below are a few quick scripts you can consolidate into a .sh file that will expose data hoarders and help you keep your environment nice and tidy!


#!/bin/bash

echo -e "------------------------------------\nStorage Snapshot on `date`\n"

sudo df -h

echo -e "------------------------------------\nTop Storage Usage by SAMBA Share - /mnt/storage\n"

sudo du -hs /mnt/storage/* | sort -hr | head

echo -e "------------------------------------\n10 Largest Files on SAMBA Share w/ Last Mod Date - /mnt/storage\n"

sudo find /mnt/storage -type f -exec du -Sh --time {} + | sort -rh | head

echo -e "------------------------------------\nTop Storage Usage for SASHOME - /home\n"

sudo du -hs /home/* | sort -hr | head

echo -e "------------------------------------\n10 Largest Files in SASHOME w/ Last Mod Date - /home\n"

sudo find /home -type f -exec du -Sh --time {} + | sort -rh | head

echo -e "------------------------------------\nSAMBA Usage Report\n"

sudo smbstatus -b
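Since the same separator-plus-title echo is repeated for every section, the script could also be tightened with a small helper function. A minimal sketch (the `section` name is my own, not from the original post):

```shell
#!/bin/bash
# Hypothetical helper: print the banner once per report section
# instead of repeating the echo -e separator line by hand.
section() {
    printf -- '------------------------------------\n%s\n\n' "$1"
}

# First section rewritten with the helper:
section "Storage Snapshot on $(date)"
df -h
```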



Email output:

~$ ./ > ~/storage.txt

~$ mail -s "SAS Storage Snapshot for SASApp on `date`" sasadmins < /home/userid/storage.txt


* sasadmins is an alias set in /etc/aliases


~$ sudo vi /etc/aliases

            add the following:

                     sasadmins:          email1@address, email2@addres...

~$ sudo newaliases      (this updates the aliases database so you can use the new alias)
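Putting the pieces together, the whole thing can be scheduled with cron. A sketch of a crontab entry (the script name `storage_snapshot.sh` and the Monday 06:00 schedule are my own placeholders):

```shell
# Hypothetical crontab entry (edit with: crontab -e).
# Runs the snapshot script every Monday at 06:00 and mails the
# report to the sasadmins alias set up above:
#
#   0 6 * * 1  /home/userid/storage_snapshot.sh > /home/userid/storage.txt && mail -s "SAS Storage Snapshot on $(date)" sasadmins < /home/userid/storage.txt
```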

Lapis Lazuli | Level 10

This is very useful, thanks for sharing.


Other selection criteria we could propose for saving storage space and reporting to the (profligate) user:


1. identifying (candidates for) duplicates in SAS libraries and the corresponding storage space


2. identifying SAS tables which are not compressed etc.


3. identifying data not accessed for some given time (6 months, 12 months rolling, etc.)
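For the third criterion, a minimal shell sketch along the lines of the scripts above (the `cold_files` helper name is mine; it relies on access times, so it won't work on filesystems mounted with noatime, and it should run before any script reads the tables):

```shell
# Hypothetical helper: list files under directory $1 that have not
# been accessed in more than $2 days, largest first.
cold_files() {
    find "$1" -type f -atime +"$2" -exec du -h {} + 2>/dev/null | sort -rh | head
}

# Example: SAS datasets on the share untouched for roughly 6 months
# cold_files /mnt/storage 180
```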


Logically, the third criterion (aka "cold data") must be applied before 1) and 2), since requests (like SAS scripts) that read a table's descriptor portion update the underlying file's access timestamp, which erases the previous access-time information. I'm the dummy who realised the logical mistake afterwards, so I know it firsthand.



Obsidian | Level 7

I agree. I do a backup to an AWS EFS share to archive user data that is no longer needed immediately. Users can always pull the data from the archived backup if they need it in the future. 
