03-27-2015 10:31 AM
Hello everyone. I have an ETL built in Base SAS which runs by looking in a specific folder for a given file.
Currently the Base SAS code is scheduled using the Windows Task Scheduler on a server. It just checks the directory using Base SAS code similar to the below.
filename _dir_ "%bquote(&yourdirectorypath.)";
data filelist;
  length filename $300;
  handle = dopen('_dir_');                     /* open the directory via the fileref */
  if handle > 0 then do i = 1 to dnum(handle); /* dnum = number of members */
    filename = dread(handle, i);               /* read each member name */
    output;
  end;
  if handle > 0 then rc = dclose(handle);      /* always close the directory */
run;
filename _dir_ clear;
However, I now have the task of basically doing the same thing, but checking 160 directory paths at run time. Since there are so many paths to check, looping over the paths with the above code takes ~22 seconds. This is obviously way too long just to check that many directory paths, and I was curious whether there is a good way of speeding up such an execution, or if there is a different program that most people use to set up file listeners that can then call SAS code.
03-27-2015 11:03 AM
Maybe look at a PIPE of an OS command. It looks like all you are after are the filenames in the directories, and that could be quicker, especially if all of the directories you need to examine are subordinate to a single directory: include a switch to list the subdirectories, and switches to include the full path and name.
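A minimal sketch of that PIPE approach, assuming Windows and a hypothetical root directory `C:\watchdir` (substitute your own): `dir /b /s` recurses into subdirectories and emits one bare full path per line, which a DATA step can read directly.

```sas
/* Hypothetical root path; /s recurses, /b prints bare full paths */
filename dirlist pipe 'dir /b /s "C:\watchdir\*.*"';

data found;
  infile dirlist truncover;
  length filepath $300;
  input filepath $300.;   /* one full file path per record */
run;

filename dirlist clear;
```

Dropping `/b` would also surface the modified date/time columns, at the cost of having to parse the fuller `dir` output.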
As an added bonus, you should be able to get the modified date/time as well, if that is of interest for processing new files.
03-27-2015 12:27 PM
The FILEEXIST function (see SAS(R) 9.4 Functions and CALL Routines: Reference, Third Edition) is a more direct approach than opening a directory listing and then searching through it.
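A short sketch of the FILEEXIST route, assuming you keep the 160 expected full file paths in a hypothetical dataset `paths` with a character column `filepath`: FILEEXIST returns 1 when the file is present, so no directory has to be opened or listed.

```sas
data check;
  set paths;                      /* one expected full file path per row */
  exists = fileexist(filepath);   /* 1 if the file is there, 0 otherwise */
run;
```

This only works when you know the exact file names in advance; if the incoming names vary (timestamps, sequence numbers), the directory-listing approaches above are still needed.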