@ijm_wf:
1. When you're writing something of this nature, you need to account for:
(a) the possibility that your program cannot open one of the data sets because it is currently locked for an update - and make sure that the checking routine itself isn't aborted when that happens (which, most likely, it otherwise would be)
(b) that while your program is waiting, it doesn't gobble up CPU time needlessly - which rules out a tight infinite DO loop and means you need the SLEEP function, making your session hibernate harmlessly for a preset number of seconds
(c) the possibility that the updates finally making all the dates equal and non-missing could take an inordinately long time - and so you should set some reasonable overall waiting limit
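In skeleton form, (a) through (c) combine into a single polling loop (the names below are illustrative, and the condition in angle brackets stands for whatever check you need):

do n_tries = 1 to wait_limit / naptime ;      * (c) overall waiting limit ;
  if <all data sets open and dates OK> then leave ;  * (a) checked via OPEN, which returns 0 instead of aborting ;
  slept = sleep (naptime, 1) ;                * (b) hibernate for NAPTIME seconds ;
end ;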
2. I don't see any reason to think of the task in terms of a macro when all this can be done much more easily and cleanly in the DATA step. For example:
data file1 file2 file3 ;
run_date = 1 ; output file1 ;
run_date = 1 ; output file2 ;
run_date = 1 ; output file3 ;
run ;
data _null_ ;
  retain date_var "run_date" seconds_slept 10 wait_limit 30 ;
  array ds $ 41 ds1-ds3 ("file1" "file2" "file3") ;
  array dt dt1-dt3 ;
  do N_tries = 1 to divide (wait_limit, seconds_slept) ;
    do over ds ;
      id = open (cats (ds, "(rename=", date_var, "=", vname (dt), ")")) ;
      if id = 0 then leave ;   * data set locked - stop checking this round ;
      call set (id) ;
      rc = fetchobs (id, 1) ;  * copies the renamed date into DT ;
      rc = close (id) ;        * release the data set so the updater can lock it ;
    end ;
    if id ne 0 and nmiss (of dt:) = 0 and mean (of dt:) = dt1 then do ;
      put "All " date_var "values are equal and non-missing." ;
      stop ;
    end ;
    rc = sleep (seconds_slept) ;
  end ;
  put "Wait limit " wait_limit "seconds is exceeded." ;
run ;
You can add more bells and whistles to it, such as logic to write a message in the log when a data set cannot be opened, or periodically reporting N_TRIES to the log, so that the process can be monitored. Note that I set WAIT_LIMIT to 30 seconds (i.e. limiting N_TRIES to 3) for the sake of testing, but you'll obviously need to set it to a much bigger, albeit reasonable, number.
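For instance (just a sketch - the exact wording of the messages is up to you), the inner loop could report the reason OPEN failed via the SYSMSG function and note each attempt in the log:

do over ds ;
  id = open (cats (ds, "(rename=", date_var, "=", vname (dt), ")")) ;
  if id = 0 then do ;
    msg = sysmsg () ;           * reason OPEN failed, e.g. a lock ;
    put "NOTE: " ds ": " msg ;
    leave ;
  end ;
  call set (id) ;
  rc = fetchobs (id, 1) ;
  rc = close (id) ;
end ;
put "NOTE: attempt " N_tries "done; sleeping " seconds_slept "seconds." ;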
Kind regards
Paul D.