09-21-2011 04:55 AM
Is there any way to see which record is currently being processed in a data step that is inside rsubmit?
There are plenty of ways to track progress in a local data step, but I have no idea how to deal with remote data steps.
09-21-2011 09:30 AM
Depending upon how closely you want to track, you may only need:
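(The snippet this post pointed at is not shown. A minimal sketch of one lightweight approach is the SYSECHO statement, which in SAS/CONNECT sends a status message from the remote session back to the local session's status line; the session and dataset names below are placeholders, not the original poster's code.)

```sas
rsubmit task1 wait=no;
  /* SYSECHO is a global statement, so place it between steps */
  sysecho "remote: sorting work.big";
  proc sort data=work.big; by id; run;

  sysecho "remote: running data step";
  data work.result;
    set work.big;
  run;
endrsubmit;
```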
09-21-2011 02:29 PM
Also, you can pass an ALTLOG option in the SASCMD= option of your SIGNON statement:
signon task1 sascmd="/usr/local/SAS/SASFoundation/9.2/sas -altlog /mylogs/mylog.log" wait=no;
RDISPLAY is also useful, as long as you are running your program interactively.
09-21-2011 06:19 PM
You can also write messages to your log from within your DATA step, like so:
if mod(_n_, 100000) = 0 then put 'Processing Row: ' _n_;
This will write a note to the log for every 100,000 rows processed. The difficulty with remote processing (and this applies to batch SAS jobs as well) is that the log is buffered and only updated once "sufficient" lines have been written.
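(Wrapped in a complete remote step, the idea looks like this; the session and dataset names are placeholders.)

```sas
rsubmit task1 wait=no;
  data work.result;
    set work.bigtable;
    /* write a progress note every 100,000 observations */
    if mod(_n_, 100000) = 0 then put 'Processing Row: ' _n_;
    /* ... actual processing ... */
  run;
endrsubmit;
```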
09-21-2011 06:36 PM
SASKiwi makes a good point and references another topic that I had actually posted:
I have still not had an opportunity to look into using the WRITE=IMMEDIATE option of LOGPARM.
I am not currently able to test, but I know that in my current environment the following will not work:
do i=1 to 10000000;
  if mod(i,50000)=0 then putlog i=;
end;
If the WRITE=IMMEDIATE option doesn't resolve the issue, my next thought is to use CALL SYSTEM to append a note onto the end of a separate tracking file (if I even can, in case SAS holds a lock on the file while using WRITE=BUFFERED?).
do i=1 to 10**6;
  if mod(i,100000)=1 then call system('echo ' || put(i,comma12.) || ' >> /mylogs/tracker.dat');
end;
09-21-2011 11:41 PM
Here's a reference to the SAS doc covering LOGPARM as discussed by FriedEgg:
I haven't tried the WRITE=IMMEDIATE option either but would be interested in feedback from anyone who has tried it.
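(For anyone wanting to try it, WRITE=IMMEDIATE would be passed alongside ALTLOG in the remote session's invocation; untested, per the discussion above, and the paths and session name come from the earlier example.)

```sas
/* LOGPARM 'write=immediate' asks SAS to flush each log line as it is written */
signon task1
  sascmd="/usr/local/SAS/SASFoundation/9.2/sas
          -altlog /mylogs/mylog.log
          -logparm 'write=immediate'"
  wait=no;
```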