The reason I'm trying to get millisecond precision is in fact to be able to properly account for all elapsed time. When I added up all the "real time x.xx seconds" values in the log, I obtained ~1.5 hours, whereas the overall real time of the program run, as written by SAS to the log at the end of my batch run, was ~2 hours. This is a significant divergence. However, just adding 0.005 seconds to the real time of each of the (as mentioned) ~300,000 steps would bring the sum close to 2 hours. So, if the logged values are arrived at by rounding, it's fairly safe to attribute the time gap to periods not covered by the times SAS measures; but if they are arrived at by truncating, it's quite plausible that there are no such unmeasured periods. I hope this clarifies the rationale behind my questions.
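To make the arithmetic behind this concrete, here is a small back-of-envelope sketch (in Python, purely for illustration; the step count and gap are the approximate figures from the post, not exact measurements):

```python
# Assumption: ~300,000 steps, each with "real time" logged to 0.01 s,
# and an observed gap of ~30 minutes between the summed step times
# (~1.5 h) and the overall run time (~2 h).
n_steps = 300_000
gap_minutes = 30.0

# Case 1: SAS rounds to the nearest 0.01 s. Each logged value is off
# by at most +/-0.005 s and the errors largely cancel, so the expected
# bias of the sum is near zero; the gap would then be real, unmeasured time.

# Case 2: SAS truncates to 0.01 s. Each logged value underestimates by
# 0 to 0.01 s, about 0.005 s on average, so the sum is systematically
# low by roughly:
truncation_bias_minutes = n_steps * 0.005 / 60
print(truncation_bias_minutes)  # → 25.0
```

Under truncation, the ~25 minutes of systematic undercount would account for most of the ~30-minute gap, which is why knowing whether SAS rounds or truncates matters here.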