Hello,
Part of a program I am working on takes an excessive amount of time to complete. CPU time is slightly over seven minutes, but real time has been over two hours when run against the full data set of approximately two million records. The program will be scheduled to run daily once complete, so any efficiency we can gain will be worth it in the long run. I am working in SAS EG 7.13 via Citrix, and there are some known performance problems with our SAS GRID 9.4 at the moment. Still, I am hoping there may be a way to better optimize the code below and reduce at least some of the processing time:
/* measure elapsed time and determine sla performance */
data work.out;
    set work.in;
    format SLA_Performance $Char6. Reporting_Date Extract_Date date9.;

    /* determine reporting_date */
    if disposition in ('Completed','Canceled') then do;
        reporting_date = datepart(Last_Modified_DT);
    end;
    else do;
        reporting_date = today();
    end;

    /* measure elapsed time */
    real_time = intck('days', DatePart(Submitted_Date), reporting_date);
    bank_time = intck('bankingdays', DatePart(Submitted_Date), reporting_date);

    /* determine SLA Performance */
    if SLA_Date = . then do;
        SLA_Performance = 'NA';
    end;
    else if SLA_Date >= reporting_date then do;
        SLA_Performance = 'Within';
    end;
    else do;
        SLA_Performance = 'Over';
    end;

    /* record extract date */
    Extract_Date = today();
run;
One thing to look at:
if SLA_Date = . then do;
    SLA_Performance = 'NA';
end;
else if SLA_Date >= reporting_date then do;
    SLA_Performance = 'Within';
end;
else do;
    SLA_Performance = 'Over';
end;
Which of those first two cases occurs most frequently? If you set it as the first condition, the whole block might run faster.
For example, if SLA_date is missing on only 1 percent of the records and SLA_date >= reporting_date on 80% of them, reversing the order of the comparisons would greatly reduce the number of times SLA_date gets compared to missing.
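A minimal sketch of that reordering, assuming 'Within' turns out to be the most common outcome (worth confirming with a quick PROC FREQ first) and that reporting_date is never missing, since with both values missing SLA_Date >= reporting_date would evaluate as true:

    /* sketch: test the presumed most frequent case first; the ordering is an
       assumption to verify against your data, not a measured result */
    if SLA_Date >= reporting_date then SLA_Performance = 'Within'; /* common case, one test */
    else if SLA_Date = . then SLA_Performance = 'NA';              /* rare case moved later */
    else SLA_Performance = 'Over';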
If you move the line Extract_Date = today(); to before the first IF, then you could use
reporting_date = Extract_Date; instead of incurring the overhead of calling TODAY() twice.
Similarly with calling DatePart(Submitted_Date) twice: it might run faster to create a temporary variable by calling DATEPART once and using that value in both INTCK calls.
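Putting those two suggestions together, a minimal sketch of the revised step; submit_dt is a scratch name introduced here for illustration:

    data work.out;
        set work.in;
        format SLA_Performance $Char6. Reporting_Date Extract_Date date9.;
        Extract_Date = today();                /* call TODAY() once per record */
        submit_dt = datepart(Submitted_Date);  /* call DATEPART() once; illustrative name */
        if disposition in ('Completed','Canceled') then
            reporting_date = datepart(Last_Modified_DT);
        else reporting_date = Extract_Date;    /* reuse instead of a second TODAY() */
        /* 'days' and 'bankingdays' kept as in your code;
           'bankingdays' is presumably a custom interval at your site */
        real_time = intck('days', submit_dt, reporting_date);
        bank_time = intck('bankingdays', submit_dt, reporting_date);
        if SLA_Date = . then SLA_Performance = 'NA';
        else if SLA_Date >= reporting_date then SLA_Performance = 'Within';
        else SLA_Performance = 'Over';
        drop submit_dt;
    run;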
Another consideration is whether you need the datetime values at all. If they are not used by some other process where the time component is critical, perhaps consider replacing them with date values so you need not call the DATEPART function at all.
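If that is feasible, a hedged sketch of converting once upstream (the _D variable names are made up for illustration):

    data work.in_dates;
        set work.in;
        /* store plain dates once so downstream steps skip DATEPART entirely */
        Submitted_D     = datepart(Submitted_Date);
        Last_Modified_D = datepart(Last_Modified_DT);
        format Submitted_D Last_Modified_D date9.;
        drop Submitted_Date Last_Modified_DT; /* only if nothing else needs the times */
    run;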
The TODAY and DATEPART functions may not take much time individually, but cumulatively over two million plus records the cost might be noticeable.
Since the code in that data step isn't very complex, I suspect disk usage or network bottlenecks are more likely to be eating your time.
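One way to check: FULLSTIMER is a standard SAS system option that makes the log report memory and I/O statistics alongside real and CPU time, which helps separate code cost from disk or network waits.

    options fullstimer;
    /* re-run the data step, then compare real time vs. CPU time in the log;
       a large gap usually points at I/O or network waits rather than the code */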
I doubt it will affect processing time, but you don't need the DO and END statements. Is there some reason for them, e.g. readability? The same logic works without them, as shown below.
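    if SLA_Date = . then SLA_Performance = 'NA';
    else if SLA_Date >= reporting_date then SLA_Performance = 'Within';
    else SLA_Performance = 'Over';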