Hello @Stephanvd
Performance is a broad subject and I typically survey three things:
Often slow reports have many pages and each page has a high number of chart objects. Are all the objects truly necessary? If there are many pages (6+), can you split the report into two or more reports and create links between them?
Design performance, i.e. the amount of data manipulation done on the presentation layer.
Interactions, filters, display rules, custom calculations, hierarchies, parameters, geo maps, custom sorts and custom categories all trigger data manipulation actions, which means more time is spent. For performance it's a good idea to minimize the number of actions that need to be done to the data on the report. Are there calculations and row-level transformations that you can move to the data preparation layer? You could also try creating separate tables for control objects, such as the Drop-down box and List objects, and then Join the data on the fly in the report. If you have a long table, then for every Drop-down box the report needs to go through the whole column to get the distinct category values. Producing a separate short table for control objects takes this burden away, and even though the report now has to perform a Join, this is often faster than using a single table, especially if the table is long.
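The separate control-object table above would normally be built in your data preparation step; as a rough illustration of the idea (not SAS code — a pandas sketch with hypothetical column names), it amounts to materializing the distinct values once instead of having each Drop-down box scan the long column:

```python
# Hedged sketch: precompute a short distinct-values table for report controls.
# Column names ("region", "sales") are hypothetical, not from the original post.
import pandas as pd

# Long fact table with heavily repeated category values.
facts = pd.DataFrame({
    "region": ["North", "South", "North", "East", "South"] * 200_000,
    "sales":  range(1_000_000),
})

# Short lookup table holding only the distinct values a Drop-down box needs;
# the report then joins this back to the fact table on "region" at run time.
controls = (
    facts[["region"]]
    .drop_duplicates()
    .sort_values("region")
    .reset_index(drop=True)
)

print(len(facts), len(controls))  # 1000000 3
```

The control object now reads a 3-row table instead of scanning a million-row column; the trade-off, as noted below, is one extra join on the report side.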
Server-side performance, i.e. the structure, type and size of the source data.
Eliminate any unnecessary columns from the source data on the LASR server. Reduce character field lengths and decimal places in numeric data if possible. Avoid using compressed data tables, views and star schemas if there are performance issues. Each server also has its own limits, based mostly on the number of servers, CPUs and memory. Generally it's a very good idea to be aware of these limits, e.g. not exceeding the recommended maximum amount of data that can be loaded into SAS, and not exceeding the recommended maximum size of the largest individual data table to be loaded. The number of heavy and light users also matters.
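To make the column-trimming advice concrete, here is a hedged pandas sketch (illustrative only — the real work would happen in your SAS data preparation, and the column names are made up): dropping unused columns, shrinking numeric precision, and storing repeated strings compactly all cut the table's in-memory footprint before it is ever loaded.

```python
# Hedged sketch of trimming source data before loading: drop unused columns,
# shrink numeric precision, and store repeated strings as categories.
# Column names are illustrative, not from the original post.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "id": np.arange(100_000, dtype=np.int64),
    "amount": np.random.rand(100_000) * 1e4,   # float64 by default
    "region": ["North", "South"] * 50_000,     # plain Python strings
    "unused_note": ["n/a"] * 100_000,          # never shown in any report
})

before = df.memory_usage(deep=True).sum()

slim = (
    df.drop(columns=["unused_note"])           # eliminate unnecessary columns
      .assign(
          amount=lambda d: d["amount"].round(2).astype("float32"),
          region=lambda d: d["region"].astype("category"),
      )
)

after = slim.memory_usage(deep=True).sum()
print(after < before)  # True
```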
I would say making separate tables can be a very good thing for performance. On the negative side, it makes the overall administration a little more burdensome and increases memory consumption on the server.
Best regards,
Petri
I've improved performance by using source filters to reduce the amount of data that needs refreshing.
As I understand it, without a source filter to remove the data you don't need in your report, in-report filters have to refresh the whole dataset each time one is used, and the entire dataset is loaded into memory.
With a source filter, only the filtered data is loaded into memory and refreshed.
I produce regional reports from a national dataset, so by using a source filter to keep only our region's data I effectively remove 94% of the dataset (our region is 6% of the national picture). This means only our region's data is loaded at the outset, and only that is refreshed when in-report filters are used.
Consequently my regional reports run much faster than any of the nationally built reports where you need to filter to your region. This not only helps with performance but also encourages increased usage, as people are less frustrated with the slowness and with having to use filters to get to their region's information.
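The 6%/94% effect described above can be sketched outside of SAS too (a hedged pandas illustration with synthetic data — region names and proportions are just stand-ins for the figures in the post): the filter is applied once, up front, so only the small regional slice ever sits in memory.

```python
# Hedged sketch of a "source filter": keep only one region's rows before the
# report loads the data, instead of filtering the full table in-report.
# Data is synthetic; ~6% of rows belong to "Ours", mirroring the post.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
national = pd.DataFrame({
    "region": rng.choice(
        ["Ours"] + [f"Other{i}" for i in range(9)],
        size=100_000,
        p=[0.06] + [0.94 / 9] * 9,
    ),
    "value": rng.random(100_000),
})

# Source filter applied once, up front: only ~6% of rows stay in memory.
regional = national[national["region"] == "Ours"]

print(round(len(regional) / len(national), 2))
```

Every subsequent in-report filter then operates on the small `regional` slice rather than the full national table.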