Fluorite | Level 6


Kindly guide me on the following?

We want to generate certain KPIs consisting of 67 columns (which requires combining 14 different tables and 34 calculated columns from the database). We also want to monitor performance with respect to generating the report.

Kindly help: which approach is better?

Creating the calculated columns in Enterprise Guide, or creating the calculated columns in Visual Analytics?

I understand that creating these calculated columns will be a one-time activity. But with respect to performance, which would be better?




Obsidian | Level 7

If I were you, I would join those tables first to get the required columns for reporting. Creating calculated items in a denormalized wide table is often better than creating them in the VA report, because they will be readily available in the dataset for multiple reports and you won't have to recreate them (recreating them increases the chances of errors or inconsistency). So you could join the tables in EG first and then use the final table in the VA report designer.
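A minimal sketch of that approach in EG, assuming hypothetical library, table, and column names (the actual 14 source tables and KPI columns would replace these):

```sas
/* Sketch only: db.fact_sales, db.dim_product, db.dim_customer and
   all column names are hypothetical placeholders. */
proc sql;
   create table work.kpi_base as
   select f.order_id,
          f.order_dt,
          p.product_cat,
          c.region,
          /* calculated columns built once in the dataset */
          f.revenue - f.cost as gross_margin,
          calculated gross_margin / f.revenue as margin_pct
   from db.fact_sales   as f
        inner join db.dim_product  as p on f.product_id  = p.product_id
        inner join db.dim_customer as c on f.customer_id = c.customer_id;
quit;
```

The resulting denormalized table (work.kpi_base here) would then be loaded for VA and registered as the data source in the report designer.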


As for performance, if you create your calculated columns in VA, then depending on the size of the dataset it shouldn't be a problem, because the data resides in memory and the calculations are fast enough that you won't notice them ..



Fluorite | Level 6

Hi Ali_jooan,

Thanks for your reply. There are some calculated columns that need to be created beforehand, and the users then create new calculated outputs for their KPIs based on combinations of certain columns and the previously created calculated columns.

Kindly guide me on the following:

Should I create all my calculated columns in VA?

Will all the calculated columns we create be available to all reports in VA?

Lastly, would there be a drastic difference in performance between creating the calculated columns in VA and creating them in Enterprise Guide?


Currently we are running a POC to identify best practice. Our final reporting data requires creating lots of calculated columns; some reports are static and some are dynamic in nature.




Fluorite | Level 6

Our current dataset is around 2M rows and 80 columns, with an incremental data change of 15% monthly. Kindly guide.

Obsidian | Level 7

Ok ..


So, as you said, some calculated items are created in the final dataset and others are created by the VA report developer. I would suggest that whatever calculated variables are not report-specific and can potentially be utilized by other reports, create them in the dataset. And obviously the calculated items that are report-specific are created by the developer ..
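For example, a shared, non-report-specific variable could be added to the dataset in EG so every report picks it up, while report-specific items stay in VA. A sketch with hypothetical table and column names:

```sas
/* Sketch only: work.kpi_data and all columns are hypothetical.
   Variables computed here are available to every VA report that
   uses this table; report-specific items stay in VA. */
data work.kpi_final;
   set work.kpi_data;
   /* reusable across reports */
   if revenue > 0 then margin_pct = gross_margin / revenue;
   length fiscal_qtr $6;
   fiscal_qtr = cats('Q', qtr(order_dt), year(order_dt));
run;
```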


So calculated items created in VA for one report will not be available in other reports, and you will have to recreate them.


If you join the 13 tables and 60 columns in EG, an efficient query should get you the results in a reasonable time.


2 million rows and 80 columns is not going to create a performance issue from a VA perspective (I'm completely unaware of your hardware sizing and available resources; it could be an issue if resources are already utilized to their capacity ..) ..




Fluorite | Level 6
Hi Ali_Joaan,
Will do as suggested and will update here.
Thanks again for your help. 🙂




Discussion stats
  • 5 replies
  • 2 in conversation