11-01-2017 09:18 AM
Kindly guide me on the following.
We want to generate certain KPIs consisting of 67 columns (which requires combining 14 different tables and 34 calculated columns from the database). We also want to monitor performance with respect to generating the report.
Which approach is better: creating the calculated columns in Enterprise Guide, or creating them in Visual Analytics?
I understand that creating these calculated columns will be a one-time activity, but with respect to performance, which would be better?
11-02-2017 11:27 PM
If I were you, I would join those tables first to get the required columns for reporting. Creating calculated items in a denormalized wide table is often better than creating them in a VA report, because they will be readily available in the dataset for multiple reports and you won't have to recreate them (recreating them increases the chance of errors or inconsistency). So you could join the tables in EG first and then use the final table in the VA report designer.
As for performance, if you create your calculated columns in VA instead, it shouldn't be a problem: the data resides in memory, so depending on the size of the dataset, the calculations are usually too fast for any delay to be noticeable.
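As a minimal sketch of the EG approach, a PROC SQL step can join the source tables and build the shared calculated columns in one pass. The table and column names below (kpi_sales, kpi_costs, region_dim, margin_pct) are hypothetical placeholders, not from your actual schema:

```sas
/* Sketch only: all table and column names here are assumed examples. */
proc sql;
   create table work.kpi_wide as
   select s.region_id,
          r.region_name,
          s.sales_amt,
          c.cost_amt,
          /* shared calculated column, built once in EG
             instead of being recreated in each VA report */
          (s.sales_amt - c.cost_amt) / s.sales_amt as margin_pct
   from kpi_sales as s
        inner join kpi_costs  as c on s.region_id = c.region_id
        inner join region_dim as r on s.region_id = r.region_id;
quit;
```

The resulting wide table can then be loaded for VA, so every report sees the same definition of each calculated column.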
11-03-2017 02:18 AM
Thanks for your reply. Some calculated columns need to be created beforehand, and the users then create new calculated outputs for their KPIs based on combinations of base columns and the previously created calculated columns.
Should I create all my calculated columns in VA?
Will all the calculated columns that we create be available to all reports in VA?
Lastly, would there be a drastic difference in performance between creating the calculated columns in VA and creating them in Enterprise Guide?
We are currently running a POC to identify best practice. Our final reporting data requires creating many calculated columns; some reports are static and some are dynamic in nature.
11-03-2017 02:33 AM
So, as you said, some calculated items are created in the final dataset and others are created by the VA report developer. I would suggest this split: whatever calculated variables are not report-specific and could potentially be used by other reports, create them in the dataset; calculated items that are report-specific can be created by the developer.
Calculated items created in VA for one report will not be available in other reports, so you would have to recreate them there.
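For the report-specific items, the developer defines a calculated data item in the VA expression editor. As a hedged illustration (the column names are hypothetical, and the exact text syntax can vary by VA version), a margin item might look like:

```sas
/* VA calculated data item expression (text view), assumed column names */
( 'Sales Amount'n - 'Cost Amount'n ) / 'Sales Amount'n
```

Such an item lives only in the report where it is defined, which is why shared logic is better pushed down into the dataset.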
If you join the 13 tables and 60 columns in EG, an efficient query should return the results in a reasonable time.
From a VA perspective, 2 million rows and 80 columns is not going to create a performance issue (I'm completely unaware of your hardware sizing and available resources, though; it could be an issue if resources are already utilized to capacity).