titan31
Quartz | Level 8

Hi,

 

We have a very long-running table loader (it usually takes around an hour, and it seems to be taking longer today).

 

This is yesterday's log for it:

 

NOTE: There were 10553628 observations read from the data set WORK.W37RG31.
NOTE: 10553628 observations added.
NOTE: The data set CMCLMMRT.CLAIM_ITEM has 10553628 observations and 133 variables.
NOTE: PROCEDURE APPEND used (Total process time):
real time 59:27.16
user cpu time 11:58.76
system cpu time 15:48.31
memory 13482411.56k
OS Memory 13501984.00k
Timestamp 22/02/2018 06:02:43 a.m.

 

 

Is that ratio of real time to user CPU time fine, and is that acceptable performance for writing 10m rows and 133 columns, or should this be running much quicker? If so, what should I be checking to make it faster? I'm fairly new to a SAS administration role, so I'd like to boost performance where I can.

 

It's reading from a SAS temporary view rather than the external database at this point of the process; the data was already read in from the external database earlier in the flow.
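
In case it helps, the step itself is essentially a PROC APPEND from that view into the permanent table, something like this (the BASE and DATA names are taken from the log; anything else is my assumption about what the generated code looks like):

/* Append the staged rows (a WORK view built earlier in the flow) */
/* to the permanent target table that VA autoloads from.          */
proc append base=CMCLMMRT.CLAIM_ITEM   /* permanent SAS target table  */
            data=WORK.W37RG31;         /* temporary SAS staging view  */
run;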

 

Today's run has finally finished, and this is what I got:

 

NOTE: There were 10564414 observations read from the data set WORK.WRKDLBX.
NOTE: 10564414 observations added.
NOTE: The data set CMCLMMRT.CLAIM_ITEM has 10564414 observations and 133 variables.
NOTE: PROCEDURE APPEND used (Total process time):
real time 1:51:50.98
user cpu time 12:36.75
system cpu time 27:33.04
memory 13480786.12k
OS Memory 13505000.00k
Timestamp 23/02/2018 11:46:38 a.m.
Step Count 85 Switch Count 1

2 REPLIES
LinusH
Tourmaline | Level 20

Optimization is a huge subject, so with limited information it's hard to tell exactly what's going on and what to do in this case.

So the source data is in an external database? Then the first view hitting the database should comply with the restrictions of implicit pass-through.
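
One quick way to check what SAS pushes down to the database is to turn on SAS/ACCESS tracing before the step that reads through that first view, for example:

/* Write the SQL that SAS/ACCESS passes to the DBMS into the SAS log */
options sastrace=',,,d' sastraceloc=saslog nostsuffix;

Then look in the log for the statements actually sent to Oracle; any processing that comes back into SAS instead is a candidate for the bottleneck.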

That view, and all other views in the job, should be temporarily switched to physical tables while you do your performance testing, so you can find the bottleneck.
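
For example, to test one view in isolation you can materialize it into a WORK table and time each part on its own (the view name here is the one from your first log; otherwise this is just a sketch):

/* 1. Materialize the view so the read from it is timed separately */
data work.w37rg31_tbl;
   set work.w37rg31;            /* the view feeding the append */
run;

/* 2. Then append from the physical table instead of the view */
proc append base=CMCLMMRT.CLAIM_ITEM
            data=work.w37rg31_tbl;
run;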

Real vs. CPU time: comparing these assumes the target table is in SAS (if it's in an external database you can't compare the measures, since SAS has no insight into the CPU used inside the external database).

Indexes on the target table can make the append slower. If that's the case, look into using SPDE, since it's much more efficient at index updates.
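
For example (the libref and paths below are placeholders):

/* Check whether the target table has indexes or constraints that
   must be maintained row by row during the append */
proc contents data=CMCLMMRT.CLAIM_ITEM;
run;

/* If it does, an SPD Engine library is one option: SPDE handles index
   maintenance more efficiently and can spread data and index files
   across several paths */
libname claimspd spde '/sasdata/claimspd'
        datapath=('/sasdata/claimspd/data1' '/sasdata/claimspd/data2')
        indexpath=('/sasdata/claimspd/index');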

Data never sleeps
titan31
Quartz | Level 8

Hi,

 

The original source is an external DB (Oracle), but we've converted it into physical SAS tables through our data warehousing jobs. We seem to hit the bottleneck when we create a datamart from it for use in VA (that step joins a number of physical SAS tables together). The target table is in SAS too, as we're using it for autoloading into VA.

