nickspencer
Obsidian | Level 7

I have a SAS dataset with around 80K records that I have to join/process with a few Teradata tables. These Teradata tables are huge, with about 40 million records in each.
Since I do not have access to create permanent tables in Teradata, I am creating a volatile table and appending this dataset to it, but the load/append is taking too long. The dataset has about 80K records and 30 columns. At night it takes about 2 hours, but when I run it during the day it never completes.
I am using the MULTISTMT=YES and TPT=NO options (FastLoad/MultiLoad do not work with volatile tables), but it is not helping at all. Please advise on what might be the right approach.
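For reference, my current setup looks roughly like this (the libref, table, and column names are placeholders, and the real connection options are omitted):

libname td teradata .... connection=global;

proc sql;
  connect using td;
  /* Volatile table with a couple of placeholder columns;
     the real table has 30 columns */
  execute (
    create multiset volatile table mytemp
      (id integer, val varchar(50))
      on commit preserve rows
  ) by td;
quit;

/* FastLoad/MultiLoad cannot target a volatile table, so the
   append falls back to multi-statement inserts */
proc append base=td.mytemp (multistmt=yes tpt=no)
            data=work.have;
run;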

Thanks,
Nick

11 Replies
nickspencer
Obsidian | Level 7
Does INSERTBUFF work with Teradata?
SASKiwi
PROC Star

Here is the Teradata-specific doc: https://documentation.sas.com/?cdcId=pgmsascdc&cdcVersion=9.4_3.5&docsetId=acreldb&docsetTarget=p0gy...

It looks like INSERTBUFF isn't available, but DBCOMMIT definitely is.

nickspencer
Obsidian | Level 7
I tried DBCOMMIT=0 but it has not helped at all; the append step still takes forever to run.
Is there any other way to speed up the append process?
SASKiwi
PROC Star

Try DBCOMMIT = 1000, 2000, 3000, etc. What method are you using? PROC APPEND?
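For example, something like this (the libref and table names are placeholders):

/* Commit every 2,000 rows instead of holding one large transaction */
proc append base=td.mytemp (multistmt=yes dbcommit=2000)
            data=work.have;
run;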

nickspencer
Obsidian | Level 7
@SASKiwi, I am using PROC APPEND.
SASKiwi
PROC Star

@nickspencer - The DBCOMMIT default for PROC APPEND is 0, so there is no point trying that value. As already stated, try 1,000, 2,000, 3,000, etc. to see if it helps.

Failing that, I suggest contacting Tech Support.

Tom
Super User

Upload the data to a real table and then copy it to the volatile table.

Then you can use FASTLOAD/TPT options.

nickspencer
Obsidian | Level 7
Tom, we do not have the ability to use the real table in production. I tried MULTISTMT=YES TPT=NO but it is not helping much. Are there any other ways to make loading to the volatile table faster?
Tom
Super User

@nickspencer wrote:
Tom, we do not have the ability to use the real table in production. I tried MULTISTMT=YES TPT=NO but it is not helping much. Are there any other ways to make loading to the volatile table faster?

Don't load it to THE real table. Load it to A real table. If you don't have a database you can write into, ask your DBA to create one, either a personal database or a team database.

Something like this:

/* Permanent Teradata library; connection options omitted */
libname &permlib teradata .... ;

/* FastLoad works against a real (permanent) table */
proc append data=&in
  base=&permlib..&permname (fastload=yes dbcreate_table_opts='no primary index');
run;

proc sql;
  connect using &permlib;
  /* Copy the permanent table into a volatile table */
  execute (
    create multiset volatile table "&out" as
      (select * from "&permdb"."&permname")
      with data
      on commit preserve rows
  ) by &permlib;
  /* Drop the permanent staging table now that it has been copied */
  execute (
    drop table "&permdb"."&permname"
  ) by &permlib;
quit;
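Note: the LIBNAME will typically also need CONNECTION=GLOBAL so that the volatile table created in the pass-through session stays visible to later steps that reference it through the same libref.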
ChrisNZ
Tourmaline | Level 20

1. Do you use the ODBC driver or the native SAS/Access product?

2. I think the maximum query length is 1MB, so depending on the key length, maybe just add the 80,000 key values to the query so that you only bring back the records you need, as in the sketch below.
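A minimal sketch of that idea, assuming the key is a numeric column named ID both in a SAS dataset WORK.KEYS and in a Teradata table MYDB.BIGTABLE, with a libref TD already assigned (all names are hypothetical):

/* Build a comma-separated list of key values.
   Caution: a single macro variable holds at most 65,534 characters,
   so 80,000 keys may need to be split across several variables
   and several queries. */
proc sql noprint;
  select id into :idlist separated by ','
  from work.keys;
quit;

/* Embed the list in the pass-through query so only matching
   rows come back from Teradata */
proc sql;
  connect using td;
  create table work.subset as
    select * from connection to td
      ( select * from mydb.bigtable where id in (&idlist) );
quit;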


Discussion stats
  • 11 replies
  • 3255 views
  • 0 likes
  • 4 in conversation