Hi Team,
I'm not able to load 35 crore (350 million) observations into SQL in one shot, so I'm splitting the data into 4 parts and loading them separately. Is there any way to load this huge number of observations (35 crore) into SQL in one shot?
I'm using the code below to load the data:
libname sql_test odbc dsn=xxx_test user="xx" pass="xx" schema=xx;
proc sql;
connect to odbc (dsn=xxx_test user=xx pwd=xx);
execute(truncate table TeradataDemand) by odbc;
insert into sql_test.TeradataDemand select * from First_ETL;
disconnect from odbc;
quit;
Have you tried the options here? BULKLOAD and:
https://support.sas.com/resources/papers/proceedings11/103-2011.pdf
FYI - I'm moving your question to the data management forum.
Sorry, my mistake. You're using SQL pass-through, which means you're really working on a Teradata problem, not a SAS problem. You may want to consider posting this in a Teradata forum instead. SAS is literally just passing the command to Teradata here; you need the appropriate Teradata options to speed this up.
I suggest you try out the FASTLOAD and MULTILOAD Teradata loading options documented here: https://documentation.sas.com/?docsetId=acreldb&docsetTarget=n0ht8i7t92tocpn18vn0krftm85m.htm&docset...
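If the data really is coming from a SAS table, a load using the native Teradata engine might look something like this. This is only a sketch: it assumes SAS/ACCESS Interface to Teradata is licensed, and the server name, credentials, and database are placeholders.

```sas
/* Sketch only - assumes SAS/ACCESS Interface to Teradata is licensed.
   Server, credentials, and database names are placeholders.
   FASTLOAD requires the target table to be empty, which fits the
   truncate-then-load pattern used above. */
libname td teradata server=tdserver user=xx password=xx database=xx;

proc append base=td.TeradataDemand (fastload=yes)
            data=work.First_ETL;
run;
```

If the target table already contains rows, MULTILOAD=YES is the option to look at instead of FASTLOAD=YES.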
@SASKiwi do those apply when it's a solely DB operation? It makes sense that they would apply when loading SAS data to Teradata, but that doesn't appear to be the case here: the code is entirely a Teradata command, unless I'm missing something, which is entirely possible.
@Reeza - Yes, I agree. The way the EXECUTE is structured here it is purely a DB operation. In fact the code does not appear to include any step uploading SAS data into Teradata. @Sharath_naik needs to explain where the data is coming from because he is just referencing a Teradata temporary table and that is it. So the way the program is written it is purely a Teradata issue for which he should get the answers from Teradata forums.
The link I supplied may be helpful if the data is originally from SAS.
@Sharath_naik - Thanks for clarifying your load process. So is your ODBC LIBNAME pointing to an Oracle database then? If so then the BULKLOAD and DBCOMMIT options are worth trying: https://documentation.sas.com/?docsetId=acreldb&docsetTarget=n0qddeciyqg3qfn1tosrb42y47os.htm&docset...
I suggest you start by trying DBCOMMIT=1,000, then 10,000, 20,000 and so on. Experiment on a subset of your data to speed up the testing - 350 million rows is an awful lot (now I know a crore is 10 million).
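One way to run that experiment is to carve off a subset and time the load under different DBCOMMIT values. A rough sketch, with placeholder DSN and credentials, and a hypothetical subset size of 1 million rows:

```sas
/* Hypothetical test harness: time the load of a 1-million-row subset
   under different DBCOMMIT settings. DSN and credentials are
   placeholders. */
options fullstimer;  /* report real and CPU time for each step */

data work.subset;
  set work.First_ETL (obs=1000000);
run;

libname sql_test odbc dsn=xxx_test user="xx" password="xx" schema=xx
        dbcommit=10000;  /* rerun with 1000, 10000, 20000, ... */

proc append base=sql_test.TeradataDemand data=work.subset;
run;
```

Compare the real time reported in the log for each DBCOMMIT setting, then apply the best value to the full load.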
I believe the code @Sharath_naik posted is real.
libname sql_test odbc dsn=xxx_test user="xx" pass="xx" schema=xx;
proc sql;
connect to odbc (dsn=xxx_test user=xx pwd=xx);
execute(truncate table TeradataDemand) by odbc;
insert into sql_test.TeradataDemand select * from First_ETL;
disconnect from odbc;
quit;
The explicit pass-through SQL execute() bit only truncates the table. The INSERT bit is implicit pass-through SQL, and it appears the source table is a SAS WORK table.
If the above is correct then ideally you'd be using SAS/ACCESS to Teradata and not to ODBC (if licensed). The native Teradata access engine would allow you to use features like FASTLOAD.
I'm not sure what's available with ODBC when loading into Teradata. You will have to investigate, starting here. It's certainly worth setting the INSERTBUFF and DBCOMMIT options to something other than the default (I normally set DBCOMMIT to 0 so you only commit once at the end - an all-or-nothing approach). If it were me, I'd also use PROC APPEND rather than a SQL INSERT for loading a SAS table into a database.
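Putting those suggestions together, the load might look like the sketch below. The connection details are placeholders from the original post, and the INSERTBUFF value is just a starting point to tune:

```sas
/* Sketch of the approach described above; DSN and credentials are
   placeholders. INSERTBUFF batches multiple rows per insert;
   DBCOMMIT=0 commits only once at the end (all-or-nothing). */
libname sql_test odbc dsn=xxx_test user="xx" password="xx" schema=xx
        insertbuff=32767 dbcommit=0;

proc append base=sql_test.TeradataDemand data=work.First_ETL;
run;
```

Note that with DBCOMMIT=0 a failure part-way through rolls the whole load back, which avoids being left with a half-loaded table.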