Hi all,
In the past, the following code has worked for me in moving a data set containing 14 million rows into Teradata without a hiccup.
%put *** rchtera;
%dbcon(rchtera);
%libdbcon(tera,rchtera, dsw_raroc);
proc sql;
   create table tera.LMA_1712
      (fastload=yes sessions=4 tenacity=5 sleep=5 dbcommit=100000) as
   select *
   from data.LMA_1712_1;
quit;
However, this happened when I tried to do the same with another dataset:

ERROR: Error attempting to CREATE a DBMS table.
ERROR: Teradata execute: The Maximum Possible Row Length in the Table is too Large.

So I divided my data into smaller datasets, but still no luck. Even after cutting down the number of rows and dropping columns I don't need, I still get this error, even when USCF_raw contains fewer than 600 rows. What can I do to get past this?
proc sql;
   create table tera.USCF_acclv_CO_Rcv (fastload=yes dbcommit=10000) as
   select *
   from myQRE.USCF_raw;
quit;
Thanks again for any input.
The message is about row length, so splitting the table while keeping the same row length is not helping.
Have a look at your variables to see what could be causing the issue, as @kiranv_ said. Can you shorten or remove some of them?
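Something like this should show you which variables are the widest (just a sketch, using the myQRE.USCF_raw name from your post):

/* Rough sketch, assuming the source table myQRE.USCF_raw from the post:
   list every variable's defined length and add them up to estimate row width. */
proc contents data=myQRE.USCF_raw out=work.cols(keep=name type length) noprint;
run;

proc sql;
   /* longest variables first - the likely culprits for the row-length error */
   select name, type, length
      from work.cols
      order by length desc;

   /* approximate defined width of one row in bytes; the actual bytes Teradata
      needs per row also depend on how the character columns are mapped */
   select sum(length) as total_row_bytes
      from work.cols;
quit;

If one or two character variables have very large defined lengths, trimming them down before the CREATE TABLE is usually enough to get under the limit.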
If you have Teradata SQL Assistant, check whether you have a character column that is very long. Do a DESCRIBE TABLE in SAS on the Teradata table through the libname, or do a SHOW TABLE in Teradata SQL Assistant.
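For example (only a sketch; I'm guessing the Teradata database name dsw_raroc from the %libdbcon call in the first post):

/* In SAS, DESCRIBE TABLE writes the column definitions to the log.
   Comparing the table that loaded fine with the SAS source of the failing
   load makes an unusually long character column easy to spot. */
proc sql;
   describe table tera.LMA_1712;     /* the Teradata table that loaded OK */
   describe table myQRE.USCF_raw;    /* the SAS source of the failing load */
quit;

/* In Teradata SQL Assistant (not SAS), the equivalent check would be:
   SHOW TABLE dsw_raroc.LMA_1712;
   which prints the full DDL, including the CHAR/VARCHAR lengths. */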