Thank you for all your help. We analyzed almost 59 million records and found some special characters at the end of the data in one of the columns we were inserting into; these were causing the error. We suppressed them and the data is now loading into the table.
But loading all 59 million records is taking a lot of time: about 20 minutes per 1 million rows, and we have only 55 columns.
Sounds like a data type mismatch, but you will have to provide much more information for us to be of any help.
If this is just a replication of SQL Server tables to Snowflake without any transformations, then also consider whether you could run a process that doesn't pass the data through SAS but loads it directly from SQL Server into Snowflake. You could still use SAS to trigger the process. Such a direct load should perform better, but you would of course need the direct connectivity between SQL Server and Snowflake established.
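As a rough sketch of the "SAS triggers, Snowflake loads" idea: assuming SAS/ACCESS to Snowflake explicit pass-through and that the SQL Server extract has already been landed in a cloud stage, SAS could issue a server-side COPY. The stage name, path, and file format below are hypothetical, not from the original post:

{code}
/* Hypothetical sketch: SAS triggers a Snowflake-side bulk load     */
/* without moving rows through SAS. Assumes the data has already    */
/* been exported from SQL Server to a cloud stage (my_ext_stage).   */
proc sql;
   connect using def;                     /* reuse the Snowflake libref */
   execute (
      copy into want
      from @my_ext_stage/test_extract/
      file_format = (type = csv field_optionally_enclosed_by = '"')
   ) by def;
   disconnect from def;
quit;
{code}

The COPY INTO statement runs entirely inside Snowflake, so throughput is limited by the warehouse, not by the SAS session.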
Thanks for your response. There are other processes running in the code where we pull data from multiple files, do data cleansing, and load them into Snowflake tables. Those are not causing any issue, but in this process we are adding a key column to the original data and appending the data to the Snowflake table.
Forgot to add the code:
{code}
libname abc sqlsvr uid=xxx pwd=xxx;
libname def sasiosnf insertbuffer=5000 server='' uid='' pwd='';

proc sql;
  create table have as
  select monotonic() as key, *
  from abc.test;
quit;

proc append base=def.want data=have force;
run;
{code}
Please post the full SAS log so we can see the notes and errors. I don't see any use of bulk loading.
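For reference, SAS/ACCESS to Snowflake supports bulk loading through LIBNAME options, which stages the data and issues a COPY instead of row-by-row inserts. Something along these lines (option names per the SAS documentation; server, credentials, and stage path here are placeholders):

{code}
/* Sketch: enable bulk loading through an internal Snowflake stage. */
/* BULKLOAD=YES makes SAS stage the data and COPY it in one shot,   */
/* which is usually far faster than buffered inserts for 59M rows.  */
libname def sasiosnf server='myaccount.snowflakecomputing.com'
        uid='xxx' pwd='xxx'
        bulkload=yes
        bl_internal_stage="user/sas_stage"
        bl_compress=yes;

proc append base=def.want data=have force;
run;
{code}

Check the SAS/ACCESS to Snowflake documentation for the exact bulk-load options available in your SAS release.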
error:
We need to see both the source code and the notes and errors together so we can see which statements are causing them.
The log details are attached. SAS is reading 80k records, but why is it treating them as nulls even though data is present?
Does your program work with a small amount of data? Try just loading say 1000 rows. If that works try scaling up to 10K, 50K, 100K.
Also, please don't screenshot your SAS logs; just copy and paste them using the </> menu option.
Another source of information could be the log in Snowsight.
What do you see there?
Do NOT use monotonic()! This is an undocumented and unsupported function and it will not necessarily return the expected result especially when used with a database table as source.
For me, best practice is not to use FORCE.
You should have control of your ETL process, and explicitly tell what data should go where. Don't allow any warnings in your log for production pipelines.
Second, instead of monotonic(), consider using Snowflake's AUTOINCREMENT.
https://docs.snowflake.com/en/sql-reference/sql/create-table#syntax
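For example, the target table could be defined in Snowflake so the key is generated server-side (table and column names below are illustrative, not from your schema):

{code}
-- Snowflake generates the surrogate key itself; SAS then appends
-- only the 55 data columns and never has to compute MONOTONIC().
CREATE TABLE want (
    key  INTEGER AUTOINCREMENT START 1 INCREMENT 1,
    col1 VARCHAR,
    col2 NUMBER
    -- ... remaining columns ...
);
{code}

This also removes a single-threaded step from the SAS side of the load.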
You still haven't shared the complete log with your program (including the LIBNAME statements and options).
Just curious, are you using SAS/ACCESS to Snowflake, or perhaps Snowflake's ODBC driver?
Agree with earlier suggestion to try pushing just 5 records (or even 1 record) to snowflake. Are you able to reliably query data from snowflake into SAS?
What happens if you try to create a new table in Snowflake, instead of append to an existing table, e.g.
proc append base=abc.NewSnowflakeTable data=AFFIL_1 (obs=5);
run;
The ? in the log don't mean the values are null, so I wouldn't worry about them.
Googling the error message for 400 errors from snowflake turns up plenty of hits, e.g.