Sounds like a data type mismatch but you will have to provide much more information for us to be of any help.
If this is just a replication of SQL Server tables to Snowflake without any transformations, also consider whether you could run a process that doesn't pass the data through SAS but loads directly from SQL Server to Snowflake. You could still use SAS to trigger the process. Such a direct load should perform better, but you would of course need the direct connectivity between SQL Server and Snowflake established.
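A minimal sketch of triggering such an external load from SAS, assuming the SnowSQL CLI is installed on the SAS server and that a load script with the actual ingest logic already exists (both the connection name and the script path here are hypothetical):
{code}
/* Kick off an external SQL Server -> Snowflake load from SAS. */
/* Requires XCMD to be allowed in the SAS session.             */
systask command "snowsql -c my_connection -f /etl/load_sqlserver.sql"
        wait status=loadrc;
%put NOTE: SnowSQL exit status is &loadrc;
{code}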
Thanks for your response. There are other processes running in the code where we pull data from multiple files, do data cleansing, and load them into Snowflake tables. Those are not causing any issues, but in this process we are creating a key column on the original data and adding the data to the Snowflake table.
Forgot to add the code:
{code}
/* Source: SQL Server (engine name assumed to be SQLSVR) */
libname abc sqlsvr uid=xxx pwd=xxx;

/* Target: Snowflake via SAS/ACCESS Interface to Snowflake */
libname def sasiosnf insertbuff=5000 server='' uid='' pwd='';

proc sql;
create table have as
select monotonic() as key,
       *
from abc.test;
quit;

proc append base=def.want data=have force;
run;
{code}
Please post the full SAS log so we can see notes and errors. I don't see any use of bulk loading.
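For reference, here is a minimal sketch of what bulk loading can look like with SAS/ACCESS Interface to Snowflake. The BULKLOAD= and BL_ option names come from the SAS/ACCESS documentation but should be verified for your release, and the internal stage location is just an assumption:
{code}
/* Bulk load via an internal stage instead of row-by-row inserts */
libname def sasiosnf server='' uid='' pwd=''
        bulkload=yes
        bl_internal_stage="user/sas_stage"  /* assumed stage location */
        bl_compress=yes;

proc append base=def.want data=have;
run;
{code}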
error: [SAS log screenshot]
We need to see both the source code and the notes and errors together so we can see which statements are causing them.
The log details are in the attached screenshot. SAS is reading 80K records, but why is it treating them as nulls even though data is present?
Does your program work with a small amount of data? Try loading just, say, 1,000 rows. If that works, try scaling up to 10K, 50K, 100K.
Also, please don't screenshot your SAS logs. Just do a normal copy and paste using the </> menu option.
Another source of information could be the log in Snowsight.
What do you see there?
Do NOT use monotonic()! This is an undocumented and unsupported function and it will not necessarily return the expected result especially when used with a database table as source.
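If you need the key on the SAS side, a plain DATA step is a safer way to build it, since the automatic variable _N_ increments once per observation read. A minimal sketch using the librefs from the posted code:
{code}
/* Reliable sequential key without monotonic() */
data have;
  set abc.test;
  key = _n_;  /* row number of the observation being read */
run;
{code}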
For me, best practice is not to use FORCE.
You should have control of your ETL process, and explicitly tell what data should go where. Don't allow any warnings in your log for production pipelines.
Second, instead of monotonic(), consider using Snowflake's AUTOINCREMENT.
https://docs.snowflake.com/en/sql-reference/sql/create-table#syntax
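For illustration, a sketch of creating the target table with an auto-incrementing key through explicit pass-through; CONNECT USING reuses the existing Snowflake libref, and the column list is purely hypothetical:
{code}
proc sql;
  connect using def as snow;
  execute(
    create table want (
      key  integer autoincrement,  /* Snowflake assigns the values */
      col1 varchar(100),           /* hypothetical columns         */
      col2 number
    )
  ) by snow;
  disconnect from snow;
quit;
{code}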
You still haven't shared the complete log with your program (including the LIBNAME statements and options).
Just curious, are you using SAS/ACCESS to Snowflake or perhaps Snowflake's ODBC driver?
Agree with the earlier suggestion to try pushing just 5 records (or even 1 record) to Snowflake. Are you able to reliably query data from Snowflake into SAS?
What happens if you try to create a new table in Snowflake, instead of appending to an existing table? E.g.
{code}
proc append base=def.NewSnowflakeTable data=AFFIL_1 (obs=5);
run;
{code}
The ? in the log are just parameter placeholders in the prepared INSERT statement that SAS sends to the database; they don't mean the values are null, so I wouldn't worry about them.
Googling the error message for 400 errors from Snowflake turns up plenty of hits, e.g.
Thank you for all your help. We analyzed almost 59 million records and found some special characters coming at the end of the data; when we tried to insert them into one of the columns, they caused the error. We suppressed those characters and the data is now loading into the table.
But loading the 59 million rows is taking a lot of time: about 20 minutes for every 1 million rows. We have only 55 columns.
Sounds like you have some truncation going on. Perhaps you are transcoding the data from a single-byte encoding to a multi-byte encoding, so the length of the target variables is too short and some multi-byte characters are being truncated, causing them to be invalid.
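If those stray trailing characters ever need to be cleaned on the SAS side before loading, one option is the COMPRESS function with the 'kw' modifiers, which keeps only printable (writable) characters; the variable name here is hypothetical:
{code}
data clean;
  set have;
  /* 'k' = keep, 'w' = writable (printable) characters */
  textcol = compress(textcol, , 'kw');
run;
{code}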
For improved transfer speed, use bulk loading. If your driver does not support it, then roll your own by exporting to a delimited file and then using the Snowflake COPY INTO command to bulk load the file.
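A rough sketch of that roll-your-own route; the stage, file paths, and file format are all assumptions, and the PUT step is shown as a comment because staging a local file is a client-side SnowSQL command rather than something run through the libname engine:
{code}
/* 1. Export the SAS table to a delimited file */
proc export data=have outfile='/tmp/have.csv' dbms=csv replace;
run;

/* 2. Stage the file (run in SnowSQL, not SAS):
      PUT file:///tmp/have.csv @~/sas_stage AUTO_COMPRESS=TRUE;  */

/* 3. Bulk load it with COPY INTO via pass-through */
proc sql;
  connect using def as snow;
  execute(
    copy into want
    from @~/sas_stage/have.csv.gz
    file_format = (type = csv skip_header = 1)
  ) by snow;
  disconnect from snow;
quit;
{code}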