samanvi
Obsidian | Level 7
Hi everyone, I need your help. We are pulling data from SQL Server and loading it into a Snowflake table. We are using PROC APPEND with the FORCE option, but during the process we get a CLI error and the job fails. We are dealing with about 60 million records. I tried the bulk load option too, but it throws an error on the amount column because it encounters character data such as "ABC".
Patrick
Opal | Level 21

Sounds like a data type mismatch but you will have to provide much more information for us to be of any help.

If this is just a replication of SQL server tables to Snowflake without any transformations then also consider if you could run a process that doesn't pass the data through SAS but directly loads from SQL Server to Snowflake. You could still use SAS to trigger the process. Such a direct load should perform better but you would of course need all the direct connectivity between SQL Server and Snowflake established.

 4 Methods to Transfer Data from SQL Server to Snowflake
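For what it's worth, here is a minimal sketch of the "SAS triggers, Snowflake loads" idea via explicit SQL pass-through. It assumes the SQL Server extract has already been landed as files in a Snowflake stage, and that you have a Snowflake libref like the def one posted later in this thread; the stage name my_stage and target table want are made up for illustration:

/* Hedged sketch: have SAS trigger a Snowflake-side bulk load via
   explicit pass-through. Assumes extract files already sit in the
   hypothetical stage @my_stage; stage and table names are made up. */
proc sql;
   connect using def;   /* reuse the Snowflake libref's connection */
   execute (
      copy into want
      from @my_stage
      file_format = (type = csv field_optionally_enclosed_by = '"')
   ) by def;
   disconnect from def;
quit;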

samanvi
Obsidian | Level 7

Thanks for your response. There are other processes running in the code where we pull data from multiple files, do data cleansing, and load them into Snowflake tables. Those are not causing any issues, but in this process we are creating a key column on the original data and adding the data to the Snowflake table.

samanvi
Obsidian | Level 7

Forgot to add the code:

libname abc sqlsvr uid=xxx pwd=xxx;

libname def sasiosnf insertbuffer=5000 server='' uid='' pwd='';

proc sql;
   create table have as
   select monotonic() as key,
          *
   from abc.test;
quit;

proc append base=def.want data=have force;
run;

SASKiwi
PROC Star

Please post the full SAS log so we can see notes and errors. I don't see any use of bulk loading.
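For reference, bulk loading is usually requested with data set options on the Snowflake target table. A hedged sketch, as I understand the SAS/ACCESS to Snowflake bulk-load options (the internal stage path is a placeholder):

/* Hedged sketch: request bulk loading on the append target.
   BULKLOAD= and BL_INTERNAL_STAGE= are SAS/ACCESS to Snowflake
   bulk-load options; the stage path here is a placeholder. */
proc append base=def.want (bulkload=yes
                           bl_internal_stage="user/sas_stage")
            data=have force;
run;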

samanvi
Obsidian | Level 7

Error: [screenshot of the SAS log showing the CLI error]

 

SASKiwi
PROC Star

We need to see both the source code and the notes and errors together so we can see which statements are causing them.

samanvi
Obsidian | Level 7

Here are the log details. SAS is reading 80k records, but why is it treating them as nulls even though data is present?

 

[three screenshots of the SAS log attached]

 

 

SASKiwi
PROC Star

Does your program work with a small amount of data? Try just loading, say, 1000 rows. If that works, try scaling up to 10K, 50K, 100K.
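For example, a minimal sketch using the table names from the code posted earlier:

/* Sketch: scale the load up gradually to find where it breaks. */
proc append base=def.want data=have (obs=1000) force;
run;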

 

Also, please don't screenshot your SAS logs. Just do a normal copy and paste using the </> menu option.

LinusH
Tourmaline | Level 20

Another source of information could be the log in Snowsight.

What do you see there?

Data never sleeps
Patrick
Opal | Level 21

Do NOT use monotonic()! This is an undocumented and unsupported function, and it will not necessarily return the expected result, especially when used with a database table as the source.
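If you need the key built on the SAS side, a documented alternative is a DATA step row counter. A minimal sketch using the table names from the code above:

/* Sketch: a documented, deterministic way to add a sequential key. */
data have;
   set abc.test;
   key = _n_;   /* _N_ increments once per observation read */
run;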

LinusH
Tourmaline | Level 20

For me, best practice is not to use FORCE.

You should have control of your ETL process and explicitly state what data should go where. Don't allow any warnings in your log in production pipelines.

Second, instead of monotonic(), consider using Snowflake's AUTOINCREMENT.

https://docs.snowflake.com/en/sql-reference/sql/create-table#syntax
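A hedged sketch of what that could look like from SAS via explicit pass-through; the column definitions are illustrative, and only the AUTOINCREMENT key matters here:

/* Sketch: let Snowflake assign the key instead of computing it in SAS.
   Table and column names are illustrative. */
proc sql;
   connect using def;
   execute (
      create or replace table want (
         key  integer autoincrement start 1 increment 1,
         col1 varchar(100)
         /* ...remaining columns... */
      )
   ) by def;
   disconnect from def;
quit;

You would then append without the key column and let Snowflake populate it.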

You still haven't shared the complete log with your program (including the libname and options).

Data never sleeps
Quentin
Super User

Just curious, are you using SAS/ACCESS to Snowflake, or perhaps Snowflake's ODBC driver?

 

Agree with the earlier suggestion to try pushing just 5 records (or even 1 record) to Snowflake. Are you able to reliably query data from Snowflake into SAS?

 

What happens if you try to create a new table in Snowflake, instead of appending to an existing table? For example:

 

proc append base=abc.NewSnowflakeTable data=AFFIL_1 (obs=5);
run;

The ? in the log don't mean the values are null, so I wouldn't worry about them.

 

Googling the error message for 400 errors from Snowflake turns up plenty of hits, e.g.

https://community.snowflake.com/s/article/Solution-400-Bad-Request-Login-Errors-using-SAML-SSO-Feder...

https://www.googlecloudcommunity.com/gc/Technical-Tips-Tricks/SnowflakeSQLException-JDBC-Driver-enco...

 

samanvi
Obsidian | Level 7

Thank you for all your help. We analyzed almost 59 million records and found some special characters at the end of the data in one of the columns we were trying to insert; that is what was causing the error. We suppressed those characters and the data is now loading into the table.

 

But loading these 59 million records is taking a lot of time: about 20 minutes for every 1 million rows, and we have only 55 columns.
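For anyone who hits the same issue: one way to suppress non-printable characters in SAS is COMPRESS with the k and w modifiers. A minimal sketch; the column name badcol is illustrative:

/* Sketch: keep only writable (printable) characters in the offending
   column before the append. The column name is illustrative. */
data have_clean;
   set have;
   badcol = compress(badcol, , 'kw');   /* k = keep, w = writable chars */
run;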

Tom
Super User

Sounds like you have some truncation going on. Perhaps you are transcoding the data from a single-byte encoding to a multi-byte encoding, so the length of the target variables is too short and some multi-byte characters are being truncated, which makes them invalid.

 

For improved transfer speed, use bulk loading. If your driver does not support it, then roll your own: export to a delimited file and use Snowflake's COPY INTO command to bulk load the file.
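A hedged sketch of the roll-your-own route. The file path, stage, and table names are placeholders, and whether PUT can be issued through your SAS/ACCESS pass-through connection depends on the driver, so treat this as an outline:

/* Sketch: export to CSV, then stage and bulk load it on the Snowflake
   side. Paths, stage, and table names are placeholders. PUT compresses
   the staged file to .gz by default. */
proc export data=have_clean
            outfile="/tmp/have_clean.csv"
            dbms=csv replace;
run;

proc sql;
   connect using def;
   execute (put file:///tmp/have_clean.csv @~/sas_load) by def;
   execute (
      copy into want
      from @~/sas_load/have_clean.csv.gz
      file_format = (type = csv skip_header = 1
                     field_optionally_enclosed_by = '"')
   ) by def;
   disconnect from def;
quit;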
