Tom
Super User

Has anyone tried to use SAS with the https://www.snowflake.net/ cloud data warehousing service?

 

For now I was going to begin by trying to access it via ODBC driver.

Do we know if SAS is working on any integration tools for Snowflake that would allow pushing computation into the Snowflake engine?

 

1 ACCEPTED SOLUTION

Accepted Solutions
JBailey
Barite | Level 11

Hi @Tom,

 

Exciting news! SAS is developing SAS/ACCESS Interface to Snowflake; the expected release is sometime in 3Q 2019. In the meantime, use SAS/ACCESS Interface to ODBC or JDBC with the Snowflake-supplied drivers.

 

Until that release, there is no SAS/ACCESS product specifically for Snowflake. If there is an ODBC 3.x compliant Snowflake driver, then SAS/ACCESS Interface to ODBC is a good choice.

 

The following LIBNAME statement options can help increase performance:

INSERTBUFF= Number of rows per INSERT statement. Experiment with the value; bigger does not always mean better (start somewhere above 100 and be ready to experiment).

READBUFF= You definitely want to set this. Experiment with the value (again, start somewhere above 100).

DBCOMMIT= May help. I haven't tried this one.
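Taken together, these options might look like the following on an ODBC LIBNAME statement. This is a sketch only; the DSN name, credentials, and buffer sizes are placeholders that you will need to adjust for your own environment:

```sas
/* Sketch only: DSN name, credentials, and buffer sizes are placeholders. */
libname snow odbc
    datasrc="snowflake_dsn"   /* ODBC DSN pointing at Snowflake            */
    user="myuser"
    password="XXXXXXXX"
    insertbuff=1000           /* rows per INSERT; tune experimentally      */
    readbuff=1000             /* rows fetched per read; tune experimentally */
    dbcommit=10000;           /* commit interval in rows; may help         */
```

Once the libref is assigned, ordinary DATA steps and procedures against snow.* tables will use these buffer settings.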

 

If you are performing CREATE TABLE AS processing, you will absolutely want to use DBIDIRECTEXEC. The same goes for INSERT AS SELECT, UPDATE, and DELETE. You can read about it in the SAS/ACCESS documentation.
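As a rough illustration (the libref and table names are made up), enabling the option lets PROC SQL pass a CREATE TABLE AS SELECT down to Snowflake instead of pulling the rows back to SAS:

```sas
/* Sketch: enable SQL pass-down; SASTRACE= is optional and just lets you
   confirm in the log that the statement ran inside Snowflake. */
options dbidirectexec sastrace=',,,d' sastraceloc=saslog;

proc sql;
    create table snow.orders_summary as
    select customer_id,
           sum(amount) as total_amount
    from snow.orders
    group by customer_id;
quit;
```

With DBIDIRECTEXEC in effect, both the source and target live in Snowflake, so the whole statement can execute there.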

 

If you have to move large amounts of data (5 GB plus), it may be a good idea to use PROC S3 to move the file to AWS and then use Snowflake's bulk load (COPY INTO) command. This requires creating a JSON file and some explicit pass-through, but it could be worth it. I haven't done this, but I plan to try it once my Snowflake account is unlocked. It appears that Jeff can't remember his password.
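A rough sketch of what that flow might look like. The bucket, stage, credentials, and file names here are all hypothetical; check the PROC S3 and Snowflake COPY INTO documentation before relying on this:

```sas
/* Sketch only: keys, bucket, stage, and file names are hypothetical. */

/* Step 1: push the local file to S3. */
proc s3 keyid="AKIAXXXXXXXX" secret="XXXXXXXX" region=useast;
    put "/local/path/bigfile.csv" "/my-bucket/staging/bigfile.csv";
run;

/* Step 2: load it into Snowflake via explicit pass-through. */
proc sql;
    connect to odbc as snow
        (datasrc="snowflake_dsn" user="myuser" password="XXXXXXXX");
    execute (
        copy into mytable
        from @my_external_stage/bigfile.csv
        file_format = (type = csv)
    ) by snow;
    disconnect from snow;
quit;
```

The pass-through block sends the COPY INTO text to Snowflake verbatim, which is why the stage (@my_external_stage) must already be defined on the Snowflake side to point at the S3 location.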

 

I have run some performance numbers on this but can't find them. I will rerun them and post an article in the SAS Communities library or on an external blog.

 

Best wishes,

Jeff


4 REPLIES 4
curaloco
Fluorite | Level 6

Hi Tom, we are also starting to use the Snowflake cloud data warehousing service. Would you be so kind as to share how you are connecting?

 

Thanks

Mauricio


curaloco
Fluorite | Level 6

Hi @JBailey

 

Opening a Snowflake table in SAS Enterprise Guide 7.15 takes a really long time. Character variable length in Snowflake seems to be the problem: columns are created by default with a length of VARCHAR(16777216).

We have tried creating a SAS view, which solves the speed problem, but it requires a manually intensive process to determine the right length of each character variable, and it only works after the Snowflake table has been completely scanned, which can take a long time. Are there any other approaches, or configuration changes to the ODBC parameters, that could help?
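For illustration, the view workaround looks roughly like this (the libref, table, columns, and lengths are made up):

```sas
/* Sketch of the view workaround: cap each character column's length
   so SAS doesn't allocate 16 MB per VARCHAR column. */
proc sql;
    create view work.v_customers as
    select name   length=50,
           city   length=30,
           status length=10
    from snow.customers;
quit;
```

It may also be worth testing whether a LIBNAME option such as DBMAX_TEXT= can cap the character lengths globally for the connection; the SAS/ACCESS Interface to ODBC documentation covers which options apply.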

JBailey
Barite | Level 11

Hi @curaloco

 

If you don't mind, can you open a new topic? This one has been solved and we don't want to overload it. Be sure to include my user name in the post; that way I will get a notification when the topic shows up.

 

Best wishes,
Jeff


Discussion stats
  • 4 replies
  • 4592 views
  • 7 likes
  • 3 in conversation