Hello,
I am looking to push some datasets from SAS Studio to Snowflake by staging them in AWS S3. I understand that SAS/ACCESS Interface to ODBC or SAS/ACCESS Interface to Snowflake would be the available options.
If I use SAS/ACCESS Interface to ODBC, does an admin need to do any configuration on the system? How is the performance when loading data into Snowflake from SAS, and can we leverage Snowflake's bulk loading feature with this?
If I use SAS/ACCESS Interface to Snowflake, tentatively what additional costs would be incurred for this product? And this one does support bulk loading into Snowflake - can you please validate my understanding?
Thanks
Accepted Solutions
If you want to write SAS datasets to Snowflake then you need a SAS interface that allows you to connect to Snowflake. That's either SAS/ACCESS to ODBC or SAS/ACCESS to Snowflake. Using ODBC won't allow for bulk loading. You will of course also need an admin to do the configuration so you can actually connect to your Snowflake instance from SAS.
If you want to load via S3 then you would write your data to S3 first. You would need to convert your SAS tables to a format that Snowflake can read without SAS as the "middle man". Parquet is a common format, and if you've got a very current SAS Viya version then you could create such Parquet files with SAS - else use some other format that SAS can create and Snowflake can read and load, such as CSV.
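With SAS/ACCESS to Snowflake, a direct load could look roughly like the sketch below. This is a hedged example: the server, warehouse, database, and stage names are placeholders you would replace with your own, and the exact set of connection and bulk-load options available depends on your SAS release.

```sas
/* Assign a library pointing at Snowflake (placeholder connection values) */
libname snow snowflake
   server="myaccount.snowflakecomputing.com"
   user="&sysuserid" password="&pw"
   database=MYDB schema=PUBLIC warehouse=MYWH;

/* BULKLOAD=YES stages the data and uses Snowflake's COPY INTO
   under the covers instead of row-by-row inserts */
data snow.class (bulkload=yes);
   set sashelp.class;
run;
```

Without BULKLOAD=YES the same DATA step would still work, but rows get inserted individually, which is much slower for anything beyond small tables.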
This SASGF paper should answer most of your questions: An Insider's Guide to SAS/ACCESS® Interface to Snowflake
I assume your SAS version is 9.4 and not SAS Viya. SAS/ACCESS to Snowflake came out with 9.4M6. If you are on an earlier maintenance level then you would be stuck with SAS/ACCESS to ODBC. You would need to discuss pricing with your SAS account manager.
We have SAS/ACCESS Interface to ODBC and MySQL, version 9.4M6.
So this supports SAS/ACCESS Interface to Snowflake too, right?
If you configure SAS/ACCESS Interface to ODBC for use with Snowflake it will work, but bulk loading isn't available. SAS/ACCESS Interface to Snowflake is a separate product that has to be installed and licensed.
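For the ODBC route, the connection could be assigned as in the sketch below. This assumes your admin has already installed the Snowflake ODBC driver and defined a DSN; the DSN name here is a placeholder.

```sas
/* SAS/ACCESS Interface to ODBC via a pre-configured Snowflake DSN
   ("SnowflakeDSN" is a hypothetical name set up by your admin) */
libname snowodbc odbc
   dsn="SnowflakeDSN"
   user="myuser" password="&pw";

/* Works, but loads row by row - no bulk loading through ODBC */
data snowodbc.class;
   set sashelp.class;
run;
```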
If you want to push data to an S3 bucket for Snowflake to read, you just need write access to the S3 bucket from SAS and read access from Snowflake. You could copy the files using any tool you have. If you want to copy them with SAS code then you could use PROC S3. Or, if you have XCMD enabled in your SAS session, you could call operating system commands, such as the AWS command line interface, to copy the file(s).
You could then use whatever interface to Snowflake you want to trigger the command that reads from the files.
If you want to submit the SQL that makes Snowflake read the transferred files from within SAS, you can use either SAS/ACCESS to ODBC or SAS/ACCESS to Snowflake. Both will probably require that you install something to get the connection to work.
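The two steps above could be sketched in SAS roughly as follows. This is an illustrative outline only: the bucket, stage, table, and credential values are all placeholders, and the exact PROC S3 and pass-through options depend on your SAS release and how authentication to AWS and Snowflake is set up at your site.

```sas
/* Step 1: copy a local file to S3 with PROC S3
   (keys/region/bucket are placeholder values) */
proc s3 keyid="MY_ACCESS_KEY_ID"
        secret="MY_SECRET_ACCESS_KEY"
        region="useast1";
   put "/sasdata/class.csv" "/my-bucket/staging/class.csv";
run;

/* Step 2: tell Snowflake to load the staged file, using
   explicit SQL pass-through (external stage name is hypothetical
   and must already be defined in Snowflake over that bucket) */
proc sql;
   connect to snowflake
      (server="myaccount.snowflakecomputing.com"
       user="myuser" password="&pw"
       database=MYDB warehouse=MYWH);
   execute (
      copy into public.class
      from @my_s3_stage/class.csv
      file_format = (type = csv skip_header = 1)
   ) by snowflake;
   disconnect from snowflake;
quit;
```

The same COPY INTO statement could equally be run from the Snowflake UI or SnowSQL if you prefer not to drive it from SAS.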
To use PROC S3 I guess I may need an IAM role or access/secret keys.
Yes, I want to store the data in Snowflake.
Thanks
Thank you @Patrick.
We do not have SAS Viya. We have SAS/ACCESS Interface to SQL and ODBC, custom version 9.4M6.