Hello,
I am looking to push some datasets residing in SAS Studio to Snowflake by staging them in AWS S3. I understand
SAS/ACCESS Interface to ODBC or SAS/ACCESS Interface to Snowflake would be the available options.
If I use SAS/ACCESS Interface to ODBC, does the admin need to do any configuration on the system?
How is the performance when loading data into Snowflake from SAS, and can we leverage Snowflake's bulk loading feature with this?
If I use SAS/ACCESS Interface to Snowflake, tentatively what additional costs would be incurred for this product?
And this does support bulk loading into Snowflake, correct? Please validate my understanding.
Thanks
If you want to write SAS datasets to Snowflake then you need the SAS interface that allows you to connect to Snowflake. That's either SAS/ACCESS to ODBC or SAS/ACCESS to Snowflake. Using ODBC won't allow for bulk loading. You will of course also need an admin to do the configuration so you can actually connect to your Snowflake instance from SAS.
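As a rough illustration, here's a minimal sketch assuming SAS/ACCESS Interface to Snowflake is licensed (9.4M6 or later); all connection values, the stage path, and the table names are placeholders:

/* Connect to Snowflake via the SNOW libname engine (placeholders throughout) */
libname sf snow
   server="myaccount.snowflakecomputing.com"
   user=myuser password="XXXXXXXX"
   database=mydb schema=public warehouse=mywh;

/* BULKLOAD=YES stages and bulk-loads the rows instead of inserting them
   one by one; BL_INTERNAL_STAGE= (placeholder value) uses a Snowflake
   internal stage for the upload */
data sf.sales_copy (bulkload=yes bl_internal_stage="user/sas_stage");
   set work.sales;
run;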
If you want to load via S3 then you would write your data to S3. You would need to convert your SAS tables to a format that Snowflake can then read without needing SAS as the "middle man". Parquet is a common format, and if you've got a very current SAS Viya version then you could create such Parquet files with SAS; otherwise, use some other format that SAS can create and Snowflake can read and load, such as CSV (see the sketch below).
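For example, on SAS 9.4 (which has no native Parquet writer) CSV is a simple choice that Snowflake loads directly. The dataset name and output path here are placeholders:

/* Export a SAS dataset to CSV so Snowflake can load it without SAS in the middle */
proc export data=work.sales
   outfile="/tmp/sales.csv"
   dbms=csv replace;
run;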
This SASGF paper should answer most of your questions: An Insider’s Guide to SAS/ACCESS® Interface to Snowflake
I assume your SAS version is 9.4 and not SAS Viya. SAS/ACCESS to Snowflake came out with 9.4M6; if you are on an earlier maintenance level then you would be stuck with SAS/ACCESS to ODBC. You would need to discuss pricing with your SAS account manager.
If you configure SAS/ACCESS Interface to ODBC for use with Snowflake, it will work, but bulk loading isn't available. SAS/ACCESS Interface to Snowflake is a separate product that has to be installed and licensed.
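A sketch of what the ODBC route might look like once your admin has installed the Snowflake ODBC driver and defined a DSN (the DSN name and credentials below are assumptions). Without bulk loading, the INSERTBUFF= data set option at least batches the inserts:

/* Placeholder DSN and credentials; the admin sets these up on the system */
libname sfodbc odbc dsn="SnowflakeDSN" user=myuser password="XXXXXXXX";

/* INSERTBUFF= sends rows in batches rather than one INSERT per row,
   which usually improves throughput when bulk loading isn't available */
data sfodbc.sales_copy (insertbuff=10000);
   set work.sales;
run;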
If you want to push data to an S3 bucket for Snowflake to read, you just need write access to the bucket from SAS and read access from Snowflake. You could copy the files using any tool you have. If you want to copy them with SAS code, you could use PROC S3 (sketched below). Or, if you have XCMD enabled in your SAS session, you could call operating system commands, such as the AWS command line interface, to copy the file(s).
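A hedged PROC S3 sketch (PROC S3 ships with recent SAS 9.4 maintenance releases); the key ID, secret, region keyword, bucket, and paths are all placeholders:

proc s3 keyid="AKIAXXXXXXXX" secret="XXXXXXXX" region=useast;
   /* PUT copies the local file to the S3 bucket, given in "/bucket/key" form */
   put "/tmp/sales.csv" "/mybucket/staging/sales.csv";
run;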
You could then use whichever interface to Snowflake you prefer to trigger the command that reads from the files.
If you want to push the SQL code from SAS into Snowflake to read the transferred file, you can use either SAS/ACCESS to ODBC or SAS/ACCESS to Snowflake. Both will probably require that you install something to get the connection to work.
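For illustration, a sketch of explicit SQL pass-through via the ODBC engine (since that's what is licensed here); the DSN, stage, target table, and file format are assumptions your Snowflake admin would define:

proc sql;
   connect to odbc as sf (dsn="SnowflakeDSN" uid=myuser pwd="XXXXXXXX");
   /* EXECUTE sends the statement to Snowflake as-is, so COPY INTO runs
      inside Snowflake against the file staged in S3 */
   execute (
      copy into mydb.public.sales
      from @my_s3_stage/sales.csv
      file_format = (type = csv skip_header = 1)
   ) by sf;
   disconnect from sf;
quit;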
Thank you @Patrick.
We do not have SAS Viya. We have SAS/ACCESS Interface to SQL and ODBC, custom version 9.4M6.