Hi Everyone,
I am trying to use SAS/ACCESS Interface to Snowflake to bulk load data from SAS into a Snowflake database.
I am on SAS 9.4 M7.
My problem is that I can't find how to specify server-side encryption (SSE-S3, i.e. AES-256) in my bulk load options.
If I copy a file directly to S3 using the AWS CLI, the command I run is
aws s3 cp /localpath/file s3://bucket/file --sse AES256
but when I try to use the bulk load option I get an access denied error:
ERROR: Message from TKS3: Access Denied
proc append base=snow.snowflake_table(bulkload=yes
                                      bl_bucket="/mybucket")
            data=work.sasDataset force;
run;
I even tried PROC S3: I can list the objects in the bucket, but when I use PUT to transfer a file from local disk to S3, it fails with the same access denied error.
proc s3;
   put "/local/file.csv" "/bucket/file.csv";
   list "/bucket";
run;
If anyone has suggestions on how I can get the encryption to work, or whether I need to change something in my approach, please do guide me 🙂
S3 is an object store and not a database, so it's certainly PROC S3 and not bulk loading that you need to use.
Does your config file already have all the necessary information? Consult the documentation here and also what @Tom wrote here.
Addition
I missed that you're actually trying to load into Snowflake, so I guess S3 only comes into the picture because it's normally the staging area. For loading into Snowflake you can of course use bulk load, and according to the documentation here, if you use bl_internal_stage= you don't need to stage in S3 at all.
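As a rough sketch only (assuming BL_INTERNAL_STAGE= is available in your SAS/ACCESS Interface to Snowflake release; the stage path, libref, and data set names below are placeholders, so check the documentation for the exact value syntax):

/* Bulk load through a Snowflake internal stage instead of an S3 bucket */
proc append base=snow.snowflake_table(bulkload=yes
                                      bl_internal_stage="user/my_stage")  /* placeholder stage path */
            data=work.sasDataset force;
run;

With an internal stage the staged files live in Snowflake-managed storage, so the S3 server-side-encryption question shouldn't come up at all.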
@Patrick,
My knowledge of Snowflake is limited right now. Can an S3 bucket be used to create a Snowflake internal stage?
Yes, you are right, my main objective is to bulk load into Snowflake.
That's when I first got the error, and to debug the issue I tried using PROC S3 to transfer a file to S3.
I think my access is correct because I can use PROC S3 to list the bucket items, and I was also able to use the AWS CLI to transfer a file from the SAS server to the S3 bucket.
The only catch was that I had to specify the SSE option in the CLI copy command, and I couldn't find how to specify that for PROC S3 or the Snowflake bulk load.
@SASKiwi My LIBNAME is below. Please note that my Snowflake connection details are part of the ODBC DSN.
libname snow sasiosnf dsn='MySnowDSN' schema="Dev_Schema";

proc append base=snow.MYSNOWFLAKETABLE(bulkload=yes
                                       bl_bucket="mybucket")
            data=work.large_data force;
run;
libname snow clear;
Does your LIBNAME work at all? Does it assign successfully?
If it is successful, try PROC CONTENTS to see if it lists any tables.
If this is all working OK, then try the BL_USE_SSL=YES data set option.
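In the context of your earlier PROC APPEND, that would look something like the sketch below (assuming the Snowflake engine accepts BL_USE_SSL=; the table, bucket, and data set names are taken from your post):

/* Same bulk-load append as before, with BL_USE_SSL= added */
proc append base=snow.MYSNOWFLAKETABLE(bulkload=yes
                                       bl_bucket="mybucket"
                                       bl_use_ssl=yes)
            data=work.large_data force;
run;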
Looks like you have used a LIBNAME SNOW to assign the Snowflake data connection. Can you post that please?
@SASKiwi
Yes, the LIBNAME works because I am able to append to the Snowflake table if I don't use the bulkload=yes option.
I tried the SSL option as well but still get the same ERROR: Message from TKS3: Access Denied.
I shall reach out to SAS Support; something could be wrong with the user permissions.
@NN - Yes, that's the best approach. I checked the Snowflake documentation and couldn't find anything else relevant. I'm interested as we might be using Snowflake ourselves in the near future.