Bulkload with Redshift


I am trying to load data to Redshift with bulkload. I have the required information, but one difference: we are using an IAM role with the S3 buckets, so no access key and secret key are used.


Per the SAS bulkload requirements, it seems an access key and secret key are needed. Has anyone out there tried loading data to Redshift with bulkload using an IAM role, that is, without an access key and secret key?


Here is the sample DATA step I am using. The error complains about the region, but the region is correct. I suspect the real issue is the IAM role, since bulkload is expecting a key.


data awswrite.myclass4 (bulkload=yes
   bl_bucket='mybucket/subfolder');
   set sashelp.class;
run;



ERROR: Message from TKS3: Unknown region.

NOTE: The DATA step has been abnormally terminated.

NOTE: The SAS System stopped processing this step because of errors.

NOTE: There were 1 observations read from the data set SASHELP.CLASS.

WARNING: The data set AWSWRITE.MYCLASS4 may be incomplete.  When this step was stopped there were 0 observations and 5 variables.



Total SQL prepare seconds were:                     0.003308

Total seconds used by the REDSHIFT ACCESS engine were     0.097987


ERROR: ROLLBACK issued due to errors for data set AWSWRITE.MYCLASS4.DATA.

NOTE: DATA statement used (Total process time):



Re: Bulkload with Redshift

Just learned from SAS Technical Support: the IAM role is not supported yet, and SAS 9.4 M4 doesn't support a secondary bucket in bulkload. No subdirectories are allowed, only the parent bucket. So in my example it should simply be as shown below.
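A minimal sketch of the corrected step under that constraint. The bucket name here is a placeholder, and I am assuming the same BULKLOAD= and BL_BUCKET= data set options used in the original step:

```sas
/* Bulkload via S3 staging: parent bucket only, no subdirectory      */
/* (SAS 9.4 M4 limitation noted by SAS TS).                          */
/* 'mybucket' is a placeholder; substitute your own S3 bucket name.  */
data awswrite.myclass4 (bulkload=yes
   bl_bucket='mybucket');
   set sashelp.class;
run;
```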


