Bulkload with Redshift

Contributor
Posts: 45

I am trying to load data to Redshift with bulkload. I have all the required connection information; the only difference is that we use an IAM role with our S3 buckets, so no access key or secret key is involved.

Per the SAS bulkload requirements, it seems an access key and secret key are needed. Has anyone out there tried loading data to Redshift with bulkload using an IAM role, i.e. without an access key and secret key?

 

Here is the sample step I am using. The error complains about the region, but the region is correct; I am fairly sure the real issue is the IAM role, since bulkload expects a key.

 

data awswrite.myclass4(
   bulkload=yes
   bl_bucket='s3://xxx-aws-dev/saswork/'
   bl_key=99999
   bl_secret='XXXXXXXXXXXXXXX'
   bl_default_dir='/tmp'
   bl_region='us-east-1');
   set sashelp.class;
run;

 

ERROR: Message from TKS3: Unknown region.
NOTE: The DATA step has been abnormally terminated.
NOTE: The SAS System stopped processing this step because of errors.
NOTE: There were 1 observations read from the data set SASHELP.CLASS.
WARNING: The data set AWSWRITE.MYCLASS4 may be incomplete.  When this step was stopped there were 0 observations and 5 variables.

 

 

Total SQL prepare seconds were:                     0.003308
Total seconds used by the REDSHIFT ACCESS engine were     0.097987

ERROR: ROLLBACK issued due to errors for data set AWSWRITE.MYCLASS4.DATA.
NOTE: DATA statement used (Total process time):

 

Contributor
Posts: 45

Re: Bulkload with Redshift

Just learned from SAS TS: IAM roles are not supported yet, and SAS 9.4 M4 doesn't support a subdirectory within the bucket in bulkload. BL_BUCKET should reference the parent bucket only, with no subdirectory. So in my example it should simply be as shown below.

 

    bl_bucket=xxx-aws-dev
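
Putting that together, the corrected step would look like the sketch below. This is only an illustration based on the fix above: the bucket name has no s3:// prefix and no subdirectory, and BL_KEY/BL_SECRET are still required (the values here are placeholders, since IAM-role authentication is not yet supported).

```sas
/* Sketch of the corrected bulkload step: parent bucket only, no s3:// prefix
   or subdirectory. Key and secret values are placeholders. */
data awswrite.myclass4(
   bulkload=yes
   bl_bucket=xxx-aws-dev           /* parent bucket only, no subdirectory */
   bl_key='XXXXXXXXXXXXXXX'        /* access key still required */
   bl_secret='XXXXXXXXXXXXXXX'     /* secret key still required */
   bl_default_dir='/tmp'
   bl_region='us-east-1');
   set sashelp.class;
run;
```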
