
Bulkload with Redshift

Contributor
Posts: 45

Bulkload with Redshift

Does anyone know what this error is about?

I am trying to use SAS bulkload to write a data set to Redshift. It is able to create the table, but it throws "ERROR: Message from TKS3: Access Denied" at the end, so the table is not actually written to Redshift.

NOTE: There were 10 observations read from the data set SASHELP.CLASS.
NOTE: The data set AWSWRITE.RDBULKTEST3 has 10 observations and 5 variables.
REDSHIFT: Bulkload seconds used for writes: 0.000179
REDSHIFT: Bulkload seconds used for file close: 0.001464
REDSHIFT: Bulkload seconds used for file delete: 0.163959
REDSHIFT: Bulkload total seconds used: 0.167681

Summary Statistics for REDSHIFT are:
Total SQL execution seconds were: 0.093207
Total SQL prepare seconds were: 0.001904
Total seconds used by the REDSHIFT ACCESS engine were 0.263587

ERROR: Message from TKS3: Access Denied
NOTE: DATA statement used (Total process time):
real time 0.32 seconds

Valued Guide
Posts: 589

Re: Bulkload with Redshift

You may not have access to create tables, or you may be supplying the wrong credentials. Are you using code like the below? If you are doing something different, please post your code.

options sastrace=',,,ds' sastraceloc=saslog nostsuffix;

libname libred redshift server=rsserver db=rsdb user=myuserID pwd=myPwd port=5439;

data libred.myclass(
   bulkload=yes
      bl_bucket=myBucket
      bl_key=99999
      bl_secret=12345
      bl_default_dir='/tmp'
      bl_region='us-east-1');
   set sashelp.class;
run;
Thanks,
Suryakiran
Contributor
Posts: 45

Re: Bulkload with Redshift

Posted in reply to SuryaKiran

I am using the same code with the correct access key and secret access key, and I do have access to create tables and write to the bucket. I verified this by writing to the bucket from the server and by writing a table to the Redshift schema.

Contributor
Posts: 45

Re: Bulkload with Redshift

Posted in reply to SuryaKiran

Here is my code, in case it makes any difference.

options sastrace=',,,ds' sastraceloc=saslog nostsuffix;

LIBNAME AWSWRITE SASIORST DATABASE="midasdr-test2-rs" SERVER="XXXXXXXXXXXXXXXXXX"
SCHEMA=saswork USER=saswork PASSWORD="XXXXXX" PORT=5439;

data AWSWRITE.rdbulktest3(
bulkload=yes
bl_bucket='maxuwjna'
bl_key='XXXXXXXXXXXXXXXXXXXXX'
bl_secret='XXXXXXXXXXXXXXXXXXX'
bl_default_dir='/tmp'
bl_region='us-east-1'
bl_use_ssl=yes);
set sashelp.class(obs=10);
run;

Contributor
Posts: 45

Re: Bulkload with Redshift

The issue was the keys. The keys belonged to a different user than the user ID I was using in my LIBNAME. Make sure your AWS admin provides keys for the user you are using for the SAS-to-Redshift connection, since that ID is used to copy the data from the S3 bucket to the Redshift schema.
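For context on why the key owner matters, here is a hedged sketch of the load path (this is not SAS internals; `build_copy_statement` is a hypothetical helper, and the key values are placeholders). With BULKLOAD=YES the engine stages the data as a file in the S3 bucket and then has Redshift run a COPY using the BL_KEY/BL_SECRET pair, so those keys must belong to an IAM identity that can read the staged file:

```python
# Illustrative sketch of the kind of COPY command Redshift runs against the
# staged S3 file during a bulkload. build_copy_statement is a hypothetical
# helper for illustration, not part of SAS or the AWS SDK.
def build_copy_statement(table, bucket, key_id, secret, region="us-east-1"):
    """Return a Redshift COPY command that loads a staged S3 file into `table`."""
    return (
        f"COPY {table} "
        f"FROM 's3://{bucket}/{table}.dat' "
        f"CREDENTIALS 'aws_access_key_id={key_id};aws_secret_access_key={secret}' "
        f"REGION '{region}'"
    )

# If key_id/secret belong to an IAM user without read access to the bucket,
# this is the step that fails with Access Denied, even though SAS itself
# already managed to stage the file.
print(build_copy_statement("saswork.rdbulktest3", "maxuwjna",
                           "AKIAEXAMPLE", "secretEXAMPLE"))
```

The point is that two identities are in play: the one SAS uses to write the staged file, and the one Redshift uses in the COPY step, and the BL_KEY/BL_SECRET pair must work for the latter.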
