Matt
Quartz | Level 8

Does anyone know what this error is about?

I am trying to use the SAS bulkload facility to write a data set to Redshift. The data set is created, but the step throws "ERROR: Message from TKS3: Access Denied" at the end, so the table is never written to Redshift.

NOTE: There were 10 observations read from the data set SASHELP.CLASS.
NOTE: The data set AWSWRITE.RDBULKTEST3 has 10 observations and 5 variables.
REDSHIFT: Bulkload seconds used for writes: 0.000179
REDSHIFT: Bulkload seconds used for file close: 0.001464
REDSHIFT: Bulkload seconds used for file delete: 0.163959
REDSHIFT: Bulkload total seconds used: 0.167681

Summary Statistics for REDSHIFT are:
Total SQL execution seconds were: 0.093207
Total SQL prepare seconds were: 0.001904
Total seconds used by the REDSHIFT ACCESS engine were 0.263587

ERROR: Message from TKS3: Access Denied
NOTE: DATA statement used (Total process time):
real time 0.32 seconds

SuryaKiran
Meteorite | Level 14

You may not have access to create tables, or you're supplying the wrong credentials. Are you using code like the example below? If you're doing something different, please post your code.

options sastrace=',,,ds' sastraceloc=saslog nostsuffix;

libname libred redshift server="rsserver" db="rsdb"
        user="myuserID" pwd="myPwd" port=5439;

data libred.myclass(
      bulkload=yes
      bl_bucket='myBucket'
      bl_key='myAccessKeyID'
      bl_secret='mySecretAccessKey'
      bl_default_dir='/tmp'
      bl_region='us-east-1');
   set sashelp.class;
run;
Thanks,
Suryakiran
Matt
Quartz | Level 8

I am using the same code with the correct access key and secret access key, and I do have access to create tables and write to the bucket - this was verified by writing a file to the bucket from the server and by writing a (non-bulkload) table to the Redshift schema.

Matt
Quartz | Level 8

Here is my code if that makes any difference.

options sastrace=',,,ds' sastraceloc=saslog nostsuffix;

LIBNAME AWSWRITE SASIORST DATABASE="midasdr-test2-rs" SERVER="XXXXXXXXXXXXXXXXXX"
SCHEMA=saswork USER=saswork PASSWORD="XXXXXX" PORT=5439;

data AWSWRITE.rdbulktest3(
      bulkload=yes
      bl_bucket='maxuwjna'
      bl_key='XXXXXXXXXXXXXXXXXXXXX'
      bl_secret='XXXXXXXXXXXXXXXXXXX'
      bl_default_dir='/tmp'
      bl_region='us-east-1'
      bl_use_ssl=yes);
   set sashelp.class(obs=10);
run;
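
For reference, the access checks mentioned in my earlier reply were roughly along these lines - a minimal sketch, not my exact code; the file paths are placeholders, and PROC S3's region naming ('useast' for us-east-1) should be verified against the documentation:

/* Check 1: can this key pair write to the staging bucket?
   (placeholder paths; keys masked as in the code above) */
proc s3 keyid='XXXXXXXXXXXXXXXXXXXXX'
        secret='XXXXXXXXXXXXXXXXXXX'
        region='useast';
   put '/tmp/s3test.txt' '/maxuwjna/s3test.txt';
run;

/* Check 2: can the libname user create a table when bulkload is off? */
data AWSWRITE.plaintest;
   set sashelp.class(obs=5);
run;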

Matt
Quartz | Level 8

The issue was the keys. The keys belonged to a different user than the user ID I was using in my libname. Make sure your AWS admin provides the keys for the user that you are using for the SAS-to-Redshift connection, since that ID is what is used to copy the data from the S3 bucket into the Redshift schema.
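
In other words, the bulkload facility stages the data in the S3 bucket and Redshift then copies it out using the BL_KEY/BL_SECRET pair, so that pair has to belong to the right user. A quick sanity check - again just a sketch, with the same masked placeholders as above - is to list the bucket with exactly the keys you pass to the bulkload:

/* If this LIST fails with Access Denied, the bulkload COPY will too.
   ('useast' is assumed here as PROC S3's name for us-east-1.) */
proc s3 keyid='XXXXXXXXXXXXXXXXXXXXX'
        secret='XXXXXXXXXXXXXXXXXXX'
        region='useast';
   list '/maxuwjna';
run;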
