SAS accessing large table names on Amazon redshift

New Contributor
Posts: 3

Hello,

Has anyone dealt with how to reference table names longer than 32 characters on Redshift through proc sql?

Thanks.

Susan

Base SAS 9, using ODBC to connect to Amazon Redshift via proc sql and a libname statement.

proc sql;
  create table nutslp as
  select blah
  from prod.member m
  inner join prod.member_activities_by_activitytype ma
    on m.action_type = ma.activitytype
  ;
quit;


partial log below:
637  inner join prod.member_activities_by_activitytype ma
                     ---------------------------------
                     65
ERROR 65-58: Name 'MEMBER_ACTIVITIES_BY_ACTIVITYTYPE' is too long for a SAS name in this context.






All Replies
Super User
Posts: 5,255

Re: SAS accessing large table names on Amazon redshift

The same as with any other RDBMS with names longer than 32 characters: either have views created for you with adjusted (shorter) names, or use explicit SQL pass-through.
Data never sleeps
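For the first of those options, here is a minimal sketch of what the view route could look like (the view name member_act_by_type is a placeholder, the connection options are the same placeholders used later in this thread, and it assumes you have CREATE VIEW rights on the prod schema):

proc sql;
  connect to ODBC as myred (datasrc='bbb' user=me password=ppp);
  /* send a CREATE VIEW to Redshift so the table gets a name short enough for SAS */
  execute (
    create view prod.member_act_by_type as
    select * from prod.member_activities_by_activitytype
  ) by myred;
  disconnect from myred;
quit;

/* the shorter view name can now be used through the libname engine */
proc sql;
  create table nutslp as
  select blah
  from prod.member m
  inner join prod.member_act_by_type ma
    on m.action_type = ma.activitytype;
quit;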
New Contributor
Posts: 3

Re: SAS accessing large table names on Amazon redshift

Thanks, but the explicit SQL pass-through did not seem to work, unless I coded it wrong. I got the same message as before.

proc sql;
connect to ODBC as myred (datasrc='bbb' user=me password=ppp ); 
create table nutslp as 
select *
from member m 
inner join member_activities_by_activitytype ma
	on m.action_type = ma.activitytype 
;
quit;
Solution
03-10-2016 05:11 PM
Super User
Posts: 3,101

Re: SAS accessing large table names on Amazon redshift

You've coded it wrong:

 

proc sql;
  connect to ODBC as myred (datasrc='bbb' user=me password=ppp);
  create table nutslp as
  select *
  from connection to myred
    (select *
     from member m
     inner join member_activities_by_activitytype ma
       on m.action_type = ma.activitytype
    )
  ;
quit;
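What makes this version work is that everything inside the parentheses after connection to myred is sent to Redshift untouched, so SAS's 32-character name limit never comes into play there; only the result set returned to SAS has to follow SAS naming rules. In the earlier attempt the join was written in SAS's own SQL (despite the CONNECT statement), which is where the limit is enforced. Adding a disconnect from myred; before quit; is also good practice, although SAS closes the connection when the procedure ends.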
New Contributor
Posts: 3

Re: SAS accessing large table names on Amazon redshift

Thank you SASKiwi, worked like a charm :-)

Super User
Posts: 3,101

Re: SAS accessing large table names on Amazon redshift

Glad to be of service! :-)

Contributor
Posts: 29

Re: SAS accessing large table names on Amazon redshift

Hi, I have a similar problem.

I just moved to a new company where we don't have a scratch DB on the SQL server. Previously I would upload my table with the primary key to the scratch DB, then connect to the DB and execute SQL on the server to join my table, then pull the result down to my local machine. However, as I mentioned, we don't have a scratch DB here.

 

I tried the above example, but it still gives me an error... I hope you can help:

 

/*example*/
data tableA (drop= i);
  do i = 1 to 10;
    sys_ID = i;
    output;
  end;
run;

proc sql;
  connect to odbc as imaging (dsn=imaging);
  create table tableB as
  select *
  from connection to imaging
    (select *
     from work.tableA as a
     left join imaging.dbo.AdviceSheetAdditionalRequirements b
       on a.sys_ID = b.sys_ID
    )
  order by a.sys_ID
  ;
  disconnect from odbc;
quit;

 

 

The Log window says:

ERROR: CLI describe error: [Microsoft][ODBC SQL Server Driver][SQL Server]Invalid object name
'work.tableA'. : [Microsoft][ODBC SQL Server Driver][SQL Server]Statement(s) could not be
prepared.
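The error happens because everything inside the pass-through parentheses is executed by SQL Server, and SQL Server knows nothing about SAS WORK tables, so work.tableA is an invalid object name to it. One workaround, sketched below on the assumption that the long-named table (or just the columns you need from it) is small enough to pull across, is to fetch it into WORK via pass-through and then do the join in SAS. The table name adv_req is a placeholder:

proc sql;
  connect to odbc as imaging (dsn=imaging);

  /* pull the long-named SQL Server table into WORK via pass-through;
     list only the columns (and rows) you actually need if it is large */
  create table work.adv_req as
  select *
  from connection to imaging
    (select sys_ID   /* plus any other columns needed for the result */
     from imaging.dbo.AdviceSheetAdditionalRequirements);

  disconnect from imaging;

  /* join locally in SAS, where work.tableA is visible */
  create table tableB as
  select *   /* list columns explicitly to avoid a duplicate sys_ID warning */
  from work.tableA as a
  left join work.adv_req as b
    on a.sys_ID = b.sys_ID
  order by a.sys_ID;
quit;

If the server table is too big to pull down, the other route is the one suggested earlier in this thread: ask a DBA to create a view of it with a name of 32 characters or fewer and read that view through a libname.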

 

 

 
