🔒 This topic is solved and locked. Need further help from the community? Please sign in and ask a new question.
A_SAS_Man
Pyrite | Level 9

The force option does not work either, I am guessing this ties back to my original issue of not being able to create temp tables on this database server.

hashman
Ammonite | Level 13

@A_SAS_Man:

If I were in your shoes (and I've been in shoes of this kind), I'd first find out how temporary tables, local and global, are created in the database you are working with. For example, read these:

 

https://blog.sqlauthority.com/2017/10/18/sql-server-create-table-variable-temporary-table/  

https://blog.sqlauthority.com/2017/10/21/sql-server-create-global-temporary-table/

 

Furthermore, SAS supports RDBMS temporary tables via SAS/Access:

 

https://documentation.sas.com/?docsetId=acreldb&docsetTarget=n0irpkyp22l7vzn1il9lx6f4wmx9.htm&docset... 

 

There may be a specific piece of SAS/ACCESS syntax you need to use. For example, in DB2 (and, I believe, Oracle), global temp tables are enabled using SCHEMA=SESSION. Contact SAS Technical Support and ask: "How do I create and access an MS SQL Server temporary table in my SAS session? What are the LIBNAME statement specifics to make it happen?"
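As a sketch of what such LIBNAME specifics can look like (the DSN and libref names below are assumptions for illustration, not from this thread):

```sas
/* DB2/Oracle: SCHEMA=SESSION exposes global temporary tables;
   CONNECTION=GLOBAL keeps them on one shared connection. */
libname db2tmp db2 schema=SESSION connection=global;

/* SQL Server via ODBC: DBMSTEMP=YES makes the libref create
   temporary (#) tables; CONNECTION=SHARED keeps them visible
   to later queries on the same connection. */
libname mstmp odbc dsn=MyWarehouse connection=shared dbmstemp=yes;
```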

 

Kind regards

Paul D. 

hashman
Ammonite | Level 13

@A_SAS_Man:

The "normal" way to do it:

  1. Create a global temporary table (GTT) on the SQL server side with ID as a key
  2. Upload your SAS data set with the list of IDs to GTT, preferably using bulk load
  3. Run an explicit pass-thru query server side with: 

 

where a in (select id from <your GTT>)
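Put together, the three steps might look like this in SAS, assuming SQL Server reached via ODBC; the DSN, libref, and table names are illustrative only:

```sas
/* 1. Shared connection on which DBMSTEMP=YES creates #temp tables */
libname gtt odbc dsn=MyWarehouse connection=shared dbmstemp=yes;

/* 2. Upload the SAS data set holding the IDs (bulk load if licensed) */
data gtt.idlist;
  set work.idlist;
run;

/* 3. Explicit pass-through query reusing the same connection */
proc sql;
  connect using gtt;
  create table work.result as
  select * from connection to gtt
    ( select *
      from dbo.big_table
      where a in (select id from #idlist)
    );
  disconnect from gtt;
quit;
```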

 

This way, you're limited only by the permissible size of the GTT. It may be limited since it's allocated in your personal table space, but it's surely large enough to accommodate far more than 2**16 rows. One snag that does happen is that your particular installation doesn't allow user GTTs - I've seen it happen. If this should be the case, write code to break the IN clause into chunks of allowable size placed into a series of macro variables, and then execute:

where a in (&chunk1) or a in (&chunk2) ... or a in (&chunkN) 

Alternatively, write code to create a temp SAS table from each chunk using a separate query and append each chunk to the needed output file used as the base. The latter works especially well when A is indexed on the server side. We used to write code generating code that did exactly that 25+ years ago against DB2 on the mainframe, when machine resources were more limited than now by a couple of orders of magnitude. When there's a need, there's a way.
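A sketch of the chunking fallback: pack the IDs into macro variables &chunk1-&chunkN of at most &size items each. The variable names, the chunk size, and a character-valued ID are all assumptions for illustration:

```sas
%let size = 1000;

data _null_;
  length chunk $32000;         /* assumes a chunk fits in 32K chars */
  retain chunk ' ' n 0 k 1;
  set work.idlist end=last;
  chunk = catx(',', chunk, quote(strip(id)));
  n + 1;
  if n = &size or last then do;
    call symputx(cats('chunk', k), chunk);
    k + 1;
    n = 0;
    chunk = ' ';
  end;
  if last then call symputx('nchunks', k - 1);
run;
```

A macro %DO loop over 1 to &nchunks can then assemble the "where a in (&chunk1) or a in (&chunk2) ..." predicate for the pass-through query.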

 

Kind regards

Paul D. 

Patrick
Opal | Level 21

@hashman 

This bit....

where a in (&chunk1) or a in (&chunk2) ... or a in (&chunkN) 

... won't work for the OP because the SQL code would exceed the 64 KB limit.

hashman
Ammonite | Level 13

@Patrick:

In commercial RDBMSs the limit is not on the SQL statement as a whole but on the number of items in the IN operator list. I've seen working Oracle queries spanning tens of pages - certainly way over a 64 KB limit.

ChrisNZ
Tourmaline | Level 20

@hashman Quite.

And my macro splits the list of values into several IN lists separated by the OR operator, with no limit on the overall length. The default is 1000 values per IN list, which is (or used to be) the limit for Oracle.

A_SAS_Man
Pyrite | Level 9

I want to thank everyone who contributed to this thread for all your help. I recognize this was a difficult problem, with me being unable to share exact code and you all not being familiar with our warehouse structure, but I really appreciate everyone working with me. After all my testing, there are two options which have had some level of success, with a few snags I'm still trying to work through.

 

1. @ChrisNZ with the macro to split the IN statement into multiple groups. This one is working right now for small numbers of values; I seem to be running into a problem where it gets exponentially slower as I get above 30k values, but it is working.

 

2. @Patrick with the LIBNAME statement to connect to the database rather than a CONNECT TO statement. This is working for simpler queries; I'm still attempting to scale it up to work on slightly more complex ones with a join on the DB side and haven't quite had that work out yet.

 

As soon as I complete my testing and figure out which of these is most efficient/capable in our warehouse system, I will select an answer, and if needed I will edit this post to provide more context. Once again, I am very appreciative of the time everyone put into this, not just @ChrisNZ and @Patrick.

ChrisNZ
Tourmaline | Level 20

@A_SAS_Man Scanning a long list of values for each record is very inefficient.

An upload + inner join is preferable and much faster if possible.
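For illustration, the upload + inner join variant expressed as explicit pass-through (the libref and table names are hypothetical, assuming the ID list was already uploaded to a #temp table on a shared connection):

```sas
proc sql;
  connect using gtt;
  create table work.result as
  select * from connection to gtt
    ( select t.*
      from dbo.big_table t
      inner join #idlist i
        on t.a = i.id          /* indexed join beats a long IN scan */
    );
  disconnect from gtt;
quit;
```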

The macro was written to address the cases where an upload is not possible.

A_SAS_Man
Pyrite | Level 9

Yeah, that's the case I'm dealing with. Unable to upload the data to create a temp table. Thanks again for your help.

Patrick
Opal | Level 21

@hashman 

O.K., got this wrong. The maximum for SQL Server is 65,536 * Network Packet Size as documented here

hashman
Ammonite | Level 13

@Patrick:

Errare humanum est, perseverare diabolicum ;).

 

Thanks for finding the definitive info and for the link.  

 

Kind regards

Paul D.

