05-10-2013 03:42 PM
I have a SAS data set with 1 million rows. I need to join it to Oracle tables but, because of the Oracle 1000-item limitation on the join condition, I wasn't able to extract the information from Oracle.
I tried loading it into an Oracle temp table and doing the join inside Oracle. However, it takes a lot of time to load the table.
Does anyone have any suggestions?
Is there a macro to submit 1000 rows at a time?
Thanks for your time in advance.
05-10-2013 03:48 PM
The Oracle limitation is on 1000 columns, not rows. Can you clarify what you're trying to do? It would be better not to have 1000 columns in general.
05-10-2013 06:24 PM
The issue is that to join information between SAS tables and a database server, SAS has to bring all the information in.
Depending on what you're trying to do, using a format, a hash table, or uploading the table are options. You can search on here; some people have suggested bulk-load options that are faster, but I don't know much about that.
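For the hash-table route mentioned above, here is a minimal sketch. It assumes the Oracle table is small enough to fit in memory and is reachable through a libname `ora`, the large table is `work.big_sas_table`, and the join key is a column `id` (all of these names are made up for illustration):

```sas
/* Pull the Oracle lookup table once into an in-memory hash, */
/* then walk the 1-million-row SAS table a single time and   */
/* look up each key locally, so no join is sent to Oracle.   */
data joined;
    if 0 then set ora.lookup;          /* define host variables for the hash */
    if _n_ = 1 then do;
        declare hash h(dataset: 'ora.lookup');
        h.defineKey('id');
        h.defineData(all: 'yes');
        h.defineDone();
    end;
    set work.big_sas_table;
    if h.find() = 0;                   /* keep matching rows only */
run;
```

The trade-off: this downloads the whole Oracle table once instead of uploading the SAS table, so it only pays off when the Oracle side is the smaller of the two.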
05-12-2013 07:10 PM
How many rows are in your Oracle table? How long does it take to upload your SAS table to Oracle?
I've been able to speed up the loading of Oracle tables by tweaking the INSERTBUFF and DBCOMMIT options (have a look at the documentation for these). You will need to experiment to get the optimal values.
libname upload oracle user=testuser password=testpass path='voyager' dbcommit=10000;
Bulk loading should do it even faster.
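Putting the pieces above together, a hedged sketch of the upload-and-join route might look like the following. The table and column names (`big_sas_table`, `sas_keys`, `big_oracle_table`, `id`) are assumptions for illustration, and BULKLOAD=YES is shown as an optional data set option where your Oracle setup supports it:

```sas
/* Tuned connection: buffer inserts and commit in large batches */
libname upload oracle user=testuser password=testpass path='voyager'
        insertbuff=10000 dbcommit=10000;

/* Push only the join keys up to a scratch Oracle table;    */
/* add bulkload=yes if bulk loading is available to you.    */
data upload.sas_keys /* (bulkload=yes) */;
    set work.big_sas_table(keep=id);
run;

/* Let Oracle do the join and bring back only the result */
proc sql;
    connect to oracle (user=testuser password=testpass path='voyager');
    create table work.result as
    select * from connection to oracle
        (select t.*
           from big_oracle_table t, sas_keys k
          where t.id = k.id);
    disconnect from oracle;
quit;
```

Uploading just the key column keeps the transfer small, and the explicit pass-through ensures the join runs inside Oracle rather than being pulled back into SAS.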