I have a SAS data set with 1 million rows. I need to join it to Oracle tables, but because of Oracle's 1,000-row limitation on the join condition I wasn't able to extract the information from Oracle.
I tried loading it into an Oracle temp table and doing the join inside Oracle. However, it takes a long time to load the table.
Does anyone have any suggestions?
Or does anyone have a macro to submit the rows in batches of 1,000?
Thanks for your time in advance.
The Oracle limitation is on 1,000 columns, not rows. Can you clarify what you're trying to do? It would be better not to have 1,000 columns in general.
Reeza, I'm trying to join a SAS table with 1 million rows to an Oracle table.
The issue is that to join information between SAS tables and a database server, SAS has to bring all the information in.
Depending on what you're trying to do, using a format, a hash table, or uploading the table are options. You can search on here; some people have suggested bulk-load options that are faster, but I don't know much about that.
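For the hash-table route, a minimal sketch in a DATA step might look like the following. This assumes the Oracle side of the join is small enough to fit in memory; the libref, table, and variable names (ora.lookup, id, descr) are made up for illustration.

```
/* Sketch only: load the (smaller) Oracle table into an in-memory hash
   and look up each row of the large SAS data set against it.
   All names here are hypothetical. */
libname ora oracle user=testuser password=testpass path='voyager';

data joined;
   if 0 then set ora.lookup;          /* define hash host variables in the PDV */
   if _n_ = 1 then do;
      declare hash h(dataset: 'ora.lookup');
      h.defineKey('id');              /* join key */
      h.defineData('descr');          /* column(s) to retrieve */
      h.defineDone();
   end;
   set sastable;                      /* the 1-million-row SAS data set */
   if h.find() = 0 then output;       /* keep only matching rows */
run;
```

This reads the Oracle table across the network only once, and the join itself happens locally in SAS.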
Try the LIBNAME option READBUFF=1000 to speed up reads from Oracle.
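For reference, a sketch of that LIBNAME statement, reusing the illustrative connection details from elsewhere in this thread:

```
/* READBUFF= raises the number of rows fetched per read from Oracle;
   the value below is only a starting point to experiment with. */
libname ora oracle user=testuser password=testpass path='voyager'
        readbuff=1000;
```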
How many rows are in your Oracle table? How long does it take to upload your SAS table to Oracle?
I've been able to speed up the loading of Oracle tables by tweaking the INSERTBUFF and DBCOMMIT options (have a look at the documentation for these). You will need to experiment to get the optimal values.
libname upload oracle user=testuser password=testpass path='voyager' dbcommit=10000;
data upload.sastable;
set sastable;
run;
Bulk loading should do it even faster.
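A sketch of what that could look like, using the BULKLOAD= data set option on the upload step (connection details are again illustrative, and Oracle's SQL*Loader must be available to the SAS session):

```
/* BULKLOAD=YES tells SAS/ACCESS to use Oracle's SQL*Loader path
   instead of row-by-row inserts, which is typically much faster
   for large loads. */
libname upload oracle user=testuser password=testpass path='voyager';

data upload.sastable (bulkload=yes);
   set sastable;
run;
```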
Thanks, everyone, for your suggestions. I was able to upload the SAS data set faster using DBLOAD.