Egrodrigues2014
Fluorite | Level 6

Hi experts,

 

We are trying to copy a WORK table (72 million obs) to a PostgreSQL database and it is taking a long time to run. We had a similar issue copying a huge WORK table to a SQL Server database, and we solved it by setting the INSERTBUFF option (32767) at the library level.
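For reference, the SQL Server library is assigned roughly like this; the DSN, schema, and credentials below are placeholders, not our real values:

/* ODBC library to SQL Server with a larger insert buffer.        */
/* "my_sqlserver_dsn" is a placeholder for the DSN in odbc.ini.   */
libname staging odbc
    datasrc="my_sqlserver_dsn"
    schema=dbo
    insertbuff=32767;    /* buffer this many rows per insert */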

Below I share the comparison between copying a work table to SQL Server (which we already have optimized) vs. copying it to the PostgreSQL database:

[screenshot]

 

Postgres execution (STG_PG.PRUEBA3):

[screenshot: Postgres execution log]

SQL Server execution (staging.PRUEBA3):

[screenshot: SQL Server execution log]

At the odbc.ini level this is the configuration for SAS/ACCESS Interface to PostgreSQL:

[screenshot: odbc.ini configuration]

Finally, we executed the same code as before with BULKLOAD, BL_PSQL_PATH, and BL_DELETE_DATAFILE specified as data set options, and the real time decreased:

 

 

[screenshot: data step with the bulk-load data set options]

 

 

[screenshot: SAS log showing the reduced real time]
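In case the screenshots are hard to read, this is a minimal sketch of the kind of step we ran; the psql path and the source table name are placeholders, not our real values:

/* Copy the WORK table into PostgreSQL using the bulk loader.      */
/* '/usr/bin/psql' is a placeholder for the actual psql location.  */
data stg_pg.prueba3 (bulkload=yes
                     bl_psql_path='/usr/bin/psql'
                     bl_delete_datafile=yes);
    set work.prueba3;   /* placeholder name for the 72-million-row WORK table */
run;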

Is there a way to configure these options (BL_PSQL_PATH and BL_DELETE_DATAFILE) at the library level? How can I set them so that bulk loading is always active?
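Roughly, what we would like to be able to write is something like the statement below. We have not confirmed that the POSTGRES engine accepts these as LIBNAME options; that is exactly the question. Connection details are placeholders:

/* NOT verified: bulk-load options stated once at the library level  */
/* so that every load through STG_PG would use them.                 */
libname stg_pg postgres
    server="pg_host" port=5432 database="stage"
    user=xxxx password="xxxx"
    bulkload=yes
    bl_psql_path='/usr/bin/psql'
    bl_delete_datafile=yes;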

 

If anyone has faced this problem before or knows what is missing, please let us know. Any help will be more than welcome.

 

Thanks in advance.

5 REPLIES
Tom
Super User

I would suggest opening a support ticket with SAS so they can walk you through the options available.

 

Not sure why you went to the effort of taking photographs of the text of your SAS code and log and then annotating the text with little red squares.  It would be a lot easier to just copy and paste the actual text.  Plus it would be a lot easier to read.

 

What ENGINE are you using to connect to POSTGRES?  What ENGINE did you use to connect to the other database?  Have you licensed any of the database-specific SAS/ACCESS products from SAS?

Egrodrigues2014
Fluorite | Level 6

Hi @Tom, thanks for the suggestions.

 

I am using the POSTGRES engine to connect to PostgreSQL and the ODBC engine to connect to SQL Server. Both products are licensed.
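For clarity, the PostgreSQL library is assigned roughly like this (connection details are placeholders, not our real values):

/* SAS/ACCESS Interface to PostgreSQL via the POSTGRES engine */
libname stg_pg postgres
    server="pg_host" port=5432
    database="stage" user=xxxx password="xxxx";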

LinusH
Tourmaline | Level 20

Performance is quite hard to tackle.

You want to isolate the different system components.

For instance, can you do isolated (local) bulk loads in each DB? That will tell you whether this is a DB-specific problem or not.

Then there's networking.

Also, is there a way to verify that bulk load is actually in effect for Postgres? (See the SASTRACE sketch below.)

 

Finally, I get a bit confused by the log for Postgres, since it states "SQLSVR engine" - maybe a "bug" in the SAS/ACCESS to Postgres engine?
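Regarding verification, here is a sketch of one way to check, using the table names from the original post: turn on SASTRACE so the SAS log records what the engine sends to the DBMS, which should make it easier to see whether rows go through ordinary inserts or get handed to the bulk loader.

/* Log DBMS activity to the SAS log for the duration of the load. */
options sastrace=',,,d' sastraceloc=saslog nostsuffix;

data stg_pg.prueba3 (bulkload=yes);   /* placeholder load step */
    set work.prueba3;
run;

options sastrace=off;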

Data never sleeps
SASKiwi
PROC Star

You may need to try the DBCOMMIT option as well to see if that improves performance. Setting the bulk-load options for all POSTGRES queries seems a bit dangerous to me, as they are really only appropriate for large data loads. As @Tom says, working through the options with Tech Support is your best option.
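For example, something like this, where the commit size is purely an illustrative value to experiment with:

/* Commit after every 100,000 rows instead of the default,       */
/* which can reduce transaction overhead on large inserts.       */
data stg_pg.prueba3 (dbcommit=100000);
    set work.prueba3;   /* placeholder source table */
run;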

Ksharp
Super User
Also, you could try other PROCs to copy the data into PG, like:
PROC SQL
PROC COPY
PROC APPEND
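For instance, using the table names from the original post:

/* PROC APPEND inserts the rows without recreating the target table */
proc append base=stg_pg.prueba3 data=work.prueba3;
run;

/* PROC COPY moves the whole data set between libraries */
proc copy in=work out=stg_pg;
    select prueba3;
run;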
