DATA Step, Macro, Functions and more

Handling big datasets

Contributor
Posts: 72

Handling big datasets

I am creating a report using PROC REPORT that pulls data from an Oracle database using PROC SQL. It takes a lot of time when I run it in my local SAS session, and it crashes when run in Enterprise Guide with the error: ERROR: Utility file write failed. Probable disk full condition.
or with a message saying the dataset is corrupted. Some of the datasets in this join have more than 200,000 records.

Once the above dataset is created by PROC SQL, some internal processing is required. Sometimes it crashes during those steps, saying the dataset is corrupted.

Any ideas about handling such a big table? Any tips would be appreciated.
Thanks in advance.

Message was edited by: anandbillava

Trusted Advisor
Posts: 2,113

Re: Handling big datasets

You probably just need more disk space in the UTILLOC location. Search for UTILLOC on support.sas.com.
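For example, UTILLOC can be pointed at a disk with more free space when SAS is invoked, either on the command line or in the configuration file (the path below is a placeholder):

   /* In sasv9.cfg -- replace the path with a volume that has free space */
   -UTILLOC "/bigdisk/sasutil"

   /* Or at invocation */
   sas -utilloc "/bigdisk/sasutil"

PROC SQL writes its sort/join utility files there, so a nearly full default temp volume is a common cause of the "Utility file write failed" error.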
Super User
Posts: 5,260

Re: Handling big datasets

So you are running EG against a local SAS installation, with SAS/ACCESS to Oracle?
If so, try to make the join happen in Oracle. Use either explicit SQL pass-through, or keep your SQL join as clean as possible so that SAS can do an implicit pass-through.
This will probably make the disk-full problem go away.
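A minimal sketch of explicit pass-through (the connection options, table names, and columns below are placeholders, not from the original post):

   proc sql;
      connect to oracle (user=XXXX password=XXXX path="mydb");
      create table work.report_data as
      select * from connection to oracle
         (  /* everything inside these parentheses runs in Oracle,
               so only the joined result comes back to SAS */
            select a.cust_id, a.amount, b.region
            from sales a
            join customers b on a.cust_id = b.cust_id
         );
      disconnect from oracle;
   quit;

Because the join executes on the Oracle server, SAS never needs utility file space for the intermediate tables.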

/Linus
Data never sleeps
Valued Guide
Posts: 2,175

Re: Handling big datasets

anandbillava,

To see what is going on under the syntax layer, use the system option SASTRACE, like:
options sastrace=',,,d' sastraceloc=FILE "~somewhere/problem1.log" nostsuffix;
Additionally, add the option _TREE on the PROC SQL statement.
If _TREE produces a lot of detail in the SAS log, your query is probably inefficient. You want the _TREE report to show that the bulk of the query has been passed to the database server.
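Putting the two together might look like this (the libref and table names are placeholders):

   options sastrace=',,,d' sastraceloc=saslog nostsuffix;

   proc sql _tree;   /* _TREE prints PROC SQL's internal query plan to the log */
      create table work.joined as
      select a.*, b.region
      from ora.sales a, ora.customers b
      where a.cust_id = b.cust_id;
   quit;

The SASTRACE output shows the SQL actually sent to Oracle; if the trace shows SAS fetching whole tables and joining locally, the implicit pass-through did not happen.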

good luck
PeterC