<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Handling big datasets in SAS Programming</title>
    <link>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11487#M1077</link>
    <description>So you are running EG with a local SAS installation, with SAS/ACCESS to Oracle?&lt;BR /&gt;
If so, try to make the join happen in Oracle. Either use explicit SQL pass-through, or write your SQL join as cleanly as possible so that SAS can do an implicit pass-through.&lt;BR /&gt;
This will probably make the disk full problem go away.&lt;BR /&gt;
&lt;BR /&gt;
/Linus</description>
    <pubDate>Mon, 03 May 2010 11:58:07 GMT</pubDate>
    <dc:creator>LinusH</dc:creator>
    <dc:date>2010-05-03T11:58:07Z</dc:date>
    <item>
      <title>Handling big datasets</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11485#M1075</link>
      <description>I am creating a report with proc report that pulls data from an Oracle database using proc sql. It takes a lot of time when I run it in my local SAS, and it crashes when running in Enterprise Guide. It gives an error saying ERROR: Utility file write failed.  Probable disk full condition.&lt;BR /&gt;
 or a message that the dataset is corrupted. Some of the datasets in this join have more than 200,000 records.&lt;BR /&gt;
&lt;BR /&gt;
Once the above dataset is created by proc sql, some internal processing is required. Sometimes it crashes during those steps, saying the dataset is corrupted.&lt;BR /&gt;
&lt;BR /&gt;
Any ideas for handling such a big table? Any tips would be appreciated.&lt;BR /&gt;
Thanks in advance.&lt;BR /&gt;
&lt;BR /&gt;
Message was edited by: anandbillava</description>
      <pubDate>Wed, 28 Apr 2010 14:39:18 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11485#M1075</guid>
      <dc:creator>anandbillava</dc:creator>
      <dc:date>2010-04-28T14:39:18Z</dc:date>
    </item>
    <item>
      <title>Re: Handling big datasets</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11486#M1076</link>
      <description>You just need more disk space for UTILLOC. Search for "Utility file write failed" on support.sas.com.</description>
      <pubDate>Wed, 28 Apr 2010 16:02:37 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11486#M1076</guid>
      <dc:creator>Doc_Duke</dc:creator>
      <dc:date>2010-04-28T16:02:37Z</dc:date>
    </item>
    <item>
      <title>Re: Handling big datasets</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11487#M1077</link>
      <description>So you are running EG with a local SAS installation, with SAS/ACCESS to Oracle?&lt;BR /&gt;
If so, try to make the join happen in Oracle. Either use explicit SQL pass-through, or write your SQL join as cleanly as possible so that SAS can do an implicit pass-through.&lt;BR /&gt;
This will probably make the disk full problem go away.&lt;BR /&gt;
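A minimal explicit pass-through sketch (the connection options, schema, and table names below are placeholders, not from this thread):&lt;BR /&gt;
  proc sql;&lt;BR /&gt;
    /* hand the join to Oracle; only the result set crosses into SAS */&lt;BR /&gt;
    connect to oracle (user=myuser password=XXXX path=mydb);&lt;BR /&gt;
    create table work.report_data as&lt;BR /&gt;
    select * from connection to oracle&lt;BR /&gt;
      (select a.key_id, b.amount&lt;BR /&gt;
         from big_table_a a&lt;BR /&gt;
         join big_table_b b on a.key_id = b.key_id);&lt;BR /&gt;
    disconnect from oracle;&lt;BR /&gt;
  quit;&lt;BR /&gt;
Everything inside the parentheses is native Oracle SQL, so the join and any WHERE filtering run on the database server instead of in SAS utility files.&lt;BR /&gt;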
&lt;BR /&gt;
/Linus</description>
      <pubDate>Mon, 03 May 2010 11:58:07 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11487#M1077</guid>
      <dc:creator>LinusH</dc:creator>
      <dc:date>2010-05-03T11:58:07Z</dc:date>
    </item>
    <item>
      <title>Re: Handling big datasets</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11488#M1078</link>
      <description>anandbillava&lt;BR /&gt;
 &lt;BR /&gt;
to see what is going on under the syntax layer, use the system option SASTRACE, like:&lt;BR /&gt;
  options sastrace=',,,d' sastraceloc=FILE "~somewhere/problem1.log" nostsuffix;&lt;BR /&gt;
Additionally, on the proc sql statement, add the option _TREE.&lt;BR /&gt;
If _TREE produces a lot of detail in the SAS log, your query is probably inefficient. You want the _TREE report to show that the bulk of the query has been passed to the database server.&lt;BR /&gt;
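For example (library and table names here are placeholders), _TREE goes on the PROC SQL statement itself:&lt;BR /&gt;
  proc sql _tree;&lt;BR /&gt;
    /* _TREE prints the internal query plan tree to the SAS log */&lt;BR /&gt;
    create table work.result as&lt;BR /&gt;
    select a.key_id, b.amount&lt;BR /&gt;
      from ora.big_a a, ora.big_b b&lt;BR /&gt;
     where a.key_id = b.key_id;&lt;BR /&gt;
  quit;&lt;BR /&gt;
Reading the tree alongside the SASTRACE output shows which parts of the query were passed down to Oracle and which were processed locally by SAS.&lt;BR /&gt;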
 &lt;BR /&gt;
good luck&lt;BR /&gt;
PeterC</description>
      <pubDate>Wed, 05 May 2010 12:38:52 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Handling-big-datasets/m-p/11488#M1078</guid>
      <dc:creator>Peter_C</dc:creator>
      <dc:date>2010-05-05T12:38:52Z</dc:date>
    </item>
  </channel>
</rss>

