deleted_user
Not applicable
Hi, I have maxed out my -MEMSIZE 2000m (2 GB RAM) setting on my PC and am still getting out-of-resources error messages for some major PROC SQL joins. I'm doing Cartesian joins on 38,000 records (x 38,000, so yes, the resulting set is HUGE). I will be setting the CLEANUP option so that resources are cleaned up as the job runs.
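
For reference, this is roughly how I have things set up now (the install path below is just a placeholder):

/* invocation option (goes in the SAS shortcut target or the config file);
   the path is just a placeholder for my install: */
/*   "C:\Program Files\SAS\V8\sas.exe" -memsize 2000m   */

options cleanup;   /* ask SAS to try to recover resources instead of stopping with out-of-resources */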

My main question is: how do I even go about getting a dedicated server (UNIX) for my home business? I checked out Dell servers, some with 8 GB of RAM, but is that enough to crunch these kinds of numbers? I deal with a LOT of data and need maximum processing power. I'm not sure I want to spend the $ on the SAS server either (the scalable performance server).

What are my options? Should I bite the bullet and get my own UNIX server? Has anyone done this for their own small business?

-Christy 703-304-9360 cell #
cwarner@autoaudit.com
LinusH
Tourmaline | Level 20
How large are your input tables (in bytes)? Exactly what kind of error messages are you getting?
I believe this might have to do with disk space rather than RAM (even though a lot of RAM will ease the I/O burden). If you want a larger server with substantially more RAM, UNIX is the most reliable option, but I think a 64-bit Windows server will do as well.

SPD Server is in many ways a brilliant product, but in the area of Cartesian product joins, I don't think SPDS will do anything different from Base SAS SQL; there's nothing really to optimize in a true Cartesian product join (I think...).

Regards,
Linus
Data never sleeps
deleted_user
Not applicable
Hi! Thanks for the reply.

Here is my error message. First, a pop-up box comes up saying "OUT OF RESOURCES" (disk full). But I'm pointing OUTLIB (my permanent library) to my external hard drive, which has 350+ GB, so I doubt that it is running out of space...
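
The library assignment itself is nothing fancy; roughly like this (the drive letter and folder are placeholders for my external drive):

libname outlib "F:\sas_output";   /* F: = the 350+ GB external hard drive (placeholder path) */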

here is the msg:
ERROR: Write to OUTLIB.SUBSET.DATA failed. File is full and may be damaged.
ERROR: Insufficient space in file OUTLIB.SUBSET.DATA.
NOTE: Compressing data set OUTLIB.SUBSET decreased size by 98.44 percent.
Compressed is 262143 pages; un-compressed would require 16820629 pages.
ERROR: Write to OUTLIB.SUBSET.DATA failed. File is full and may be damaged.

thanks for any help!

I'm running Base SAS 8.2 on a PC laptop with 2 GB of RAM on Windows Vista.
deleted_user
Not applicable
here is my original SAS code:

proc sql;
    create table OUTLIB.SUBSET as
    select a.SCRUBBED_NAME    as name1,
           b.SCRUBBED_NAME    as name2,
           a.SCRUBBED_ADDRESS as address1,
           b.SCRUBBED_ADDRESS as address2,
           a.id as id1,
           b.id as id2
    from subset a, subset b
    where A.ID LT B.ID and
          A.NEWCITY = B.NEWCITY and
          A.REGION = B.REGION;
quit;

So I only have six columns in this table to cut down on disk usage: name1, name2, address1, address2, id1, and id2.

I'm doing complex address matching: I'm matching each word in the address of one record against each address in the same table (a self-join) where the CITY and STATE (REGION) are equal.
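
The word-by-word comparison afterwards is roughly along these lines (a simplified sketch, not my exact code; the data set and variable names follow the query above, and the output name is made up):

data word_matches;
    set outlib.subset;
    length word $40;
    match_count = 0;
    i = 1;
    word = scan(address1, i, ' ');
    do while (word ne ' ');
        /* count words from address1 that also appear as whole words in address2 */
        if indexw(upcase(address2), upcase(word)) > 0 then match_count + 1;
        i + 1;
        word = scan(address1, i, ' ');
    end;
    if match_count > 0;   /* keep only pairs with at least one word in common */
    drop i word;
run;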
deleted_user
Not applicable
Is there a size limit on a SAS data set that is unrelated to how much hard drive space I have? Could I actually be hitting 300 GB for this joined data set that I'm creating? Maybe I could break it down and try it with 1/10 of the observations and see what the size is; that would tell me, I guess...
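
Something like this sketch, maybe, just putting OBS= on the input to take roughly a tenth of the rows (3,800 of the 38,000):

proc sql;
    create table outlib.subset_test as
    select a.scrubbed_name    as name1,
           b.scrubbed_name    as name2,
           a.scrubbed_address as address1,
           b.scrubbed_address as address2,
           a.id as id1,
           b.id as id2
    from subset(obs=3800) a, subset(obs=3800) b
    where a.id lt b.id and
          a.newcity = b.newcity and
          a.region = b.region;
quit;

Then I could check the size of subset_test.sas7bdat in the OUTLIB folder. Since a Cartesian join grows roughly with the square of the row count, the full run could be on the order of 100 times that size (just a rough estimate, since the first 3,800 rows may not be representative).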

thanks!
LinusH
Tourmaline | Level 20
The size limit for SAS tables is not the issue here; they can be at least 2.1 TB on NTFS (if you are using FAT16/32, it might be a problem, see http://support.sas.com/kb/8/213.html).

Yes, I think it's a good idea to run the query again with a smaller portion of the data. While it's running, open Windows Explorer on the directory that you use for OUTLIB, and you can easily watch the table grow.

Also check that you don't run out of disk space for your saswork.
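
A quick way to see where WORK currently points (and an example of redirecting it at start-up; the D: path is just an illustration):

proc options option=work;
run;

/* or redirect WORK when SAS starts, e.g. in the shortcut target or config file: */
/*   sas.exe -work "D:\saswork"                                                  */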

/Linus
Data never sleeps
1162
Calcite | Level 5
According to your error message, you are running out of disk space, but I bet it has nothing to do with your external hard-drive.

Although you're pointing to OUTLIB as your destination, SAS doesn't actually write the file to that destination until processing is complete. While SAS is processing, a temporary file is created, and this temporary file is probably going onto your local C: drive.

Try looking in Documents & Settings\ ... \ Local Settings \ Temp \ SAS Temporary Files. This is probably where you'll see a temporary file being generated while SAS is processing.

As for a solution, I think there are ways to point these temporary files to another location. Another option might be to break the job into 'chunks', process each chunk, and then append them all together at the end.
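
A rough sketch of the 'chunk' idea, assuming REGION is a workable splitting variable (the macro name and the region codes at the bottom are made up; the query itself is copied from earlier in the thread):

%macro join_by_region(regions);
    %local i reg;
    %let i = 1;
    %let reg = %scan(&regions, &i, %str( ));
    %do %while(%length(&reg) > 0);

        /* Cartesian self-join restricted to one region */
        proc sql;
            create table chunk as
            select a.scrubbed_name    as name1,
                   b.scrubbed_name    as name2,
                   a.scrubbed_address as address1,
                   b.scrubbed_address as address2,
                   a.id as id1,
                   b.id as id2
            from subset a, subset b
            where a.id lt b.id and
                  a.newcity = b.newcity and
                  a.region = b.region and
                  a.region = "&reg";
        quit;

        /* add this region's pairs to the running output table */
        proc append base=outlib.subset_chunked data=chunk;
        run;

        %let i = %eval(&i + 1);
        %let reg = %scan(&regions, &i, %str( ));
    %end;
%mend join_by_region;

%join_by_region(VA MD DC);   /* region codes are placeholders */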
