thanksforhelp12
Calcite | Level 5

I keep getting this error while attempting to load a large dataset (.sas7bdat) file into the workspace (Library "work" with table name "ACS") so that I can work with it.

 

No idea how to fix or why this is happening, so I haven't tried much other than clearing my work library and restarting the SAS virtual machine.

 

I have 200 GB free. I searched around and found many posts about this happening with SQL, but as far as I am aware, I am not doing anything with SQL.

 

Thanks for the help.

16 REPLIES
Reeza
Super User

@thanksforhelp12 wrote:

I keep getting this error while attempting to load a large dataset (.sas7bdat) file into the workspace (Library "work" with table name "ACS") so that I can work with it.

 

No idea how to fix or why this is happening, so I haven't tried much other than clearing my work library and restarting the SAS virtual machine.

 

I have 200 GB free. I searched around and found many posts about this happening with SQL, but as far as I am aware, I am not doing anything with SQL.

 

Thanks for the help.


Are you using SAS University edition? Or a custom installation?

 

A custom installation only has limits based on what your administrator has set.

 

If you're using SAS UE, there are some limits based on the machine. What do you have RAM and/or cores set to? How big is your 'big' data set? Have you set the shared folders to your main hard drive, and not a cloud drive with a limit?

 

thanksforhelp12
Calcite | Level 5

Thank you for your response and willingness to help. I am indeed using University Edition with shared folder on my local drive.

 

The dataset is about 1,000,000 rows by 333 columns.

 

Good thought on the RAM and cores. My actual computer is 8 GB/4 cores, but I have not looked into the SAS virtual machine settings. I believe the RAM may have been set at 1024 MB. Do you think altering these settings would help the issue?

Reeza
Super User
Oh, that's not big at all. Have you redirected your WORK library as I suggested in other posts?
thanksforhelp12
Calcite | Level 5

I have not. I tried to search your posts, is this what you are referring to? https://communities.sas.com/t5/SAS-Analytics-U/How-to-clear-up-space-in-WORK-folder/m-p/336740?

 

If so, any further tips/clarification is much appreciated.

Reeza
Super User

1. Change the RAM setting in the virtual machine. I recommend 4 GB, but it depends on your system, of course. The SAS minimum is 1 GB, I believe, but 2 GB seems more reasonable IMO.

2. Check the number of cores allotted in the virtual machine; the limit is 2 cores.

3. Redirect your WORK library to outside of the VM; see instructions below.

 

You can redirect your work library to a different folder using a USER libname. When a USER library is available, SAS defaults data sets there instead of WORK. This does mean that there is no cleanup of temporary data sets at shutdown, though, so that becomes your responsibility. I usually create a temp folder, keep everything there, and only use it on projects with size issues.

 

 

libname user '/folders/myfolders/temp';

If your system has had several issues with large data sets, or crashed while working on large data sets, SAS may not have cleaned up all the files, and this can cause problems with the VM. If this happens, your best bet is to just delete the VM and reinstall it. DO NOT DELETE your MYFOLDERS folder. You can reconnect your new installation to it and be fine.
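Since SAS will not clean up a USER library automatically, one way to handle the manual cleanup is to clear the library at the end of a session with PROC DATASETS. A minimal sketch, assuming the USER libname shown above has been assigned:

/* Delete every data set in the USER library; KILL removes all
   members without prompting, NOLIST suppresses the directory listing */
proc datasets library=user kill nolist;
quit;

Be careful: KILL deletes everything in the library, so only point it at a folder that holds disposable temporary data sets.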

 


@thanksforhelp12 wrote:

I have not. I tried to search your posts, is this what you are referring to? https://communities.sas.com/t5/SAS-Analytics-U/How-to-clear-up-space-in-WORK-folder/m-p/336740?

 

If so, any further tips/clarification is much appreciated.


 

 

thanksforhelp12
Calcite | Level 5

Thanks for the continued help.

 

My command where I load in the data is currently:

data work.ACS;
  set '/folders/myfolders/acs_nsqip_puf12.sas7bdat';
run;

For the above trick to work, do I need to remove the "work." piece?

 

Edit: I tried that and got this response in the log:

 

ERROR: Insufficient space in file WORK.'SASTMP-000000047'n.ITEMSTOR.
ERROR: Fatal ODS error has occurred. Unable to continue processing this output destination.
WARNING: No body file. RTF output will not be created.
ERROR: Out of space writing to file /tmp/SAS_work6DED0000097C_localhost.localdomain/#LN00076.
 
An alternative, potentially, would be if there is some way I can start filtering data from the .sas7bdat file without ever loading it in.

 

Thanks.
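Regarding filtering before loading: a WHERE= data set option on the SET statement applies the subset as the file is read, so only the matching rows are written to the output data set. A sketch, where the variable name AGE is purely hypothetical (substitute a variable that exists in your file):

data work.acs_subset;
  set '/folders/myfolders/acs_nsqip_puf12.sas7bdat' (where=(age > 65));
run;

SAS still reads through the source file, but the copy written to the library contains only the filtered rows, which keeps WORK (or USER) space usage down.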
 

thanksforhelp12
Calcite | Level 5

Update 2: I installed a new version of the VM...

 

even though I am only trying to load a subset of the .sas7bdat file into the workspace, it seems to load the whole thing into the WC000001 library no matter what. These then seemed to build up, even when I cleared them. Is there any way to avoid this?


Thanks!

Reeza
Super User
Add obs= to read the file in portions? What's the actual size? I would expect it to be under 1 GB, so with the other changes it should work.

data want1;
  set yourdata(obs=1000);
run;

data want2;
  set yourdata(firstobs=1001 obs=2000);
run;
Vince_SAS
Rhodochrosite | Level 12

@thanksforhelp12 wrote:

Edit: I tried that and got this response in the log:

 

ERROR: Insufficient space in file WORK.'SASTMP-000000047'n.ITEMSTOR.
ERROR: Fatal ODS error has occurred. Unable to continue processing this output destination.
WARNING: No body file. RTF output will not be created.
ERROR: Out of space writing to file /tmp/SAS_work6DED0000097C_localhost.localdomain/#LN00076.

 


This error indicates that ODS fails to create an RTF file.  Is it your intention to create an RTF file?  If not, then turn off that option (and also the option to create PDF, if that is also on).

 

Vince DelGobbo

SAS R&D

Tom
Super User Tom
Super User

Why are you copying the dataset into the WORK library?

Why not just run whatever analysis you need on the dataset where it is?
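For instance, you can assign a libname to the folder holding the file and point procedures at it directly, so nothing is ever copied into WORK. A sketch; PROC MEANS here is just a stand-in for whatever analysis you actually need:

libname mydata '/folders/myfolders';

/* Analyze the permanent data set in place; no copy is made */
proc means data=mydata.acs_nsqip_puf12;
  var _numeric_;  /* summarize all numeric variables */
run;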

thanksforhelp12
Calcite | Level 5

How would one do this? This would certainly be ideal. Even when I run a WHERE statement after the SET statement above to select just a small subset of the large data file (which is 3 GB+, for those asking), it seems to load the entire file into the WC000001 library. Is there a better way to load the data?

 

Thanks.

Tom
Super User Tom
Super User

All the data step you posted does is make a duplicate copy of the data. It is not DOING anything with it.

What do you want to DO with the data?

Do you want to run a regression?

proc reg data='/folders/myfolders/acs_nsqip_puf12.sas7bdat';
  model response=input1 input2;
run;

 

Vince_SAS
Rhodochrosite | Level 12

I don't understand how this code shown earlier:

 

data work.ACS;
  set '/folders/myfolders/acs_nsqip_puf12.sas7bdat';
run;

 

Is causing this error message, also shown earlier:

 

ERROR: Insufficient space in file WORK.'SASTMP-000000047'n.ITEMSTOR.
ERROR: Fatal ODS error has occurred. Unable to continue processing this output destination.
WARNING: No body file. RTF output will not be created.

 

Is there more code than what has been already posted?

 

Vince DelGobbo

SAS R&D

thanksforhelp12
Calcite | Level 5

I just reviewed my file and I have absolutely nothing referencing RTF anywhere in it. Nonetheless, it is possible there may have been something there before.
