🔒 This topic is solved and locked.
sat_lr
Calcite | Level 5

Hi,

I am trying to import a raw data file (a flat file in TXT format) into SAS EG. The details of the file and data size are given below. I am getting an 'Insufficient Space' error, which is pasted below.

Data size - 1.05 GB

file format - TXT file

No. of Rows - 4.5 Million

No. of Columns - 46

[Screenshot: IMPORTERROR.png — the Insufficient Space error message]

I have checked my server space as well, which has 45 GB free. I tried importing into the WORK library and also into one of the permanently registered SAS libraries, but I keep getting this error that the space is insufficient. Any suggestions would be greatly appreciated. Thanks in advance.

1 ACCEPTED SOLUTION

Accepted Solutions
Kurt_Bremser
Super User

1. What DBailey said, it depends a lot on the definition of your columns.

2. Anytime you have a SAS data set with long(er) character columns that are mostly filled with blanks, use the COMPRESS=YES data set option. The compute overhead is rather small and always offset by the reduced I/O load. With long text fields, the compression ratio can exceed 95%.
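
A minimal sketch of applying COMPRESS= to the data set named later in this thread (BSL_DATA.SPSALES); the `_c` output name is a placeholder:

```sas
/* Rewrite an existing data set with compression enabled */
data bsl_data.spsales_c (compress=yes);
    set bsl_data.spsales;
run;

/* Or turn compression on for every data set created in the session */
options compress=yes;
```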

View solution in original post

10 REPLIES
DBailey
Lapis Lazuli | Level 10

When you say server space...you do mean free space and not total, right?

Some systems have user quotas so that any one user can't consume the entire disk.  Sometimes libraries map to different file systems.  Assuming your sys admin tells you there are no quotas, you might look at the path to the WORK library and get the free space available in that path. 

sat_lr
Calcite | Level 5

I even tried with WORK. The SAS server has 45 GB of space, the import consumed all of it, and it was not even 40% complete. Every SAS user in the organization has 15 GB of space. But why would the data size explode like this in the first place? At that rate the data set would reach close to 120 GB, all starting from 1 GB of data. Kindly assist.

DBailey
Lapis Lazuli | Level 10

How are you defining the columns on the import?  SAS does not have variable-length character columns, so if your incoming file has 100 records, each with one column that is only 10 characters wide (about 1,000 bytes in total), but you define the input column as char(100), the resulting SAS data set will be 10,000 bytes (100 records x 100 bytes per record).
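
To see how this scales to the poster's file, here is a hypothetical back-of-the-envelope calculation: 4.5 million rows times 46 columns, with every character column padded to a guessed width of $200, comes to roughly 39 GB before compression. The width of 200 is an assumption for illustration only:

```sas
/* Why a 1 GB text file can balloon: fixed-width storage arithmetic */
data _null_;
    rows  = 4500000;   /* rows in the file                      */
    cols  = 46;        /* columns                               */
    width = 200;       /* assumed padded character length       */
    gb    = rows * cols * width / 1024**3;
    put 'Approx. uncompressed size: ' gb 8.1 ' GB';
run;
```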

Doc_Duke
Rhodochrosite | Level 12

The space issue is with the bsl_data library, not WORK.  They could be on the same server share, but should not be.  In *nix, it is common for SysAdmins to limit the space for users or data shares.

Limit your import to a few thousand lines and see what it uses; then you can extrapolate your needs or modify your input.
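
One way to run that test, sketched with a placeholder path, delimiter, and column definitions (adjust all three to the actual file):

```sas
/* Read only the first 5000 lines to gauge the storage footprint */
data work.sample;
    infile '/path/to/raw.txt' dlm='09'x dsd truncover
           firstobs=2 obs=5000;      /* obs= caps the lines read  */
    length col1 $20 col2 8;          /* placeholder definitions   */
    input col1 $ col2;
run;

/* Observation length x row count gives the size to extrapolate */
proc contents data=work.sample;
run;
```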

DBailey
Lapis Lazuli | Level 10

I was just trying to suggest he ask SAS to let him know how much free space it saw in each of the available libraries.  I think that even if WORK is on the same filesystem, it can have different quotas.


data_null__
Jade | Level 19

Sounds like this has been mostly diagnosed, but I will add this.  I can't tell from your screenshot exactly what you are using to read the file — some sort of wizard or PROC IMPORT, perhaps.  If that is the case, I would abandon it and write a data step where you have control over how the variables are defined (no guessing) and the rest.  Add COMPRESS as suggested and you can probably get it all to work nicely.
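
A sketch of that data-step approach, combining explicit lengths with COMPRESS=. The file path, delimiter, and the three example variables are placeholders; the real file has 46 columns, each of which should be sized to its actual width:

```sas
/* Controlled import: no guessed lengths, compression on output */
data bsl_data.spsales (compress=yes);
    infile '/path/to/raw.txt' dlm='09'x dsd truncover firstobs=2;
    length cust_id 8 cust_name $40 region $10;  /* real widths here */
    input cust_id cust_name $ region $;
run;
```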

jakarman
Barite | Level 11

The server side is a Unix system.
There is no problem in importing the data; it has been put in the temporary "saswork" location.

When it should go to the real end location, it fails while writing the BSL_DATA.SPSALES data set. It is that location that has a shortage of space.

You cannot check space on Unix the same way as on Windows. The file system hierarchy can be set up with many file systems, each having its own limit.

The ulimit settings on the account (inherited from the object spawner) should not be a problem, as the data was created in WORK and is being copied to another location.

Check the location of BSL_DATA with the "df -g ." command.

You could do that, when you have XCMD access, by using a pipe and mentioning the real directory location instead of the ".".
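
With XCMD enabled, that df check can be run from inside SAS via an unnamed pipe. The directory path is a placeholder, and the df flags may need adjusting for your Unix flavor ("df -g" is AIX syntax; "df -h" is common elsewhere):

```sas
/* Run df on the library's directory and write the result to the log */
filename dfcheck pipe 'df -g /path/to/bsl_data';
data _null_;
    infile dfcheck;
    input;
    put _infile_;
run;
filename dfcheck clear;
```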

You are running with the key satya on the machine "sasdemo"; is this for an educational purpose?

---->-- ja karman --<-----
sat_lr
Calcite | Level 5

Guys, thanks very much for all your replies; they helped me fix the issue. It was the character-length issue that was expanding the size. Now I have my data set in SAS at about the same size as the source file, with no issues.

