
Hi,

I need your opinion about the issue that I am facing.

I created the SAS program (Base SAS) using SEG, and everything was okay, no issue with anything.

Now, for automation, it is launched in batch mode. When I tested it, I got an ERROR, "No disk space is available for the write operation", for a .utl file, and it happened while PROC SORT was trying to sort the table.

The table is only around 200 MB; it has just 2 columns and about 2.8 million rows.

The OS is AIX 7.1. The WORK library (I have checked) still has 50 GB of free space.

Here is the complete ERROR:

ERROR: No disk space is available for the write operation, Filename = /xxx/xxxxxx/temp/SAS_utilXXXXXX/utlXXXX.utl.

ERROR: Failure while attempting to write page 484 of sorted run 4.

ERROR: Failure while attempting to write page 16384 to utility file 1.

ERROR: Failure encountered while creating initial set of sorted runs.

ERROR: Failure encountered during external sort.

ERROR: Sort execution failure.

The program is simple:

proc sort data=source out=sort_src;
  by keyvar;
run;

This ERROR does not occur when I run it via SEG, which points to the same path for the WORK library (the path stated in the ERROR log is the same as in SEG).

I also use the same user ID to run the SAS code in batch and in SEG.

I have tried different things as well, like using PROC SQL with ORDER BY instead of PROC SORT, but it also failed to sort the table.

I even tried PROC SUMMARY, but that failed too.

I also compared the options between batch SAS and SEG (I checked MEMSIZE and SORTSIZE, although I don't think this ERROR is related to memory management, and both UTILLOC values point to WORK), and they have the same values.

I have searched the internet and can't seem to find anything about this.

I know this usually happens when there is not enough disk space for the WORK library, but that does not seem to be the case here.

Can anyone help me with this?

Thanks,

Dee

1 ACCEPTED SOLUTION

Accepted Solutions
Kurt_Bremser
Super User

As Jaap has suggested, I think the user account that runs the SAS Object Spawner has higher limits than your user. Since the spawned workspace servers inherit these limits (which I consider a massive bug, BTW; users MUST ALWAYS work under their own limits as imposed by the system administrator), this could cause the differing behaviour.

Run ulimit -f from your command line, and submit the following in EG:

filename oscmd pipe "ulimit -f";

data _null_;
  infile oscmd;
  input;
  put _infile_;
run;

If the output is different, submit ulimit -f value_shown_in_EG before running the batch job.
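As a concrete sketch of those two steps (the raise assumes your hard limit is high enough; site paths and job names are left out):

```shell
# Show the soft and hard file-size limits for the current shell.
# Units vary by shell/OS: AIX ksh reports 512-byte blocks, bash uses 1 KB.
ulimit -Sf   # soft limit: the cap the batch session actually runs under
ulimit -Hf   # hard limit: the ceiling a non-root user may raise it to

# Raise the soft limit to the hard ceiling before launching the batch job
# (this fails if the hard limit itself is too low; then only root can help).
ulimit -f "$(ulimit -Hf)"
ulimit -f    # confirm the new effective limit
```

If the soft limit in the batch shell is lower than what EG reports, this raise is all the batch wrapper script needs before invoking SAS.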

View solution in original post

15 REPLIES
SASKiwi
PROC Star

There has to be something different about your SAS batch environment compared to your EG SAS environment.

I suggest you run PROC OPTIONS in both batch and EG and double check your UTILLOC and WORK locations just to be sure.

deasysn_gmail_com
Calcite | Level 5

Hi,

Yes, I have double checked :

Yes, I have double-checked: I captured the options by copying SASHELP.VOPTION to another table while running in batch and in SEG (please tell me if I am doing this wrong), then compared them. Here is the comparison:

UTILLOC (Portable)
  AIX batch: WORK
  SEG:       WORK

WORK (Host)
  AIX batch: /AA/BBBBB/temp/SAS_work0F45009500D2_a01ssaseapp1a
  SEG:       /AA/BBBBB/temp/SAS_work666700950056_a01ssaseapp1a/SAS_work7F8D00950056_a01ssaseapp1a

WORK (Portable)
  AIX batch: /AA/BBBBB/temp/SAS_work0F45009500D2_a01ssaseapp1a
  SEG:       /AA/BBBBB/temp/SAS_work666700950056_a01ssaseapp1a/SAS_work7F8D00950056_a01ssaseapp1a

The paths are slightly different: in SEG it creates another temporary folder (a nested temporary folder), but the main folder is the same, /AA/BBBBB/temp/.

Any other suggestion ?

Thanks.

AndrewHowell
Moderator

Are you running as the same user whether BATCH or INTERACTIVE? (Just wondering if different users have different rights?)

Kurt_Bremser
Super User

Another possible cause is your soft file size limit. Maybe the Workspace Server start script resets it to a higher value with ulimit -f.

To get to the bottom of the error, you need to locate all your configuration sources for the Workspace Server and for SAS running in batch mode. This includes the shell scripts used to start SAS (and all their includes in the configuration tree), the sasv9.cfg and sasv9_usermods.cfg files, and the autoexec.sas and autoexec_usermods files. Somewhere in there will be a difference that causes it.

You could also try a brute force approach by monitoring disk usage (df -k) while the batch job is executing.
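For instance (a minimal sketch; replace /tmp with the real WORK mount point, such as the /AA/BBBBB/temp path from the log):

```shell
# Sample free space in the WORK filesystem a few times while the sort runs;
# watching whether the Free column actually shrinks to zero tells you whether
# disk space or a per-user file-size limit is the real culprit.
for i in 1 2 3; do
  df -k /tmp | tail -1
  sleep 1
done
```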

deasysn_gmail_com
Calcite | Level 5

Hi Kurt,

I checked all of the cfg files: in the SAS app folder, the WorkspaceServer folder, and the dbsc folder (this one is for SEG), and I don't see any setting that would limit memory size. The additional cfg for the SEG part only includes an autoexec script, which only assigns libnames.

(I checked all the cfg files based on the CONFIG option in the VOPTION table.)

The only cfg that mentions any memory or limit setting is the main cfg in the SAS Foundation folder (sasv9.cfg), which states:

-memsize 2G

-sortsize 256M

Best Regards,

Deasy

Kurt_Bremser
Super User

As Jaap has suggested, I think the user account that runs the SAS Object Spawner has higher limits than your user. Since the spawned workspace servers inherit these limits (which I consider a massive bug, BTW; users MUST ALWAYS work under their own limits as imposed by the system administrator), this could cause the differing behaviour.

Run ulimit -f from your command line, and submit the following in EG:

filename oscmd pipe "ulimit -f";

data _null_;
  infile oscmd;
  input;
  put _infile_;
run;

If the output is different, submit ulimit -f value_shown_in_EG before running the batch job.

deasysn_gmail_com
Calcite | Level 5

Hi Kurt_Bremser,

I ran your code: it outputs "unlimited" in the SEG log, but only 2097151 when I run "ulimit -f" from the command line.
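For reference, AIX expresses the fsize limit in 512-byte blocks, so 2097151 works out to roughly 1 GB, far below "unlimited":

```shell
# Convert the AIX fsize value (512-byte blocks) into bytes and megabytes.
echo $((2097151 * 512))    # bytes in the batch session's file-size cap
echo $((2097151 / 2048))   # the same cap in MB: about 1023 MB, i.e. ~1 GB
```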

I think that's the reason I don't encounter any issue in SEG.

Thank you so much for this. I will inform the admin team.

Kurt_Bremser
Super User

You may be able to fix this yourself. Do

ulimit -f unlimited

before running the batch job.

You see, these limits have a soft and a hard value. After logging in, you operate under the soft limit, but the user can raise the limit herself, up to the hard limit (which only the superuser can change). As the hard limit is usually set to unlimited, you may be able to get out of your quandary without "hassling" the system admin(s).
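A quick illustration of that soft/hard distinction (run in subshells so your own session's limits stay untouched; bash reports -f in 1 KB units, AIX ksh in 512-byte blocks):

```shell
ulimit -Hf                                        # the hard ceiling
( ulimit -Sf 512; ulimit -Sf )                    # lowering the soft limit: always allowed
( ulimit -Sf 512; ulimit -Sf 1024; ulimit -Sf )   # raising it back up: also allowed,
                                                  # but never above the hard ceiling
```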

But be advised that other things can get in your way, like user quotas etc. (which a seasoned system admin will have in place).

deasysn_gmail_com
Calcite | Level 5

Hi Kurt_Bremser,

Yes, I have tried to change it to unlimited and it did not allow me. It gives this message: "The specific value is outside the user's allowable range".

So I think I have to ask the system admin.

Thanks, Kurt.

jakarman
Barite | Level 11

When soft file size limits are in question, remember that with SEG (Enterprise Guide) the settings of the user you log in with are not used. Instead, the settings of the object spawner's account (sassrv) are inherited.

If you want to set managed limits, you have to do that in a script. It sounds weird and confusing, but I have seen this happen. Only X-command usage from Enterprise Guide will show you the real effective settings (use piping).

As you can see from the physical SASWORK naming, it is nested: it creates a SASWORK inside a SASWORK, and it can do that many times. Since 5.1, SEG has supported parallel processing.

One part of that physical name is the PID number; as the PID is unique among running processes, there can be no conflict, but it is variable. There must also be something identifying the system and user in there.

You are saying:

- You have a 50 GB SASWORK. The version of SEG is less relevant than the SAS version that runs all the processes: 9.3? 9.4?

- You mentioned the size of the involved dataset (200 MB) but not the MEMSIZE/SORTSIZE settings. With that dataset size, the sort could have been done in memory with MEMSIZE/SORTSIZE settings around 2 GB. See the SAS(R) 9.4 Companion for UNIX Environments, Fourth Edition (default 1 GB); your physical naming convention is Unix.

- After 16384 pages it halts. Your buffer size must be really small if that is still under 200 MB. The sort usually requires about 3x the file size, which could approach 1 GB.

Size should not be the issue (50 GB with plenty free, as you checked). It could be that weird difference in user-limit settings.

Monitoring disk usage while the job runs could also give a hint.

There is a lot about the UTILLOC setting in SAS(R) 9.4 System Options: Reference, Third Edition.

---->-- ja karman --<-----
Kurt_Bremser
Super User

With only two columns, I suspect that SAS uses a page size of 16384 bytes. Multiplied by 16384 pages, that corresponds to a 256 MB file size at the moment of the crash.

Running ulimit -f in the same environment where the batch job is started should clarify this.

deasysn_gmail_com
Calcite | Level 5

Hi Jaap,

The SAS version is 9.2.

I did check the space changes while it runs; although it's quite fast, it seems to create util files of almost 2 GB.

I have also tried setting SORTSIZE from 512 MB up to 3 GB, but it still gives the same error.

BUFSIZE, when I check both SAS Foundation and SEG, has the value 0, which is the SAS default; I haven't tested putting a different number in BUFSIZE.

I also tried redirecting the utility files with UTILLOC, but that failed as well.

You mention that in SEG the settings of the user login are not used, and that the object spawner's settings (sassrv) are inherited instead. Maybe that's the problem. Kurt_Bremser also mentioned ulimit for the user. I think I need to check this with the admin team, too.

Thanks. I will keep testing.

OS2Rules
Obsidian | Level 7

Hi:

Our server admin has limited us to only 200 MB of "C:" drive each, and I have a file approaching 1 GB to sort. I found that adding TAGSORT helps by reducing the space needed for the sort, although at the expense of longer run times.

Another thing we do is allocate a USER library on a drive with plenty of space, so that all WORK tables created are then stored in that location. Either that, or use two-level names for your tables and allocate them to a drive with lots of space.
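As a sketch of that TAGSORT variant (dataset and variable names are taken from the original post; the heredoc just writes the program out so you can inspect it):

```shell
# TAGSORT sorts only the BY keys plus a record tag, so the utility files stay
# much smaller; the trade-off is a slower sort, since the full records are
# retrieved from the input dataset afterwards.
cat > /tmp/tagsort_sketch.sas <<'EOF'
proc sort data=source out=sort_src tagsort;
  by keyvar;
run;
EOF
cat /tmp/tagsort_sketch.sas
```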

