Rick_SAS
SAS Super FREQ
Member since
06-23-2011
- 7,341 Posts
- 6,390 Likes Given
- 1,226 Solutions
- 4,159 Likes Received
About
Rick Wicklin, PhD, is a distinguished researcher in computational statistics at SAS and is a principal developer of SAS/IML software. His areas of expertise include computational statistics, statistical graphics, simulation, and modern methods in statistical data analysis. Rick is author of the books _Statistical Programming with SAS/IML Software_ and _Simulating Data with SAS_.
SAS Programming, matrix computations, statistical graphics, statistical simulation, modern data analysis. Rick has received multiple awards for his participation on SAS discussion forums.
Activity Feed for Rick_SAS
- Liked Re: How to Reg on each row?! with Slope/Intercept saved out?! for Ksharp. Tuesday
- Liked Re: How to Reg on each row?! with Slope/Intercept saved out?! for jiltao. Tuesday
- Liked Re: proc command for Ksharp. Tuesday
- Liked Re: proc command for Tom. Tuesday
- Liked Re: MMRM Proc Mixed: for jiltao. Tuesday
- Got a Like for Re: Burr distribution CDF. a week ago
- Posted Re: Burr distribution CDF on Statistical Procedures. a week ago
- Posted Re: I want to write a SAS/IML module that invokes ODS statements at execution time, not compilation on SAS Programming. 2 weeks ago
- Posted Re: I want to write a SAS/IML module that invokes ODS statements at execution time, not compilation on SAS Programming. 3 weeks ago
- Liked Creating Synthetic Data using PROC SMOTE, PROC SIMNORMAL and PROC TABULARGAN for gsvolba. 3 weeks ago
- Liked Re: Visualizing associations between categorical variables for Ksharp. 4 weeks ago
- Liked Re: Association between nominal variable and ordinal variable for StatDave. 4 weeks ago
- Got a Like for Re: Overwriting inside a macro loop leads to Out of memory for symbols ERROR. a month ago
- Got a Like for Re: Overwriting inside a macro loop leads to Out of memory for symbols ERROR. a month ago
- Got a Like for Re: Overwriting inside a macro loop leads to Out of memory for symbols ERROR. a month ago
- Got a Like for Re: Overwriting inside a macro loop leads to Out of memory for symbols ERROR. a month ago
- Got a Like for Re: Overwriting inside a macro loop leads to Out of memory for symbols ERROR. a month ago
- Posted Re: Overwriting inside a macro loop leads to Out of memory for symbols ERROR on SAS/IML Software and Matrix Computations. a month ago
- Got a Like for Re: Overwriting inside a macro loop leads to Out of memory for symbols ERROR. a month ago
- Got a Like for Re: Overwriting inside a macro loop leads to Out of memory for symbols ERROR. a month ago
a week ago
2 Likes
I am updating this old thread so that others who search for this information can find it. The following articles discuss using the Burr XII distribution in SAS:
Implement the Burr distribution in SAS
Fit the Burr distribution in SAS
2 weeks ago
It looks like you intend this functionality to be part of a library that others will use.
If so, I encourage you to avoid using ODS _ALL_ CLOSE, which will wreak havoc with your users' open ODS destinations. You should use ODS PDF CLOSE only. If you want to exclude the output from other open ODS destinations, use ODS EXCLUDE ALL prior to writing the PDF file, followed by ODS EXCLUDE NONE when you are ready to resume normal output. Or, if you know the name of the output table(s), you can use ODS EXCLUDE <nameoftable>.
For more details about why ODS _ALL_ CLOSE is a bad programming practice in a public library, see Five reasons to use ODS EXCLUDE to suppress SAS output - The DO Loop.
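As a minimal sketch of that pattern, following the ordering described above (the output path and the analysis step are placeholders, not from the original thread):

```sas
/* temporarily suppress output to the user's open ODS destinations */
ods exclude all;

/* open ONLY the PDF destination and write the report to it */
ods pdf file="report.pdf";       /* hypothetical output path */
proc means data=sashelp.class;   /* any analysis goes here */
run;
ods pdf close;                   /* close only the PDF, not _ALL_ */

/* resume normal output in the user's other destinations */
ods exclude none;
```

The key point is that the library code opens and closes only the destinations it created, and restores the selection state when it is done.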
3 weeks ago
I think the source of your confusion is that ODS statements are not IML statements.
The ODS statements are GLOBAL statements in SAS. They are similar to TITLE, FOOTNOTE, LIBNAME, FILENAME, etc.
Global statements are never seen by PROC IML! When SAS parses a program, global statements are sent to other parts of SAS to execute. This is true in every procedure. So, for example, in PROC REG, the TITLE and ODS statements are never seen by PROC REG. Those statements are passed to something called the "supervisor" which handles them appropriately.
You might want to read the article "Calling a global statement inside a loop," which discusses this issue. The article discusses using a global statement inside a loop, but the concept is similar for a global statement inside a function. IML compiles and stores only IML statements; global statements are stripped out and never seen by IML.
The article shows an example that uses CALL EXECUTE, which you are already using.
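For instance, here is a sketch of the CALL EXECUTE approach inside a module (the module name and title text are hypothetical):

```sas
proc iml;
start PrintWithTitle(x);
   /* TITLE is a global statement, so it is stripped out when the
      module is compiled. CALL EXECUTE submits the statement to SAS
      at run time instead. */
   call execute('title "Output from PrintWithTitle";');
   print x;
finish;

x = {1 2 3};
run PrintWithTitle(x);
call execute('title;');   /* clear the title when done */
quit;
```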
a month ago
11 Likes
I believe what you are seeing is that parsing and compiling a SAS program requires memory.
Recall that a macro call is a PREPROCESSOR. It generates text, which is usually SAS statements. In the '%macro loop' example, the macro generates a program that contains 300 MILLION statements. It doesn't matter what the statements are; what matters is that there are 300 million statements.
This huge program is then sent to PROC IML for parsing and execution. The parsing is performed in C. Parsing the program results in a (huge!) C structure that represents the program. That structure is then sent to the procedure for execution. If the amount of memory exceeds the MEMSIZE option in your SAS session, SAS will report an out-of-memory error, which is what is happening in your example.
Your example isn't limited to PROC IML. You can see the same behavior in the DATA step if you use a macro to generate a program that contains hundreds of millions of lines. I ran the following macro code to generate huge programs in both the DATA step and PROC IML.
options fullstimer;
options nosource;
%macro IMLtest(numStmts);
%put Test IML: numStmts=%sysfunc(putn(&numStmts,comma9.));
proc iml;
a = 1;
%do i=1 %to &numStmts;
b = a;
%end;
quit;
%mend;
%put ----- START IML TESTS -----;
%IMLtest(10000);
%IMLtest(100000);
%IMLtest(500000);
%IMLtest(1000000);
%macro DStest(numStmts);
%put Test DATA Step: numStmts=%sysfunc(putn(&numStmts,comma9.));
data _NULL_;
a = 1;
%do i=1 %to &numStmts;
b = a;
%end;
run;
%mend;
%put ----- START DATA STEP TESTS -----;
%DStest(10000);
%DStest(100000);
%DStest(500000);
%DStest(1000000);
To make the programs comparable, I use 'a=1' in IML and in the DATA step. If 'a' is a vector in IML, then IML will use a little more memory, but it is really the length of the program (known at compile time) that is causing this issue, not the run-time memory.
On my PC version of SAS, the memory usage for each PROC is as follows:
data VizMemory;
length Proc $10;
input Proc numStmts Memory;
label numStmts="Number of Statements" Memory="Memory (k)";
format numStmts Memory comma10.;
datalines;
DATA_STEP 10000 3907
DATA_STEP 100000 37787
DATA_STEP 500000 180044
DATA_STEP 1000000 359641
IML 1000 254
IML 10000 1428
IML 100000 12698
IML 500000 62705
IML 1000000 125271
;
title "Memory Usage to Parse and Compile SAS Programs";
proc sgplot data=VizMemory;
series x=numStmts y=Memory / group=Proc markers;
xaxis grid;
yaxis grid;
run;
The memory usage scales linearly with the number of SAS statements in the programs. The DATA step actually uses more memory than PROC IML to parse and execute a similar program.
So, in conclusion, you are seeing the result of generating a program that is hundreds of millions of lines long. Those text statements are parsed, which converts the text to a C structure, which requires considerable memory. (Think about a linked list that has 300 million items, and each item has information about the statement and the symbols.) My advice is to use the loops and conditional logic in IML to write your program, rather than using the macro preprocessor to generate repeated statements.
a month ago
What version of SAS are you using? Run %put &=SYSVLONG; and post the result from the Log.
a month ago
If you've solved the problem, please select a correct answer and close the thread. If you still need help, please update the thread.
01-05-2026 03:27 PM
1 Like
There are several ways to organize a library of functions. Back in 2013, I wrote the article "How to create a library of functions in PROC IML" on The DO Loop. I am going to guess that you have dozens of physical files, which I will call Def1.sas, Def2.sas, Def3.sas, etc. The two main ways to organize the code are:
1. Each file starts with PROC IML, defines the functions, uses the STORE statement to save them, and then QUITs. In this organization, you would %include the relevant files OUTSIDE of your main program, then use LOAD to load the modules inside the main program.
2. No file contains PROC IML or QUIT, only the module definitions and a STORE statement. To use the functions, you can %include them directly into a program *OR* you can create a driver program that starts with PROC IML, includes the definitions, and then does one QUIT.
I tend to use the second method because I can work on and test components independently. So, here's the way I usually do it:
/* ------ Def1.sas FILE ----- */
start MyMod1(x); ... finish;
start MyMod2(x); ... finish;
start MyMod3(x); ... finish;
store module = (
MyMod1
MyMod2
MyMod3
);
/* ------ Def2.sas FILE ----- */
start MyMod4(x); ... finish;
start MyMod5(x); ... finish;
start MyMod6(x); ... finish;
store module = (
MyMod4
MyMod5
MyMod6
);
/* ------ DefineAll.sas FILE ----- */
%let path = path/to/my/modules;
proc iml;
%INCLUDE "&path/Def1.sas";
%INCLUDE "&path/Def2.sas";
QUIT;
Then, when I want to use the modules, I do this:
%include "DefineAll.sas"; /* stores all modules */
proc iml;
load module=_all_;
y = MyMod1(1234);
z = MyMod6(4321);
01-05-2026 06:47 AM
4 Likes
When you create an IML library of functions, you start with the SOURCE CODE, which is the set of text files that contain the module definitions. When you use the STORE statement, the source code is compiled and stored in binary form in a SAS catalog. SAS catalogs are not compatible across operating systems or releases, which means that you cannot write a catalog on Linux under one SAS release and expect to read it on Windows under another release.
What you should do is transfer the SOURCE CODE files to your PC. Read them into PROC IML and use the STORE statement to create a catalog for your Windows 11 PC.
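As a sketch of that workflow (the local path and file names are hypothetical):

```sas
/* on the Windows 11 PC, after copying the source code files over */
proc iml;
%include "C:\MyIMLModules\Def1.sas";   /* hypothetical local path */
%include "C:\MyIMLModules\Def2.sas";
store module=_all_;    /* compile and store in a catalog created on Windows */
quit;
```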
12-31-2025 07:24 AM
@Ksharp If the data contained no observations, the OP would get a different ERROR:
ods exclude all;
ods output ParameterEstimates = _index_avg_parms;
proc reg data=sashelp.class(where=(Age>100));
model weight=height;
run;
quit;
ods exclude none;
ERROR: No valid observations are found.
12-31-2025 06:54 AM
1 Like
It looks like SAS cannot load a standard ODS template (for the NOBS tables). Have you been modifying templates or changing the path for templates? Please run the following statement and post the result:
ods path show;
@WarrenKuhfeld shows how to correctly modify templates in the blog article, "A deeper dive into item stores."
After we see your ODS path, we can probably figure out what is wrong. The fix might be as simple as
ods path reset;
12-08-2025 10:35 AM
1 Like
I have never used the %PDMIX800 macro, but you can use PROC PLM and the LSMEANS statement to get a graphical display that reveals which treatment effects are significantly different. Here is the code for your example:
proc mixed data=exercise covtest; /* apply covariance correction */
class trt;
model adg=trt iwt/solution ;
store out=mixedout;
run;
ods graphics on;
proc plm restore=mixedout;
lsmeans trt/ pdiff=all adjust=tukey linestable;
run;
The output includes a diffogram by default. The LINESTABLE option produces a table with letters that show which effects are significantly different.
12-05-2025 10:06 AM
1 Like
> My log-linear model is ln(TOT_COST) = 6.928 + 0.886 ln(TOT_MI). The 0.886 converts as follows: about a one percent change in total miles is associated with a 0.885 percent change in total transportation cost. How do I make this into "X number of total miles results in Y change in total costs"?
Yes, you can do that, but the log terms require that you report "X number of total miles results in Y change in total costs when X=X0." The nonlinear transformation of X and Y means that there isn't one universal number to report. Instead, the number depends on the value of X at which you want to consider the change. This is in contrast to the percentage method, for which the beta parameter estimate gives the percentage change, as shown in the Cornell notes.
The formula is just calculus, but you need to apply the chain rule because of the log terms. Start with the regression equation for the predicted response:
log(Y) = beta_0 + beta_1 * log(X)
We seek dY/dX, so take the derivative of both sides with respect to X and apply the chain rule:
(1/Y) * dY/dX = beta_1 / X
Solve this equation for dY/dX:
dY/dX = (beta_1 / X) * Y, where Y = exp( beta_0 + beta_1*log(X) ) by solving the regression eqn for Y.
The discrete version of this equation enables you to estimate the change in Y (call it dY) when X changes from X0 to X0 + dX:
dY = dX * (beta_1/X0) * exp( beta_0 + beta_1*log(X0) )
So, if last year the district buses ran X0 = 1 million miles and you want to know the cost increase for going 1.01 million miles this year, you would set
X0 = 1 million miles
Y0 = exp( beta_0 + beta_1*log(X0) )
dX = 0.01 million miles
and compute
dY = estimated change in cost = dX * (beta_1/X0) * Y0
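As a sketch, you can plug in the estimates from your model (beta_0 = 6.928, beta_1 = 0.886) with the hypothetical baseline X0 = 1 million miles and dX = 0.01 million miles:

```sas
data ChangeInCost;
   beta0 = 6.928;    /* intercept estimate from the log-linear model */
   beta1 = 0.886;    /* slope estimate */
   X0 = 1e6;         /* hypothetical baseline: 1 million miles */
   dX = 0.01e6;      /* change: 10,000 additional miles */
   Y0 = exp(beta0 + beta1*log(X0));   /* predicted cost at X0 */
   dY = dX * beta1/X0 * Y0;           /* estimated change in cost */
run;

proc print data=ChangeInCost noobs;
   var X0 dX Y0 dY;
run;
```

Remember that dY depends on X0, so you must recompute it for each baseline mileage you want to report.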
12-04-2025 04:23 PM
Where are the sizes 13034 321 338 77 80 151 109 86 73 36 coming from? The doc discusses how to get the observed and expected numbers for the groups in the H-L test. Please run your model as
proc logistic data=noduleData;
model GT_finding(event='1')= report_score/ lackfit (DF=8 NGROUPS=10) ;
ods select LackFitPartition;
run;
and upload the LackFitPartition table for your data.
11-26-2025 10:22 AM
We saw this issue in 2024, but I was not involved in investigating the problem. Perhaps @SAS_Rob has information about the cause or resolution.