Vince28@Statcan wrote:
call symput statements create "piles" of macro statements that are only compiled after the execution of a given data step.
This statement is not correct.
CALL SYMPUT does its work when it is executed. You may be confusing CALL EXECUTE with CALL SYMPUT, but that statement is not exactly true for CALL EXECUTE either.
Indeed. Vince has it roughly backwards in his explanation - the issue is not that CALL SYMPUT does its work at the wrong time, but that the macro variable is often referenced at the wrong time. That is, in this data step:
data test;
   set sashelp.class;
   call symput('age', age);  /* runs at execution time, once per row */
   x_&age. = age;            /* &age. resolves when the step is compiled */
run;
This doesn't work, not because the CALL SYMPUT runs after the x_&age. statement, but because &age. must resolve during the compilation stage, before the step ever runs; CALL SYMPUT and the x_&age. assignment both happen later, at execution time.
On the other hand,
data test;
   set sashelp.class;
   call symput('age', age);
   array x_age[20];
   /* SYMGET retrieves the macro variable at execution time,
      after CALL SYMPUT has already updated it */
   x_age[input(symget('age'), best12.)] = age;
run;
This works because you don't need to know the value of &age until execution time; SYMGET retrieves it then.
I stand corrected.
I'd like to add something to the thread. Thinking back to your current implementation, Emre, I believe you would gain a lot of processing time if, instead of calling READ_ARRAY inside your mmult routine, you created a second routine that loads your copula matrix into a DATA step _temporary_ array once and then passed that array to the mmult FCMP function.
I am almost positive that each mmult call performs the READ_ARRAY, which costs an I/O operation for each of the 100K rows; that cost could be avoided by passing the array to the FCMP function instead. Again, I am not 100% sure how FCMP routine calls are handled in the DATA step. Maybe SAS optimizes this so that static elements are read once and kept in memory, but I highly doubt it. And since I/O was really what made your original approach prohibitive, this additional fix is in line with everything else that came with my original response.
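Roughly, here is what I have in mind. This is just an untested sketch: load_corr, mmult2, copula, sims_wide and the za/zb/ca/cb variables are all placeholder names for your actual ones, and it assumes your SAS release lets you pass DATA step arrays (including _temporary_ ones) to FCMP routines.

proc fcmp outlib=work.funcs.mat;
   /* one-time loader: fills corr from the copula data set */
   subroutine load_corr(corr[*,*]);
      outargs corr;
      rc = read_array('copula', corr);
   endsub;

   /* row vector z times matrix corr, result in out */
   subroutine mmult2(z[*], corr[*,*], out[*]);
      outargs out;
      do j = 1 to dim(out);
         out[j] = 0;
         do i = 1 to dim(z);
            out[j] = out[j] + z[i] * corr[i, j];
         end;
      end;
   endsub;
run;

options cmplib=work.funcs;

data correlated;
   array corr[2,2] _temporary_;
   if _n_ = 1 then call load_corr(corr);  /* one READ_ARRAY total, not one per row */
   set sims_wide;                         /* one row per SIM: za zb */
   array z[2] za zb;
   array c[2] ca cb;
   call mmult2(z, corr, c);
run;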
I hope you get what I'm trying to say; if not, and if you still care to further improve the algo, I can modify the code and post back at a later point in time.
Vince,
I do know what you mean. I thought of implementing that originally, but 2 minutes is well within my limit at the moment, so it's a little lower on my list of things to do with the model.
Thanks a lot. You've really helped me out.
Emre
feyzi wrote:
Hi Vince,
Sorry, sure thing.
So I have 100,000 simulations of z values (from a standard normal distribution) for two categories (A, B), which look like this:
LOB | SIM | Z |
A | 1 | -0.32659 |
B | 1 | 0.450515 |
A | 2 | 1.542437 |
B | 2 | 0.396981 |
A | 3 | 0.259029 |
B | 3 | 0.045685 |
A | 4 | -0.55583 |
B | 4 | 0.728347 |
A | 5 | 0.77872 |
B | 5 | -1.27385 |
I want to multiply each of these simulation rows by the following correlation matrix so that I end up with correlated variables:
    |  A  |  B  |
A   |  1  | 0.5 |
B   | 0.5 |  1  |
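So, spelling out the arithmetic for the first simulation row (to make sure I'm describing this correctly), I want:

[-0.32659, 0.450515] * M = [1*(-0.32659) + 0.5*(0.450515), 0.5*(-0.32659) + 1*(0.450515)]
                         = [-0.10133, 0.28722]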
Seems like PROC SCORE might be what you want.
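For instance (an untested sketch, assuming the long data set is named sims with the LOB/SIM/Z variables shown above and is sorted by SIM):

/* one row per simulation, with variables A and B */
proc transpose data=sims out=wide(drop=_name_);
   by sim;
   id lob;
   var z;
run;

/* scoring coefficients: each _NAME_ row defines one linear combination */
data coef;
   length _TYPE_ _NAME_ $8;
   _TYPE_ = 'SCORE';
   _NAME_ = 'corr_A'; A = 1;   B = 0.5; output;
   _NAME_ = 'corr_B'; A = 0.5; B = 1;   output;
run;

/* corr_A = 1*A + 0.5*B and corr_B = 0.5*A + 1*B for every row */
proc score data=wide score=coef out=scored;
   var A B;
run;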