🔒 This topic is solved and locked.
alepage
Barite | Level 11

Hello,

 

I am using a %do loop inside a macro to run some instructions that need the loop variable k, as shown below:

 

%macro mymacro;
  %do k=1 %to 1000;
    some instructions...
  %end;
%mend mymacro;

%mymacro;

 

How can I replace the %do loop with CALL EXECUTE?

Could you please provide a simple example?

regards,

 


2 REPLIES 2
RW9
Diamond | Level 26
data _null_;
  do j=1 to 1000;
    /* build each macro call as text and push it onto the input stack */
    call execute(cats('%your_macro(param1=', put(j, best.), ');'));
  end;
run;
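
A common refinement, if %your_macro itself contains DATA or PROC steps or creates macro variables at execution time, is to wrap the generated call in %nrstr so the macro does not execute until the pushed code runs after this step. A minimal sketch of that variant:

data _null_;
  do j=1 to 1000;
    /* %nrstr() masks the macro trigger while CALL EXECUTE is pushing text,
       so %your_macro executes later, when the generated line is processed */
    call execute(cats('%nrstr(%your_macro(param1=', j, '))'));
  end;
run;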

The real question here, however, is why you're doing that in the first place.  Whenever I see loops over data and macros, I can immediately tell it's a bad process.  I expect you're either a) splitting data up before this step and then creating all this looping to handle it, or b) splitting the data up in this step.  It's rarely a good idea to split data like that: a data step is already a loop and has powerful BY-group processing built in for grouping, so you don't need lots of messy code to emulate it!
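
For instance, a minimal sketch of BY-group processing, with made-up data set and variable names: one sort plus FIRST./LAST. flags lets a single DATA step handle every group, with no physical splitting.

proc sort data=have;
  by group_id;
run;

data want;
  set have;
  by group_id;
  if first.group_id then group_total = 0;  /* reset at the start of each group  */
  group_total + amount;                    /* sum statement retains across rows */
  if last.group_id then output;            /* write one summary row per group   */
run;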

 

 

alepage
Barite | Level 11

Hello,

 

Among my tasks, I have developed a program to validate a SAS data set, first in terms of structure and then in terms of values (constraints).

 

After that, I need to convert the validated data set into an XML file.  However, we have external data sets of about 500 variables and 35 million observations.

 

The conversion takes far too long.  I have run some tests and found that fragmenting the data set into smaller data sets and then converting them reduces the processing time by roughly 40%, but it still remains too long.

 

In fact, converting a data set of 480 variables and 25,000 observations takes about 3 minutes, which is still too long.

 

I wrote to SAS to ask whether they could improve their XMLV2 engine, but without success.
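
For context, a minimal sketch of an XMLV2 LIBNAME export of the kind described here; the libref, path, and data set names are hypothetical:

/* Write the validated data set out as XML through the XMLV2 engine */
libname xmlout xmlv2 "/path/to/validated.xml";

data xmlout.validated;
  set work.validated;
run;

libname xmlout clear;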

 

Thereafter, I looked into a parallel processing approach, and my preliminary tests with other data sets reduced the processing time by a further 50% or so.

 

So I have put together a benchmark project that reads 4 data sets of various sizes and converts them into XML files (serial process).  After that, the same data sets are read and converted into XML files using parallel processing.  We observe gains of about 50% in processing time.
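
For illustration, a minimal sketch of one way to run such conversions in parallel, assuming SAS/CONNECT (MP CONNECT) is licensed; the task names and the %convert_fragment macro are hypothetical and would have to be available in each spawned session (for example via an autocall library):

/* Spawn child sessions on the same machine and run one conversion in each */
options autosignon sascmd="!sascmd";

rsubmit task1 wait=no;            /* first fragment, runs asynchronously */
  %convert_fragment(frag=1);
endrsubmit;

rsubmit task2 wait=no;            /* second fragment, runs in parallel   */
  %convert_fragment(frag=2);
endrsubmit;

waitfor _all_ task1 task2;        /* block until both sessions finish    */
signoff _all_;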

 

It was suggested to use call execute to correct the error observed in the project.

 

After that, I would like to test the parallel approach with a large, fragmented data set to see the gain and whether this approach could be useful for our project.

 

I will let you know if the benchmark project works.

regards,

 

 


Discussion stats
  • 2 replies
  • 1374 views
  • 2 likes
  • 2 in conversation