camfarrell25
Quartz | Level 8

I've got the following code that creates 30 datasets (6 tables for each of 5 channels).

I have a loop that creates all the datasets the way I want, but now I'm stuck trying to find an efficient way to combine all the sets without having to write it all out.

Any suggestions?

 

PROC SQL;
  CREATE TABLE CHANNEL
    (channel char(8));
  insert into channel
    values ("ALL")
    values ("ChannelF")
    values ("ChannelG")
    values ("ChannelH")
    values ("ChannelI");
QUIT;

PROC SQL noprint;
  select channel
    into :channel1 - :channel5 notrim
    from channel;
QUIT;

%macro putit;
  %do i=1 %to 5;
    %put Row&i: Channel = &&channel&i;
  %end;
%mend putit;
%putit


%macro average;
%do i=1 %to 5;

  /* Canada - Received */
  PROC SQL;
    CREATE TABLE AVGRECCAN_&&channel&i AS
    SELECT REPORTING, YRQRT, "CANADA" AS REGIONID, "RECEIVED" AS CATEGORY, GROUP,
           AVG(bestscore) AS AVG_SCORE
    FROM &&channel&i
    GROUP BY REPORTING, YRQRT, GROUP;
  QUIT;

  /* Toronto - Received */
  PROC SQL;
    CREATE TABLE AVGRECTORONTO_&&channel&i AS
    SELECT REPORTING, YRQRT, "TORONTO" AS REGIONID, "RECEIVED" AS CATEGORY, GROUP,
           AVG(bestscore) AS AVG_SCORE
    FROM &&channel&i
    WHERE TORONTOIND = 'TORONTO'
    GROUP BY REPORTING, YRQRT, GROUP;
  QUIT;

  /* Provincial - Received */
  PROC SQL;
    CREATE TABLE AVGRECPROV_&&channel&i AS
    SELECT REPORTING, YRQRT, REGION AS REGIONID, "RECEIVED" AS CATEGORY, GROUP,
           AVG(bestscore) AS AVG_SCORE
    FROM &&channel&i
    GROUP BY REPORTING, YRQRT, GROUP, REGION;
  QUIT;

  /* Canada - Approved */
  PROC SQL;
    CREATE TABLE AVGAPPCAN_&&channel&i AS
    SELECT REPORTING, YRQRT, "CANADA" AS REGIONID, "APPROVED" AS CATEGORY, GROUP,
           AVG(bestscore) AS AVG_SCORE
    FROM &&channel&i
    WHERE STATUS_IND = "APPROVED"
    GROUP BY REPORTING, YRQRT, GROUP;
  QUIT;

  /* Toronto - Approved */
  PROC SQL;
    CREATE TABLE AVGAPPTORONTO_&&channel&i AS
    SELECT REPORTING, YRQRT, "TORONTO" AS REGIONID, "APPROVED" AS CATEGORY, GROUP,
           AVG(bestscore) AS AVG_SCORE
    FROM &&channel&i
    WHERE TORONTOIND = 'TORONTO' AND STATUS_IND = "APPROVED"
    GROUP BY REPORTING, YRQRT, GROUP;
  QUIT;

  /* Provincial - Approved */
  PROC SQL;
    CREATE TABLE AVGAPPPROV_&&channel&i AS
    SELECT REPORTING, YRQRT, REGION AS REGIONID, "APPROVED" AS CATEGORY, GROUP,
           AVG(bestscore) AS AVG_SCORE
    FROM &&channel&i
    WHERE STATUS_IND = "APPROVED"
    GROUP BY REPORTING, YRQRT, GROUP, REGION;
  QUIT;

%end;
%mend average;
%average;
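
A quick way to confirm what the loop actually produced is to list the generated tables from the SAS dictionary tables. This is only a verification sketch; it assumes the tables land in the WORK library and keep the AVG... prefixes used above.

proc sql;
  /* List every table the macro created in WORK (names follow the AVG... prefixes above) */
  select memname
  from dictionary.tables
  where libname = 'WORK' and memname like 'AVG%';
quit;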
1 ACCEPTED SOLUTION
Reeza
Super User

Use a naming convention; then you can use the colon wildcard to append all the tables that start with the same prefix.

For example, this appends every table whose name starts with RECEIVED_:

data want;
  set received_:;
run;
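
Applied to the naming in the %average macro above, that approach could look like the following sketch. The output names RECEIVED_ALL and APPROVED_ALL are made up for illustration, and it assumes no other WORK tables share the AVGREC / AVGAPP prefixes.

data received_all;
  set avgrec:;   /* stacks AVGRECCAN_*, AVGRECTORONTO_*, AVGRECPROV_* */
run;

data approved_all;
  set avgapp:;   /* stacks AVGAPPCAN_*, AVGAPPTORONTO_*, AVGAPPPROV_* */
run;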



RW9
Diamond | Level 26

Why are you doing that code at all? I can't really tell what your underlying data is, but creating 6 * 5 datasets and then combining them seems mad. Why create all those datasets, run exactly the same process on each of them, and then combine the results? Why not combine your data into a usable model first, then run the code once on that one dataset, using a BY group if necessary:

proc means data=allcombined;
  var avar;
  output out=want mean=mean;
run;
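
For instance, here is a minimal sketch of that single-pass idea using the grouping variables from the original queries. ALLCOMBINED and the CHANNEL identifier are assumed names, not something defined in the thread.

proc means data=allcombined noprint nway;
  /* One pass over the stacked data instead of 30 per-channel summary tables */
  class channel reporting yrqrt group;
  var bestscore;
  output out=want mean=avg_score;
run;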
