<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Aggregating data by week in SAS Programming</title>
    <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767180#M243185</link>
    <description>&lt;P&gt;You have yet to fully describe your problem. For example, what definition of week do you want to use?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;SAS provides at least three different definitions &lt;A href="https://documentation.sas.com/doc/en/pgmsascdc/9.4_3.5/lefunctionsref/n1ka2ulrvrjlasn0z7beco2yrgas.htm" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Also you say you have two years of data. That will mean at least 104 weekly columns in wide format. What happens at year boundaries?&lt;/P&gt;</description>
    <pubDate>Sat, 11 Sep 2021 04:33:06 GMT</pubDate>
    <dc:creator>SASKiwi</dc:creator>
    <dc:date>2021-09-11T04:33:06Z</dc:date>
    <item>
      <title>Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766711#M243007</link>
      <description>&lt;P&gt;I have daily transactional data for two years and I need a summary view like the one below. The Status code is always "B" and the Memo code is always "M".&lt;/P&gt;&lt;P&gt;Status represents the initial status of the ID; Memo represents its final status.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The data looks like this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;ID&lt;/TD&gt;&lt;TD&gt;Status&lt;/TD&gt;&lt;TD&gt;Status_Date&lt;/TD&gt;&lt;TD&gt;Memo&lt;/TD&gt;&lt;TD&gt;Memo_Date&lt;/TD&gt;&lt;TD&gt;Status_Week&lt;/TD&gt;&lt;TD&gt;Memo_Week&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;01-Jan&lt;/TD&gt;&lt;TD&gt;M&lt;/TD&gt;&lt;TD&gt;01-Jan&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;01-Jan&lt;/TD&gt;&lt;TD&gt;M&lt;/TD&gt;&lt;TD&gt;02-Jan&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;01-Jan&lt;/TD&gt;&lt;TD&gt;M&lt;/TD&gt;&lt;TD&gt;02-Jan&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;01-Jan&lt;/TD&gt;&lt;TD&gt;M&lt;/TD&gt;&lt;TD&gt;10-Jan&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;02-Jan&lt;/TD&gt;&lt;TD&gt;M&lt;/TD&gt;&lt;TD&gt;10-Jan&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;02-Jan&lt;/TD&gt;&lt;TD&gt;M&lt;/TD&gt;&lt;TD&gt;14-Jan&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;7&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;02-Jan&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;8&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;02-Jan&lt;/TD&gt;&lt;TD&gt;M&lt;/TD&gt;&lt;TD&gt;03-Jan&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;9&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;10-Jan&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;10&lt;/TD&gt;&lt;TD&gt;B&lt;/TD&gt;&lt;TD&gt;10-Jan&lt;/TD&gt;&lt;TD&gt;M&lt;/TD&gt;&lt;TD&gt;14-Jan&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;What I need to do is summarize it as below. The intent is to show the % of IDs with status B in a given week that move to status M in week 1, 2, 3, etc.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;Status_Week&lt;/TD&gt;&lt;TD&gt;ID count&lt;/TD&gt;&lt;TD&gt;Memo_Week_1&lt;/TD&gt;&lt;TD&gt;Memo_Week_2&lt;/TD&gt;&lt;TD&gt;Memo_Week_3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;8&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;What I am thinking of doing is writing a PROC SQL query like this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;select status_week, count(ID) as ID_count,
    count(case when memo_week = 1 then ID end) as Memo_Week_1,
    count(case when memo_week = 2 then ID end) as Memo_Week_2&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;and so forth. But this means I would have to hardcode all 52 weeks. 
There has to be a more efficient way to do this?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Sep 2021 21:58:46 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766711#M243007</guid>
      <dc:creator>TheNovice</dc:creator>
      <dc:date>2021-09-08T21:58:46Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766720#M243010</link>
      <description>&lt;P&gt;If you had actual years (e.g. 01-Jan-2021) along with day and month, you could use a week format, and everything would be simple. And make your output vertical (LONG) rather than horizontal (WIDE).&lt;/P&gt;</description>
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;If you absolutely don't have years, the easiest thing to do is still to add one; really, you should have a year as part of any date.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Example:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data fakedata;
     input id status_date :date9. memo_date :date9.;
     format status_date memo_date weeku6.;
     cards;
 1 01jan2021 01jan2021
 2 09jan2021 13jan2021
 3 04jan2021 01feb2021
 4 16jan2021 03feb2021
 ;
 
 proc freq data=fakedata;
     tables status_date memo_date/nocum nopercent;
 run;&lt;/CODE&gt;&lt;/PRE&gt;
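&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Untested sketch: if you want the weekly counts in a dataset rather than in listing output, PROC SUMMARY can do the same week grouping, since CLASS groups on the formatted value:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc summary data=fakedata nway;
    class status_date;                  /* groups by the formatted (weekly) value */
    format status_date weeku6.;
    output out=weekly_counts (rename=(_freq_=n_ids));
run;&lt;/CODE&gt;&lt;/PRE&gt;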
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Sep 2021 22:32:46 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766720#M243010</guid>
      <dc:creator>PaigeMiller</dc:creator>
      <dc:date>2021-09-08T22:32:46Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766739#M243016</link>
      <description>&lt;P&gt;Thank you Paige,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you explain how the solution would lead me to the desired output? The transactional data is about 25 million records for 2 years.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;Total ID count&lt;/TD&gt;&lt;TD&gt;Memo_Week_1&lt;/TD&gt;&lt;TD&gt;Memo_Week_2&lt;/TD&gt;&lt;TD&gt;Memo_Week_3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;Status week 1&lt;/TD&gt;&lt;TD&gt;8&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;Status week 2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;</description>
      <pubDate>Thu, 09 Sep 2021 01:10:30 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766739#M243016</guid>
      <dc:creator>TheNovice</dc:creator>
      <dc:date>2021-09-09T01:10:30Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766742#M243017</link>
      <description>&lt;P&gt;Do you really want 52 columns in the output?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The code I gave was an example of using week formats; it wasn't designed to produce your exact output, but to get you started.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;You could probably use PROC FREQ to get closer to your desired output.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc freq data=have;
    tables status_date*memo_date/nocol norow nopercent;
    format status_date memo_date weeku6.;
run;&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 09 Sep 2021 11:26:25 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766742#M243017</guid>
      <dc:creator>PaigeMiller</dc:creator>
      <dc:date>2021-09-09T11:26:25Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766749#M243023</link>
      <description>&lt;P&gt;Hi Paige, yes I would need 52 columns. I can't think of another way to correlate the initial status volume to the change in status through the weeks. The PROC FREQ only gets me there partly... it is still missing the total ID count for each status week.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So I am trying this, but was wondering if there is a faster method to get to the same kind of output:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;PROC SQL;
CREATE TABLE TT2 AS
SELECT Year, Status_Week, COUNT(ID) AS TOTAL,
    COUNT(CASE WHEN MEMO_WEEK = 1 AND MEMO_TYPE = 'M' THEN ID END) AS MEMO_WEEK_1,
    COUNT(CASE WHEN MEMO_WEEK = 2 AND MEMO_TYPE = 'M' THEN ID END) AS MEMO_WEEK_2,
    COUNT(CASE WHEN MEMO_WEEK = 3 AND MEMO_TYPE = 'M' THEN ID END) AS MEMO_WEEK_3,
    /* ... and so on for all 52 weeks */
FROM TT1
GROUP BY Year, Status_Week;
QUIT;&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 09 Sep 2021 01:47:34 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766749#M243023</guid>
      <dc:creator>TheNovice</dc:creator>
      <dc:date>2021-09-09T01:47:34Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766770#M243032</link>
      <description>&lt;P&gt;A simple PROC REPORT:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc report data=have;
column status_week n,memo_week;
define status_week / group;
define memo_week / across;
run;&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 09 Sep 2021 06:34:15 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766770#M243032</guid>
      <dc:creator>Kurt_Bremser</dc:creator>
      <dc:date>2021-09-09T06:34:15Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766813#M243048</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;HR /&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/248692"&gt;@TheNovice&lt;/a&gt;&amp;nbsp;wrote:&lt;BR /&gt;Hi Paige, yes I would need 52 columns. I can't think of another way to correlate the initial status volume to the change in status through the weeks. The proc freq would only get me there partly... It is still missing the total ID count for each status week.&lt;BR /&gt;&lt;HR /&gt;&lt;/BLOCKQUOTE&gt;
&lt;P&gt;Remove the NOROW option in PROC FREQ.&lt;/P&gt;</description>
      <pubDate>Thu, 09 Sep 2021 11:18:52 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/766813#M243048</guid>
      <dc:creator>PaigeMiller</dc:creator>
      <dc:date>2021-09-09T11:18:52Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767179#M243184</link>
      <description>&lt;P&gt;Hi Paige,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you, but that is not the solution I need. That will give a total count of the IDs. I only need a count of the IDs where memo_date is not blank.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am trying to show how much of the initial count resolved in subsequent weeks.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sat, 11 Sep 2021 04:00:39 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767179#M243184</guid>
      <dc:creator>TheNovice</dc:creator>
      <dc:date>2021-09-11T04:00:39Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767180#M243185</link>
      <description>&lt;P&gt;You have yet to fully describe your problem. For example, what definition of week do you want to use?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;SAS provides at least three different definitions &lt;A href="https://documentation.sas.com/doc/en/pgmsascdc/9.4_3.5/lefunctionsref/n1ka2ulrvrjlasn0z7beco2yrgas.htm" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&lt;/P&gt;
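&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Untested sketch comparing the three descriptors of the WEEK function on a single date:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data _null_;
    d = '11sep2021'd;
    u = week(d, 'u');   /* weeks start on Sunday; days before the first Sunday are week 0 */
    v = week(d, 'v');   /* ISO 8601: weeks start on Monday, numbered 1-53 */
    w = week(d, 'w');   /* weeks start on Monday; days before the first Monday are week 0 */
    put u= v= w=;
run;&lt;/CODE&gt;&lt;/PRE&gt;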
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Also you say you have two years of data. That will mean at least 104 weekly columns in wide format. What happens at year boundaries?&lt;/P&gt;</description>
      <pubDate>Sat, 11 Sep 2021 04:33:06 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767180#M243185</guid>
      <dc:creator>SASKiwi</dc:creator>
      <dc:date>2021-09-11T04:33:06Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767182#M243186</link>
      <description>&lt;P&gt;Hi Kurt,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for the response. I modified the query to look like this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc report data=cure2;
column status_week n,idn,memo_week;
define status_week / group;
define id / n;
define memo_week / across;
run;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;But I need count(ID) in the ID column and count(ID where memo_date is not null) in the memo week columns.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is there a way to apply a WHERE condition just to the memo columns?&lt;/P&gt;</description>
      <pubDate>Sat, 11 Sep 2021 04:49:29 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767182#M243186</guid>
      <dc:creator>TheNovice</dc:creator>
      <dc:date>2021-09-11T04:49:29Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767183#M243187</link>
      <description>&lt;P&gt;I am using the WEEK function. Using the latest suggestion from Kurt, my output looks like this. It's almost what I need: the ID count is the total count, but the count in the memo_week columns needs to be for IDs where the memo_date is not null.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;There are x number of IDs that hit status B in a given week, and then some move to status M in subsequent weeks. I am trying to show that distribution by week against the initial total. Not all IDs move into status M.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;n&lt;/TD&gt;&lt;TD&gt;MEMO_WEEK&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;STATUS_WEEK&lt;/TD&gt;&lt;TD&gt;id&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;7&lt;/TD&gt;&lt;TD&gt;8&lt;/TD&gt;&lt;TD&gt;9&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;56954&lt;/TD&gt;&lt;TD&gt;1004&lt;/TD&gt;&lt;TD&gt;33579&lt;/TD&gt;&lt;TD&gt;9354&lt;/TD&gt;&lt;TD&gt;5128&lt;/TD&gt;&lt;TD&gt;2923&lt;/TD&gt;&lt;TD&gt;1353&lt;/TD&gt;&lt;TD&gt;734&lt;/TD&gt;&lt;TD&gt;826&lt;/TD&gt;&lt;TD&gt;460&lt;/TD&gt;&lt;TD&gt;302&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;268372&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;82860&lt;/TD&gt;&lt;TD&gt;93011&lt;/TD&gt;&lt;TD&gt;37385&lt;/TD&gt;&lt;TD&gt;19232&lt;/TD&gt;&lt;TD&gt;12206&lt;/TD&gt;&lt;TD&gt;4428&lt;/TD&gt;&lt;TD&gt;4373&lt;/TD&gt;&lt;TD&gt;3376&lt;/TD&gt;&lt;TD&gt;2145&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;199854&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;50941&lt;/TD&gt;&lt;TD&gt;80656&lt;/TD&gt;&lt;TD&gt;26287&lt;/TD&gt;&lt;TD&gt;14966&lt;/TD&gt;&lt;TD&gt;8869&lt;/TD&gt;&lt;TD&gt;3139&lt;/TD&gt;&lt;TD&gt;3455&lt;/TD&gt;&lt;TD&gt;2835&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;222380&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;62169&lt;/TD&gt;&lt;TD&gt;82358&lt;/TD&gt;&lt;TD&gt;30440&lt;/TD&gt;&lt;TD&gt;14696&lt;/TD&gt;&lt;TD&gt;10530&lt;/TD&gt;&lt;TD&gt;4194&lt;/TD&gt;&lt;TD&gt;4085&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;257609&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;.&lt;/TD&gt;&lt;TD&gt;49375&lt;/TD&gt;&lt;TD&gt;114797&lt;/TD&gt;&lt;TD&gt;34623&lt;/TD&gt;&lt;TD&gt;19441&lt;/TD&gt;&lt;TD&gt;14087&lt;/TD&gt;&lt;TD&gt;5197&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;</description>
      <pubDate>Sat, 11 Sep 2021 04:56:21 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767183#M243187</guid>
      <dc:creator>TheNovice</dc:creator>
      <dc:date>2021-09-11T04:56:21Z</dc:date>
    </item>
    <item>
      <title>Re: Aggregating data by week</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767223#M243208</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;HR /&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/248692"&gt;@TheNovice&lt;/a&gt;&amp;nbsp;wrote:&lt;BR /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;that is not the solution I need. That will give a total count of the ID. &lt;FONT color="#FF0000"&gt;I only need a count of the ID where memo_date is not blank&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am trying to show how much of the initial count resolved by subsequent weeks.&amp;nbsp;&lt;/P&gt;
&lt;HR /&gt;&lt;/BLOCKQUOTE&gt;
&lt;P&gt;As far as I know, the PROC FREQ code I showed in my &lt;U&gt;2nd&lt;/U&gt; reply does exactly this. When either variable is missing, that record is not used in the table.&lt;/P&gt;
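&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Untested sketch to check that behavior: the row for id 2 below, with a missing memo_date, should drop out of the crosstab automatically.&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data check;
    input id status_date :date9. memo_date :date9.;
    format status_date memo_date weeku6.;
cards;
1 01jan2021 05jan2021
2 02jan2021 .
;

proc freq data=check;
    tables status_date*memo_date / nocol nopercent;
run;&lt;/CODE&gt;&lt;/PRE&gt;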
&lt;P&gt;&lt;BR /&gt;If you disagree, produce a small example and show us why the PROC FREQ code does not give you what you want, and make clear exactly what you do want.&lt;/P&gt;
      <pubDate>Sun, 12 Sep 2021 19:16:48 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Aggregating-data-by-week/m-p/767223#M243208</guid>
      <dc:creator>PaigeMiller</dc:creator>
      <dc:date>2021-09-12T19:16:48Z</dc:date>
    </item>
  </channel>
</rss>

