<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic collapsing data into episode level file in SAS Programming</title>
    <link>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44376#M9084</link>
    <description>Hello all:&lt;BR /&gt;
&lt;BR /&gt;
I cannot figure out how to do this...&lt;BR /&gt;
&lt;BR /&gt;
I have a batch of claims data that look like this:&lt;BR /&gt;
 &lt;BR /&gt;
id	fromdate	thrudate	cpt	severity&lt;BR /&gt;
1	1	1	99281	1&lt;BR /&gt;
1	1	1	99282	3&lt;BR /&gt;
1	1	2	99283	1&lt;BR /&gt;
1	1	3	99284	1&lt;BR /&gt;
1	1	4	99284	2&lt;BR /&gt;
1	2	3	99285	2&lt;BR /&gt;
1	2	4	99285	1&lt;BR /&gt;
1	3	4	99283	2&lt;BR /&gt;
2	1	2	99284	3&lt;BR /&gt;
2	2	3	99285	2&lt;BR /&gt;
3	3	4	99285	1&lt;BR /&gt;
4	1	1	99281	1&lt;BR /&gt;
4	2	2	99281	3&lt;BR /&gt;
4	3	3	99281	2&lt;BR /&gt;
5	1	2	99282	3&lt;BR /&gt;
6	1	1	99283	2&lt;BR /&gt;
7	1	1	99282	1&lt;BR /&gt;
7	1	2	99282	3&lt;BR /&gt;
7	1	3	99283	2&lt;BR /&gt;
7	1	4	99284	1&lt;BR /&gt;
8	1	1	99285	2&lt;BR /&gt;
etc.&lt;BR /&gt;
&lt;BR /&gt;
I am supposed to aggregate this into an episode-level file, allowing a maximum of three days between the from and thru dates, while retaining all the other information in the claims, but only if the dates in the claims overlap.  So, for example, all of the claims for id=1 would be collapsed into a single episode of care, because all the dates fall into the range 1-4. But the claims for id=4 would generate three episodes, because the dates between services are distinct.  And I have to keep all the information from all the claims regardless of how many distinct episodes are created: the original from and thru dates, CPT codes, and severity levels.&lt;BR /&gt;
&lt;BR /&gt;
Any ideas?&lt;BR /&gt;
&lt;BR /&gt;
Thanks a lot!!!&lt;BR /&gt;
bethcook</description>
    <pubDate>Mon, 08 Sep 2008 22:57:38 GMT</pubDate>
    <dc:creator>deleted_user</dc:creator>
    <dc:date>2008-09-08T22:57:38Z</dc:date>
    <item>
      <title>collapsing data into episode level file</title>
      <link>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44376#M9084</link>
      <description>Hello all:&lt;BR /&gt;
&lt;BR /&gt;
I cannot figure out how to do this...&lt;BR /&gt;
&lt;BR /&gt;
I have a batch of claims data that look like this:&lt;BR /&gt;
 &lt;BR /&gt;
id	fromdate	thrudate	cpt	severity&lt;BR /&gt;
1	1	1	99281	1&lt;BR /&gt;
1	1	1	99282	3&lt;BR /&gt;
1	1	2	99283	1&lt;BR /&gt;
1	1	3	99284	1&lt;BR /&gt;
1	1	4	99284	2&lt;BR /&gt;
1	2	3	99285	2&lt;BR /&gt;
1	2	4	99285	1&lt;BR /&gt;
1	3	4	99283	2&lt;BR /&gt;
2	1	2	99284	3&lt;BR /&gt;
2	2	3	99285	2&lt;BR /&gt;
3	3	4	99285	1&lt;BR /&gt;
4	1	1	99281	1&lt;BR /&gt;
4	2	2	99281	3&lt;BR /&gt;
4	3	3	99281	2&lt;BR /&gt;
5	1	2	99282	3&lt;BR /&gt;
6	1	1	99283	2&lt;BR /&gt;
7	1	1	99282	1&lt;BR /&gt;
7	1	2	99282	3&lt;BR /&gt;
7	1	3	99283	2&lt;BR /&gt;
7	1	4	99284	1&lt;BR /&gt;
8	1	1	99285	2&lt;BR /&gt;
etc.&lt;BR /&gt;
&lt;BR /&gt;
I am supposed to aggregate this into an episode-level file, allowing a maximum of three days between the from and thru dates, while retaining all the other information in the claims, but only if the dates in the claims overlap.  So, for example, all of the claims for id=1 would be collapsed into a single episode of care, because all the dates fall into the range 1-4. But the claims for id=4 would generate three episodes, because the dates between services are distinct.  And I have to keep all the information from all the claims regardless of how many distinct episodes are created: the original from and thru dates, CPT codes, and severity levels.&lt;BR /&gt;
&lt;BR /&gt;
Any ideas?&lt;BR /&gt;
&lt;BR /&gt;
Thanks a lot!!!&lt;BR /&gt;
bethcook</description>
      <pubDate>Mon, 08 Sep 2008 22:57:38 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44376#M9084</guid>
      <dc:creator>deleted_user</dc:creator>
      <dc:date>2008-09-08T22:57:38Z</dc:date>
    </item>
    <item>
      <title>Re: collapsing data into episode level file</title>
      <link>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44377#M9085</link>
      <description>Hi,&lt;BR /&gt;
If your data isn't huge, consider reading it twice: a first pass to identify the episodes using RETAIN on the from and thru dates, and a second pass to output your aggregate.&lt;BR /&gt;
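A minimal sketch of that two-pass idea, with assumed names (CLAIMS, ID, FROMDATE, THRUDATE - none of these come from the thread) and a simple "new episode when the next claim starts after the retained episode end" rule; the exact gap rule should follow your team's episode definition:&lt;BR /&gt;

```sas
/* Pass 1: assign an episode number per ID by comparing each
   claim's start with the retained end of the current episode.
   Dataset and variable names are assumptions.               */
proc sort data=claims;
  by id fromdate thrudate;
run;

data claims_ep;
  set claims;
  by id;
  retain ep_end;
  if first.id or fromdate gt ep_end then do;
    episode + 1;                       /* start a new episode */
    ep_end = thrudate;
  end;
  else ep_end = max(ep_end, thrudate); /* extend the episode  */
  drop ep_end;
run;

/* Pass 2: output the aggregate, one row per episode. */
proc summary data=claims_ep nway;
  class id episode;
  var fromdate thrudate;
  output out=episodes min(fromdate)=ep_from max(thrudate)=ep_thru;
run;
```

The detail rows in CLAIMS_EP keep their EPISODE key, so nothing from the original claims is lost.&lt;BR /&gt;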
&lt;BR /&gt;
/Linus</description>
      <pubDate>Tue, 09 Sep 2008 06:28:06 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44377#M9085</guid>
      <dc:creator>LinusH</dc:creator>
      <dc:date>2008-09-09T06:28:06Z</dc:date>
    </item>
    <item>
      <title>Re: collapsing data into episode level file</title>
      <link>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44378#M9086</link>
      <description>Hi bethcook&lt;BR /&gt;
&lt;BR /&gt;
That's a nice one!&lt;BR /&gt;
&lt;BR /&gt;
I see 2 issues here: &lt;BR /&gt;
1. How to identify episodes.&lt;BR /&gt;
2. How to organise the data.&lt;BR /&gt;
&lt;BR /&gt;
Question to 1:&lt;BR /&gt;
id fromdate todate&lt;BR /&gt;
1  1        4 &lt;BR /&gt;
1  3        6&lt;BR /&gt;
1  5        8&lt;BR /&gt;
&lt;BR /&gt;
How many episodes would that be and which record would belong to which episode?&lt;BR /&gt;
&lt;BR /&gt;
&lt;BR /&gt;
Thoughts about 2:&lt;BR /&gt;
Your dataset can either be aggregated or at the atomic level. To have both you can:&lt;BR /&gt;
A: keep 2 datasets; generate a primary key for the aggregated dataset and add this key to the atomic dataset (the one you already have) as a foreign key.&lt;BR /&gt;
B: create only one aggregated dataset and store all the atomic information de-normalised in "arrays".&lt;BR /&gt;
C: keep the atomic dataset (same number of rows) but add extra variables with aggregated values, which are repeated (also "de-normalised").&lt;BR /&gt;
&lt;BR /&gt;
I would go for variant A or C.&lt;BR /&gt;
 &lt;BR /&gt;
Variant B will give you a lot of headaches (and require some macro coding): &lt;BR /&gt;
the number of variables needed to store the atomic information (= the size of the arrays) would vary with the maximum number of rows that make up one episode.&lt;BR /&gt;
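Variant A could be sketched like this; every dataset and variable name here (CLAIMS_EP, ID, EPISODE, FROMDATE, THRUDATE) is an assumption for illustration, not something from the thread:&lt;BR /&gt;

```sas
/* Variant A sketch: build a surrogate key EP_KEY, keep it on
   the atomic rows as a foreign key, and create the aggregated
   dataset keyed by it.  All names are assumptions.           */
data atomic;
  set claims_ep;                     /* detail rows that      */
  length ep_key $ 20;                /* carry an episode no.  */
  ep_key = catx('-', id, episode);   /* primary/foreign key   */
run;

proc summary data=atomic nway;
  class ep_key id;
  var fromdate thrudate;
  output out=aggregated (drop=_type_ _freq_)
         min(fromdate)=ep_from max(thrudate)=ep_thru;
run;
```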
&lt;BR /&gt;
&lt;BR /&gt;
Let me know what you're heading for and answer question 1 please.&lt;BR /&gt;
&lt;BR /&gt;
Regards&lt;BR /&gt;
Patrick</description>
      <pubDate>Tue, 09 Sep 2008 11:20:07 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44378#M9086</guid>
      <dc:creator>Patrick</dc:creator>
      <dc:date>2008-09-09T11:20:07Z</dc:date>
    </item>
    <item>
      <title>Re: collapsing data into episode level file</title>
      <link>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44379#M9087</link>
      <description>in response to question 1, I think I would probably treat that as 3 distinct episodes even though there is overlap, but I would want to verify that with my team first.&lt;BR /&gt;
&lt;BR /&gt;
I appreciate your help!  What exactly do you mean by "atomic"? I am not very good at SAS yet...&lt;BR /&gt;
Thanks again&lt;BR /&gt;
bethcook</description>
      <pubDate>Tue, 09 Sep 2008 12:04:58 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44379#M9087</guid>
      <dc:creator>deleted_user</dc:creator>
      <dc:date>2008-09-09T12:04:58Z</dc:date>
    </item>
    <item>
      <title>Re: collapsing data into episode level file</title>
      <link>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44380#M9088</link>
      <description>To 1:&lt;BR /&gt;
Just make sure to create test data that covers all possible cases - and then define the rules for building episodes before you even start coding.&lt;BR /&gt;
&lt;BR /&gt;
To 2:&lt;BR /&gt;
Atomic is a term used in data organisation (not SAS specific). &lt;BR /&gt;
Out of a theory book: "Granularity refers to the level of detail represented by the values stored in a table's row. Data stored at their lowest level of granularity are said to be atomic data."&lt;BR /&gt;
&lt;BR /&gt;
Let us know the rules when you've got them, and also provide the test data.&lt;BR /&gt;
&lt;BR /&gt;
Patrick</description>
      <pubDate>Thu, 11 Sep 2008 11:05:16 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44380#M9088</guid>
      <dc:creator>Patrick</dc:creator>
      <dc:date>2008-09-11T11:05:16Z</dc:date>
    </item>
    <item>
      <title>Re: collapsing data into episode level file</title>
      <link>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44381#M9089</link>
      <description>Patrick:&lt;BR /&gt;
Thanks for your suggestions.  They helped - I used one of them as a partial guide: &lt;BR /&gt;
1 - Defined claims as "singles", "possible parents", and "children" - the first two at the episode level, the last at the claim level. I had to use "possible" parents because I could not use information from the previous observation to determine whether they had a "child".&lt;BR /&gt;
2 - Reorganized the "children" into an episode-level file.&lt;BR /&gt;
3 - Merged the "possible parents" back onto the "children" - if they found a match they were confirmed as "parents", otherwise they became "singles".&lt;BR /&gt;
4 - Put the singles, parents &amp;amp; children together.&lt;BR /&gt;
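Step 3 could be sketched as a MERGE with IN= flags; the dataset names POSS_PARENTS and CHILDREN and the key variables ID and EPISODE are assumed here, not the poster's actual names:&lt;BR /&gt;

```sas
/* Hypothetical sketch of step 3 only: confirming "possible
   parents" as "parents" or "singles" by matching against the
   episode-level "children" file.  All names are assumptions. */
proc sort data=poss_parents; by id episode; run;
proc sort data=children;     by id episode; run;

data parents singles;
  merge poss_parents (in=in_p)
        children     (in=in_c keep=id episode);
  by id episode;
  if in_p;                      /* keep possible parents only */
  if in_c then output parents;  /* matched a child -> parent  */
  else         output singles;  /* no child -> single         */
run;
```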
&lt;BR /&gt;
Thanks again!</description>
      <pubDate>Thu, 11 Sep 2008 13:44:13 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/collapsing-data-into-episode-level-file/m-p/44381#M9089</guid>
      <dc:creator>deleted_user</dc:creator>
      <dc:date>2008-09-11T13:44:13Z</dc:date>
    </item>
  </channel>
</rss>

