<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>SAS Data Management topics</title>
    <link>https://communities.sas.com/t5/SAS-Data-Management/bd-p/data_management</link>
    <description>SAS Data Management topics</description>
    <pubDate>Tue, 11 Aug 2020 21:29:56 GMT</pubDate>
    <dc:creator>data_management</dc:creator>
    <dc:date>2020-08-11T21:29:56Z</dc:date>
    <item>
      <title>Merging rows based on date and location</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Merging-rows-based-on-date-and-location/m-p/675973#M19479</link>
      <description>&lt;P&gt;Trying to merge rows in which an individual was at the same location but has multiple date ranges resulting in multiple rows. Want to have one row per location with the first start date and last end date. Having a blank end date means they are still at that location and that would need to be kept.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I don't even know where to start with this merge. Open to DATA and PROC SQL steps. This data has approximately 500 rows currently for approximately 60 individuals. The reason the data populates like this is because there are separate bed assignments for each row, that level of data is not needed for this output, just location and the dates.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;What my data currently looks like:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;FULLNAME&lt;/TD&gt;&lt;TD&gt;START&lt;/TD&gt;&lt;TD&gt;END&lt;/TD&gt;&lt;TD&gt;LOCATION&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;14Sep2010&lt;/TD&gt;&lt;TD&gt;15Sep2010&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;15Sep2010&lt;/TD&gt;&lt;TD&gt;28Sep2010&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;28Sep2010&lt;/TD&gt;&lt;TD&gt;27Oct2010&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;28Oct2010&lt;/TD&gt;&lt;TD&gt;20Apr2011&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;20Apr2011&lt;/TD&gt;&lt;TD&gt;02May2011&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;02May2011&lt;/TD&gt;&lt;TD&gt;17May2011&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;17May2011&lt;/TD&gt;&lt;TD&gt;18Jul2011&lt;/TD&gt;&lt;TD&gt;PINE-A&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;18Jul2011&lt;/TD&gt;&lt;TD&gt;03Oct2011&lt;/TD&gt;&lt;TD&gt;PINE-A&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;03Oct2011&lt;/TD&gt;&lt;TD&gt;13Dec2011&lt;/TD&gt;&lt;TD&gt;PINE-A&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;13Dec2011&lt;/TD&gt;&lt;TD&gt;31Jan2012&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;31Jan2012&lt;/TD&gt;&lt;TD&gt;08Feb2012&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;08Feb2012&lt;/TD&gt;&lt;TD&gt;08Aug2012&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;08Aug2012&lt;/TD&gt;&lt;TD&gt;17Dec2012&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;21Aug2015&lt;/TD&gt;&lt;TD&gt;26Aug2015&lt;/TD&gt;&lt;TD&gt;PINE-B&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;26Aug2015&lt;/TD&gt;&lt;TD&gt;22Sep2015&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;22Sep2015&lt;/TD&gt;&lt;TD&gt;14Oct2015&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;14Oct2015&lt;/TD&gt;&lt;TD&gt;22Nov2015&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;22Nov2015&lt;/TD&gt;&lt;TD&gt;04Dec2015&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;04Dec2015&lt;/TD&gt;&lt;TD&gt;10Dec2015&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;10Dec2015&lt;/TD&gt;&lt;TD&gt;10Mar2016&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;10Mar2016&lt;/TD&gt;&lt;TD&gt;29Dec2017&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;29Dec2017&lt;/TD&gt;&lt;TD&gt;10May2018&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;10May2018&lt;/TD&gt;&lt;TD&gt;01Nov2018&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;01Nov2018&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;16Apr2017&lt;/TD&gt;&lt;TD&gt;17Apr2017&lt;/TD&gt;&lt;TD&gt;MAPLE-A&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;17Apr2017&lt;/TD&gt;&lt;TD&gt;08May2017&lt;/TD&gt;&lt;TD&gt;MAPLE-A&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;08May2017&lt;/TD&gt;&lt;TD&gt;22May2017&lt;/TD&gt;&lt;TD&gt;MAPLE-B&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;22May2017&lt;/TD&gt;&lt;TD&gt;31May2017&lt;/TD&gt;&lt;TD&gt;WILLOW&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;31May2017&lt;/TD&gt;&lt;TD&gt;08Jun2017&lt;/TD&gt;&lt;TD&gt;WILLOW&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;08Jun2017&lt;/TD&gt;&lt;TD&gt;02Aug2017&lt;/TD&gt;&lt;TD&gt;WILLOW&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;02Aug2017&lt;/TD&gt;&lt;TD&gt;07Aug2017&lt;/TD&gt;&lt;TD&gt;MAPLE-C&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;07Aug2017&lt;/TD&gt;&lt;TD&gt;11Sep2017&lt;/TD&gt;&lt;TD&gt;WILLOW&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;11Sep2017&lt;/TD&gt;&lt;TD&gt;03Apr2018&lt;/TD&gt;&lt;TD&gt;WILLOW&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;What I want my data to look like:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;FULLNAME&lt;/TD&gt;&lt;TD&gt;START&lt;/TD&gt;&lt;TD&gt;END&lt;/TD&gt;&lt;TD&gt;LOCATION&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;14Sep2010&lt;/TD&gt;&lt;TD&gt;27Oct2010&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;28Oct2010&lt;/TD&gt;&lt;TD&gt;17May2011&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;17May2011&lt;/TD&gt;&lt;TD&gt;13Dec2011&lt;/TD&gt;&lt;TD&gt;PINE-A&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;SMITH, JOHN&lt;/TD&gt;&lt;TD&gt;13Dec2011&lt;/TD&gt;&lt;TD&gt;17Dec2012&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;21Aug2015&lt;/TD&gt;&lt;TD&gt;26Aug2015&lt;/TD&gt;&lt;TD&gt;PINE-B&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;26Aug2015&lt;/TD&gt;&lt;TD&gt;10Dec2015&lt;/TD&gt;&lt;TD&gt;CEDAR&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;MILLER, SALLY&lt;/TD&gt;&lt;TD&gt;10Dec2015&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;OAK&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;16Apr2017&lt;/TD&gt;&lt;TD&gt;08May2017&lt;/TD&gt;&lt;TD&gt;MAPLE-A&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;08May2017&lt;/TD&gt;&lt;TD&gt;22May2017&lt;/TD&gt;&lt;TD&gt;MAPLE-B&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;22May2017&lt;/TD&gt;&lt;TD&gt;02Aug2017&lt;/TD&gt;&lt;TD&gt;WILLOW&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;JOHNSON, ROBERT&lt;/TD&gt;&lt;TD&gt;02Aug2017&lt;/TD&gt;&lt;TD&gt;07Aug2017&lt;/TD&gt;&lt;TD&gt;MAPLE-C&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;</description>
      <pubDate>Tue, 11 Aug 2020 18:33:07 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Merging-rows-based-on-date-and-location/m-p/675973#M19479</guid>
      <dc:creator>cbagdon-cox</dc:creator>
      <dc:date>2020-08-11T18:33:07Z</dc:date>
    </item>
    <item>
      <title>ERROR ##-###: The INDEX function call has too many arguments.</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/ERROR-The-INDEX-function-call-has-too-many-arguments/m-p/675045#M19469</link>
      <description>&lt;P&gt;I am working with data trying to identify opioid useage.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;data want;&lt;/P&gt;&lt;P&gt;set have;&lt;/P&gt;&lt;P&gt;IF INDEX(UPCASE(var1), "METHADONE")&amp;gt;0 or INDEX(UPCASE(var2), "METHADONE")&amp;gt;0 or&lt;BR /&gt;INDEX(UPCASE(var3), "METHADONE")&amp;gt;0 or INDEX(UPCASE(var4), "METHADONE")&amp;gt;0 or&lt;BR /&gt;INDEX(UPCASE(var5&lt;SPAN style="font-family: inherit;"&gt;), "METHADONE")&amp;gt;0 or INDEX(UPCASE(var6), "METHADONE")&amp;gt;0&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;THEN OPI_3=1;&lt;/P&gt;&lt;P&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This code works for 14 of the 16 substances I am looking for.&lt;/P&gt;&lt;P&gt;But when I am looking for OPANA and CODIENE it is popping up with:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;ERROR ##-###: The INDEX function call has too many arguments.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I think this may be something about how the data is formatted but I haven't had to use INDEX before.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 06 Aug 2020 16:03:39 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/ERROR-The-INDEX-function-call-has-too-many-arguments/m-p/675045#M19469</guid>
      <dc:creator>DanielQuay</dc:creator>
      <dc:date>2020-08-06T16:03:39Z</dc:date>
    </item>
    <item>
      <title>How do I clear these errors?</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-do-I-clear-these-errors/m-p/673626#M19459</link>
      <description>&lt;P&gt;&lt;FONT size="3"&gt;Hello all&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT size="3"&gt;I have inherited SAS code that has approx. 20 steps and the code had errors throughout. I have been successful in cleaning it up until the last two steps. This is the code for the first one&lt;/FONT&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;PROC SQL;
CREATE TABLE WORK.AllClaims1 (DROP= Discharge Date Rename= (Discharge Date1= Discharge Date)) AS 
SELECT *
 ,INPUT(Discharge Date,5.) - 21916 AS Discharge Date1 FORMAT= mmddyy10.
FROM WORK.AllClaims
;QUIT;&lt;BR /&gt;&lt;BR /&gt;This is the error message I get:&lt;BR /&gt;&lt;BR /&gt;&lt;FONT color="#ff0000"&gt;22&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#ff0000"&gt;76&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#ff0000"&gt;ERROR 22-322: Syntax error, expecting one of the following: !, !!, &amp;amp;, (, *, **, +, -, '.', /, &amp;lt;, &amp;lt;=, &amp;lt;&amp;gt;, =, &amp;gt;, &amp;gt;=, ?, AND, CONTAINS, EQ, EQT, GE, GET, GT, GTT, LE, LET, LIKE, LT, LTT, NE, NET, OR, ^=, |, ||, ~=.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#ff0000"&gt;ERROR 76-322: Syntax error, statement will be ignored.&lt;/FONT&gt;&lt;BR /&gt;&lt;/CODE&gt;&lt;/PRE&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;The is the code for the other step&lt;/FONT&gt;&lt;FONT color="#000000"&gt;&amp;nbsp;&lt;/FONT&gt;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;DATA AllClaims_Updated;
SET hphc_EOL_FY20 WORK.AllClaims;
RUN;&lt;BR /&gt;&lt;BR /&gt;This&amp;nbsp;the&amp;nbsp;error&amp;nbsp;message&amp;nbsp;that&amp;nbsp;I&amp;nbsp;get:&lt;BR /&gt;&lt;BR /&gt;&lt;FONT color="#ff0000"&gt;ERROR: Variable Age has been defined as both character and numeric.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#ff0000"&gt;ERROR: Variable BMT has been defined as both character and numeric.&lt;BR /&gt;&lt;/FONT&gt;&lt;BR /&gt;These are the two columns that are effected and their types.&lt;BR /&gt;&lt;BR /&gt;work.EOL_COHORT Age = Type Numeric, Length 8&lt;BR /&gt;BHM = Type Character, Length Character 1&lt;BR /&gt;&lt;BR /&gt;AllCohorts1 Age = Type Character, Length Character 1&lt;BR /&gt;BHM = Type Numeric, Length 8&lt;BR /&gt;&lt;/CODE&gt;&lt;/PRE&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;&lt;/CODE&gt;&lt;/PRE&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;Any assistance with these would be greatly appreciated.&lt;/P&gt;</description>
      <pubDate>Fri, 31 Jul 2020 06:43:35 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-do-I-clear-these-errors/m-p/673626#M19459</guid>
      <dc:creator>wheddingsjr</dc:creator>
      <dc:date>2020-07-31T06:43:35Z</dc:date>
    </item>
    <item>
      <title>sum one column and group by two others</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/sum-one-column-and-group-by-two-others/m-p/673584#M19457</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have a big dataset with 3 values that are of interest. &lt;STRONG&gt;ID&lt;/STRONG&gt;, &lt;STRONG&gt;Visite&lt;/STRONG&gt; and &lt;STRONG&gt;WSL_Flaechen&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;Visite ranges from 1 to 5 and WSL_Flaechen from 1 to 2.&lt;/P&gt;&lt;P&gt;With the following code i managed to group by Visite and count the WSL_Flaechen for each ID.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="asdf2.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/47814i4E3C9F427DC8C822/image-size/medium?v=1.0&amp;amp;px=400" title="asdf2.png" alt="asdf2.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="asdf.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/47815i9E22990280244C25/image-size/medium?v=1.0&amp;amp;px=400" title="asdf.png" alt="asdf.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;But instead i want count of WSL_Flaechen (blue arrow) to be multiplied with the value of WSL_Flaechen (red arrow) and display the sum of that for each ID.&lt;/P&gt;&lt;P&gt;So ID 4001 has&lt;/P&gt;&lt;P&gt;(1*8 + 2*12) = 32 on Visite 1 and&lt;/P&gt;&lt;P&gt;2 + 20 = 22 on Visite 2&lt;/P&gt;&lt;P&gt;and 26 on Visite 3&lt;/P&gt;&lt;P&gt;and so on...&lt;/P&gt;&lt;P&gt;The dataset only has 1s and 2s and the WSL_Flaechen field and i cant seem to just sum them up. I only display the count of 1s and 2s in seperate fields. How to i properly use the sum function here?&lt;/P&gt;&lt;P&gt;When i try to get sum in there i get the "statistical variable other than N without analysis variable" error.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Sincerely,&lt;/P&gt;&lt;P&gt;Keets&lt;/P&gt;</description>
      <pubDate>Thu, 30 Jul 2020 22:39:53 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/sum-one-column-and-group-by-two-others/m-p/673584#M19457</guid>
      <dc:creator>Keets</dc:creator>
      <dc:date>2020-07-30T22:39:53Z</dc:date>
    </item>
    <item>
      <title>/tmp Storage used for joining and sorting</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/tmp-Storage-used-for-joining-and-sorting/m-p/673064#M19449</link>
      <description>&lt;P&gt;Hi Team,&lt;/P&gt;&lt;P&gt;Few of my jobs are getting failed due to insufficient storage, my work file location having sufficient space. /tmp/ getting utilised by job for sorting and joining.&amp;nbsp;&lt;/P&gt;&lt;P&gt;my question is can we change the location of this sorting and joining location permentalty.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;P&gt;-shrikant Suvarnkar&lt;/P&gt;</description>
      <pubDate>Wed, 29 Jul 2020 09:12:06 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/tmp-Storage-used-for-joining-and-sorting/m-p/673064#M19449</guid>
      <dc:creator>shrikantpwc</dc:creator>
      <dc:date>2020-07-29T09:12:06Z</dc:date>
    </item>
    <item>
      <title>Computing the change in values across time and state</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Computing-the-change-in-values-across-time-and-state/m-p/672375#M19441</link>
      <description>&lt;P&gt;Updated (7-27-20)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hello all,&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I could use your help figuring out some code. I have different 'buckets' that take on different values depending on the time, bucketcode, and state, and I am trying to compute the change in values (count) across time and state. Please see an illustration of my data below.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;A couple new notes:&lt;/P&gt;&lt;P&gt;(1) I need it to accommodate multiple bucketcodes per state per time period.&lt;/P&gt;&lt;P&gt;(2)&amp;nbsp;I need to account for the appearance / disappearance of bucketcodes within a state. For example, a new bucketcode might enter OH at time period 103 that wasnt previously there in the prior time period (102). &lt;SPAN&gt;If a new bucketcode appears in&amp;nbsp;a time period (eg., 102) (that isn't the first) then that should be reflected in the&amp;nbsp;changescore differential for that bucketcode/state/timeperiod triplet with whatever&amp;nbsp;the addition in count was, eg., 1. If a bucket that was there in a time period disappears in a subsequent period (was there in 102, but not in a state-time-bucketcode triplet in&amp;nbsp;the next period, eg., 103) then the state-time-bucketcode triplet should reflect whatever&amp;nbsp;the drop in value is/was for that subsequent period. For example, VA-104-333 has a count of&amp;nbsp;2, but if VA-105-333 is not there then VA-105-333 would be -2.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;(3) Dropping the first time period is fine, but after that if a bucketcode enters a state and time period, that wasn't previously there, it should be treated as entering the dataset.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;data have;
input time Bucketcode State $ Count;
datalines; 
101 222 OH 3
102 222 OH 4
103 222 OH 4
103 333 OH 1
104 333 OH 1
101 111 TX 2
102 111 TX 1
103 111 TX 1
104 111 TX 2
101 222 VA 1
102 222 VA 2
103 222 VA 2
104 222 VA 4
104 444 VA 2
;

data want;
input Time Bucketcode State $ Countdiff;
datalines;
102 222 OH 1
103 222 OH 0
103 333 OH 1
104 222 OH -4
104 333 OH 0
102 111 TX -1
103 111 TX 0
104 111 TX 1
102 222 VA 1
103 222 VA 0
104 222 VA 2
104 444 VA 2
;


&lt;/PRE&gt;&lt;P&gt;Thanks!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 30 Jul 2020 04:45:52 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Computing-the-change-in-values-across-time-and-state/m-p/672375#M19441</guid>
      <dc:creator>r4321</dc:creator>
      <dc:date>2020-07-30T04:45:52Z</dc:date>
    </item>
    <item>
      <title>Selects which branch to use - SAS DATA INTEGRATION</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Selects-which-branch-to-use-SAS-DATA-INTEGRATION/m-p/671396#M19440</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;I have branch structures at GIT.&lt;/P&gt;&lt;P&gt;Is there a way to control how SAS DI selects which branch to use? At the moment, all checkins end at our main branch (branch master), instead of the individual resource branch the developer is working on?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;best regards&lt;/P&gt;&lt;P&gt;Kassia&lt;/P&gt;</description>
      <pubDate>Wed, 22 Jul 2020 14:25:41 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Selects-which-branch-to-use-SAS-DATA-INTEGRATION/m-p/671396#M19440</guid>
      <dc:creator>kassialua1</dc:creator>
      <dc:date>2020-07-22T14:25:41Z</dc:date>
    </item>
    <item>
      <title>How do I import a sas7bdat file to SAS studio and then use it for data manipulation?</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-do-I-import-a-sas7bdat-file-to-SAS-studio-and-then-use-it/m-p/670810#M19435</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have 4 sas7bdat files , wanted to concatenate them for data manipulation . How can i import the same to sas university edition and perform concatenation ?&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jul 2020 20:48:12 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-do-I-import-a-sas7bdat-file-to-SAS-studio-and-then-use-it/m-p/670810#M19435</guid>
      <dc:creator>jjoy4891</dc:creator>
      <dc:date>2020-07-20T20:48:12Z</dc:date>
    </item>
    <item>
      <title>Transpose long to wide with Do Loop</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Transpose-long-to-wide-with-Do-Loop/m-p/669568#M19425</link>
      <description>&lt;P&gt;Hello!&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; I've spent a couple of days researching forums and reading papers and I'm stumped.&amp;nbsp; I don't know how to use arrays, macros, or do loops - and i think that's what's needed for this task.&amp;nbsp; This is more complicated than I can do.&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; This is fish habitat data.&amp;nbsp; The ORIGIN is the stream where the fish lives and the DESTINATION is another suitable stream. DISTANCE is the stream length between them, and the LEN and VOL are additional attributes.&amp;nbsp; The data must be sorted by ORIGIN, DISTANCE because I want the transformed columns to be in the order of the closest destination habitats.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; My dataset as &amp;gt;6000 obs, and some ORIGINs have 25 DESTINATIONS, so there will be about 100 columns in the resulting table.&amp;nbsp; It's what the boss wants - and I'm bummed that I can't figure it out myself.&lt;/P&gt;&lt;P&gt;&lt;FONT size="4"&gt;&lt;STRONG&gt;Here's the data:&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;data WORK.CLASS(label='Habitat Data');&lt;BR /&gt;infile datalines;&lt;BR /&gt;input Origin:8. Destination:8. Distance:8. Len:8. Vol:8.;&lt;BR /&gt;datalines;&lt;BR /&gt;1001 999 11891 26.44 193.00&lt;BR /&gt;1005 1016 2422 7.03 8.42&lt;BR /&gt;1005 944 19437 16.84 74.17&lt;BR /&gt;1010 1012 42986 22.10 262.95&lt;BR /&gt;1010 988 48093 9.10 26.36&lt;BR /&gt;1012 1010 42986 10.66 472.81&lt;BR /&gt;1012 988 44204 9.10 26.36&lt;BR /&gt;1012 999 47478 26.44 193.00&lt;BR /&gt;1016 1005 2422 30.73 62.22&lt;BR /&gt;;;;;&lt;/P&gt;&lt;P&gt;&lt;FONT size="4"&gt;&lt;STRONG&gt;Here's the result I want:&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="2"&gt;As I move the data into the columns, I have abbreviated their variable names.&amp;nbsp; I highlighted the DISTANCEs so you can see how they dictate the ordering of the columns.&lt;/FONT&gt;&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Origin&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Dest1&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Dist1&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Len1&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Vol1&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Dest2&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Dist2&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Len2&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Vol2&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Dest3&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Dist3&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Len3&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;Vol3&lt;/FONT&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1001&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;999&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;11891&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;26.44&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;193.00&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1005&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1016&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;2422&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;7.03&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;8.42&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;944&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;19437&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;16.84&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;74.17&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1010&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1012&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;42986&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;22.10&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;262.95&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;988&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;48093&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;9.10&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;26.36&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1012&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1010&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;42986&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;10.66&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;472.81&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;988&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;44204&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;9.10&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;26.36&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;999&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;47478&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;26.44&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;193.00&lt;/FONT&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1016&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;1005&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;STRONG&gt;&lt;FONT size="2"&gt;2422&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;30.73&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&lt;FONT size="2"&gt;62.22&lt;/FONT&gt;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks a lot!&amp;nbsp; This has been fun, but a dead end.&amp;nbsp; Hope someone out there has fun with it!&amp;nbsp; I'm using SAS 9.4.&lt;/P&gt;</description>
      <pubDate>Wed, 15 Jul 2020 16:23:15 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Transpose-long-to-wide-with-Do-Loop/m-p/669568#M19425</guid>
      <dc:creator>camper</dc:creator>
      <dc:date>2020-07-15T16:23:15Z</dc:date>
    </item>
    <item>
      <title>Compute the number of observations and sum in a moving window</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Compute-the-number-of-observations-and-sum-in-a-moving-window/m-p/668471#M19417</link>
      <description />
      <pubDate>Fri, 10 Jul 2020 20:19:20 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Compute-the-number-of-observations-and-sum-in-a-moving-window/m-p/668471#M19417</guid>
      <dc:creator>Thomas_mp</dc:creator>
      <dc:date>2020-07-10T20:19:20Z</dc:date>
    </item>
    <item>
      <title>Export/Print bar graph along with table in HTML format using SAS Data integration</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Export-Print-bar-graph-along-with-table-in-HTML-format-using-SAS/m-p/667655#M19411</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am trying to print both table and bar graph in HTML but able to only get table in output using SAS DI.&lt;/P&gt;&lt;P&gt;Kindly assist me what modification is needed in my below code:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;ods listing close;&lt;BR /&gt;ods html path='/ACTDATA/fireuw/' (url=none)&lt;BR /&gt;body='sgplot.html';&lt;BR /&gt;goptions reset=all;&lt;/P&gt;&lt;P&gt;PROC SQL;&lt;BR /&gt;create table CARS1 as&lt;BR /&gt;SELECT make, model, type, invoice, horsepower, length, weight&lt;BR /&gt;FROM&lt;BR /&gt;SASHELP.CARS&lt;BR /&gt;WHERE make in ('Audi','BMW')&lt;BR /&gt;;&lt;BR /&gt;RUN;&lt;BR /&gt;proc print;&lt;BR /&gt;quit;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;proc SGPLOT data = cars1;&lt;BR /&gt;vbar length /group = type GROUPDISPLAY = CLUSTER;&lt;BR /&gt;title 'Cluster of Cars by Types';&lt;BR /&gt;run;&lt;BR /&gt;quit;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;ods html close;&lt;BR /&gt;ods listing;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Akshay&amp;nbsp;&lt;SPAN style="font-family: inherit;"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jul 2020 09:00:37 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Export-Print-bar-graph-along-with-table-in-HTML-format-using-SAS/m-p/667655#M19411</guid>
      <dc:creator>AkshayS</dc:creator>
      <dc:date>2020-07-08T09:00:37Z</dc:date>
    </item>
    <item>
      <title>Column Header Validation</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Column-Header-Validation/m-p/667440#M19409</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm looking for some help on creating some form of custom transformation to check the column headers a .csv file has when it get read into DI Studio.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have a list of headers which should be in the table and when the file gets read in if it doesn't contain the correct headers, i want it will get moved to a check file so that the job doesn't fail. Any suggestions on the best way to set this up?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Tue, 07 Jul 2020 13:46:26 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Column-Header-Validation/m-p/667440#M19409</guid>
      <dc:creator>Flappyduck</dc:creator>
      <dc:date>2020-07-07T13:46:26Z</dc:date>
    </item>
    <item>
      <title>Rearranging data in SAS 9.4</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Rearranging-data-in-SAS-9-4/m-p/667035#M19403</link>
      <description>&lt;P&gt;Good morning,&lt;/P&gt;&lt;P&gt;I need some help rearranging data in a very large data set using SAS 9.4.&lt;/P&gt;&lt;P&gt;I have the data as follows (please see also attached Excel file) :&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;data&lt;/STRONG&gt; k;&lt;/P&gt;&lt;P&gt;input&amp;nbsp; date&amp;nbsp; va&amp;nbsp; vb&amp;nbsp; vc&amp;nbsp; vd&amp;nbsp; ;&lt;/P&gt;&lt;P&gt;cards&amp;nbsp; ;&lt;/P&gt;&lt;P&gt;20000630&amp;nbsp;&amp;nbsp;&amp;nbsp; 6.1207&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 8.7044&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 5.3283&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 5.846&lt;/P&gt;&lt;P&gt;20000929&amp;nbsp;&amp;nbsp;&amp;nbsp; 6.1158&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 8.6934&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 5.3253&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 5.8423&lt;/P&gt;&lt;P&gt;20001229&amp;nbsp;&amp;nbsp;&amp;nbsp; 5.7898&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 7.9828&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 5.1254&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 5.5832&lt;/P&gt;&lt;P&gt;20010330&amp;nbsp;&amp;nbsp;&amp;nbsp; 4.9815&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 7.5266&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 4.4934&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 4.5082&lt;/P&gt;&lt;P&gt;;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;run&lt;/STRONG&gt;;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would like to have the data&amp;nbsp; as:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;va&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20000630&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;6.1207&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;va&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20000929&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;6.1158&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;va&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20001229&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5.7898&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;va&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20010330&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4.9815&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vb&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20000630&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;8.7044&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vb&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20000929&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;8.6934&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vb&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20001229&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;7.9828&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vb&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20010330&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;7.5266&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vc&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20000630&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5.3283&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vc&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20000929&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5.3253&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vc&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20001229&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5.1254&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vc&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20010330&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4.4934&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vd&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20000630&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5.846&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vd&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20000929&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5.8423&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vd&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20001229&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5.5832&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;vd&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20010330&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4.5082&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you for your help,&lt;/P&gt;&lt;P&gt;Tomas&lt;/P&gt;</description>
      <pubDate>Sun, 05 Jul 2020 18:47:01 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Rearranging-data-in-SAS-9-4/m-p/667035#M19403</guid>
      <dc:creator>Thomas_mp</dc:creator>
      <dc:date>2020-07-05T18:47:01Z</dc:date>
    </item>
    <item>
      <title>Quick datepart/date format query.</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Quick-datepart-date-format-query/m-p/666823#M19395</link>
      <description>&lt;P&gt;Hi, This might be fairly obvious but formats have always been my downfall and I'm up against the clock or I'd try experiment. I'm doing this in an expression on an Extract Transformation in DI Studio.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;CASE WHEN DOB  ^= .
THEN input(cats(year(datepart(DOB)),month(datepart(DOB))),6.)
ELSE . END&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;Gives me 20055, I'd like 200505 (dates like 200512 are already fine). The output is a straight numeric rather than a date so I didn't think it would be an issue, from a quick google I found using z as the format so tried z6. but that didn't work.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Any ideas? Thanks&lt;/P&gt;</description>
      <pubDate>Fri, 03 Jul 2020 14:02:35 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Quick-datepart-date-format-query/m-p/666823#M19395</guid>
      <dc:creator>MRDM</dc:creator>
      <dc:date>2020-07-03T14:02:35Z</dc:date>
    </item>
    <item>
      <title>Move dataset from one library to other</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Move-dataset-from-one-library-to-other/m-p/665892#M19380</link>
      <description>Is there a way that we can move the dataset from one library to other library? &lt;BR /&gt;In one of my DI job, I want to change the library of the source dataset. Any help? &lt;BR /&gt;&lt;BR /&gt;</description>
      <pubDate>Mon, 29 Jun 2020 18:57:00 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Move-dataset-from-one-library-to-other/m-p/665892#M19380</guid>
      <dc:creator>David_Billa</dc:creator>
      <dc:date>2020-06-29T18:57:00Z</dc:date>
    </item>
    <item>
      <title>Convert an excel data table into SAS format</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Convert-an-excel-data-table-into-SAS-format/m-p/665492#M19367</link>
      <description>&lt;P&gt;Hello there,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have an excel data set in the following format (sample data attached) for 5000 firms with 324 months of data.&lt;span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Capture.JPG" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/46722i2E6D8C71AD4A4DBB/image-size/medium?v=1.0&amp;amp;px=400" title="Capture.JPG" alt="Capture.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I need the following structure of rows and columns:&amp;nbsp;&lt;/P&gt;
&lt;P&gt;ID, Date, Y, X.&lt;/P&gt;
&lt;P&gt;it is assumed that the data is sorted or can be sorted by ID and DATE, such that within each ID block, the observations are in DATE order.&amp;nbsp;&lt;span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Capture1.JPG" style="width: 364px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/46723i01D0450F87740F4A/image-size/large?v=1.0&amp;amp;px=999" title="Capture1.JPG" alt="Capture1.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have created a dataset in excel manually (attached as DSET1) to match with this structure and able to import and run the macro on the sample but I am unable to find anything that can convert this big data structure as required in SAS.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Do I need to look in Excel or I can do it in SAS as well? I would appreciate it if I can get some guidance on this.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Regards,&lt;/P&gt;
&lt;P&gt;Sara&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sat, 27 Jun 2020 02:54:56 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Convert-an-excel-data-table-into-SAS-format/m-p/665492#M19367</guid>
      <dc:creator>saraphdnz</dc:creator>
      <dc:date>2020-06-27T02:54:56Z</dc:date>
    </item>
    <item>
      <title>To identify the continuity and loyalty of customer over the years and find maximum continuity year</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/To-identify-the-continuity-and-loyalty-of-customer-over-the/m-p/665398#M19365</link>
      <description>&lt;P&gt;&lt;SPAN&gt;Hello Everyone,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I am looking to create a sequence variable based on two columns : Cust_name and Year.&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;There are different type of customers as below:&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer1 was with me from 2001 to 2003&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer2 was with me only for 2001 and 2002&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer3 was only for one year 2001&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer4 was with me alternate year 2003,2005,2007&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer5 2001,2002,2005,2006,2007&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I want to find maximum number of years I was able to provide service to my customers.&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;So, Output I am looking is:&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer1 output as 3 (as he was 2001,2002,2003)&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer2 as 2 (as he was 2001,2002)&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer3 as 1 (as he was only for yr 2001)&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer4 as 1 (alternate yr so no continuous year)&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Customer5 as 3 ( he was there continuously 2(2001,2002) and 3(2005,2006,2007) so I will choose max as 3).&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;PFB dataset for your reference:&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;data abc;&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;input customer_id Customer_name $ 3 - 11 transaction_date :mmddyy10. ;&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;format transaction_date mmddyy10.;&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;datalines;&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;1 Customer1 03/31/2001&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;1 Customer1 03/31/2002&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;1 Customer1 03/31/2003&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;2 Customer2 03/31/2001&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;2 Customer2 03/31/2002&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;3 Customer3 03/31/2001&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;4 Customer4 03/31/2003&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;4 Customer4 03/31/2005&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;4 Customer4 03/31/2007&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;5 Customer5 03/31/2001&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;5 Customer5 03/31/2002&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;5 Customer5 03/31/2005&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;5 Customer5 03/31/2006&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;5 Customer5 03/31/2007&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;;&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;run;&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;proc print data=abc;&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;run;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;Cust_Name&lt;/TD&gt;&lt;TD&gt;Year&lt;/TD&gt;&lt;TD&gt;Output1_Sequence&lt;/TD&gt;&lt;TD&gt;Final_output(max(Output1 by Cust_name))&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2001&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2002&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2003&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2001&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2002&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;2001&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;2003&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;2005&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;2007&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;2001&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;2002&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;2005&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;2006&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;2007&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Looking forward for the insights.&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks in advance!&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 26 Jun 2020 16:08:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/To-identify-the-continuity-and-loyalty-of-customer-over-the/m-p/665398#M19365</guid>
      <dc:creator>AkshayS</dc:creator>
      <dc:date>2020-06-26T16:08:05Z</dc:date>
    </item>
    <item>
      <title>Error in "where clause" for date in PROC REG</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Error-in-quot-where-clause-quot-for-date-in-PROC-REG/m-p/665217#M19362</link>
      <description>&lt;P&gt;Hello there,&lt;/P&gt;
&lt;P&gt;I am facing an error issue while running the PROC REG. My dataset(attached) named DSET1 have the following structure of rows and columns: ID, DATE Y and X with the format: &lt;span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Capture1.JPG" style="width: 398px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/46679i24138CCEB35AE94E/image-size/medium?v=1.0&amp;amp;px=400" title="Capture1.JPG" alt="Capture1.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have inserted the code and the full log here. I need help to run the regression for my dataset that contains data from Jan1990 - Dec2019 for each ID.&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/** Import an XLSX file.  **/

PROC IMPORT DATAFILE="H:\DSET1.xlsx\"
		    OUT=WORK.dset1
		    DBMS=XLSX
		    REPLACE;
RUN;

/** Print the results. **/

PROC PRINT DATA=WORK.dset1; RUN;


/** SortedIn ID &amp;amp; Date order**/
proc sort data=WORK.dset1 out=temp;
by id date;
run;


/** list of variables and attributes **/

Proc Contents data=work.temp;
run;

/** Estimate coefficients for each ID **/

proc reg noprint data=temp outest=regout1 edf ;
	where Date between ‘01JAN2020’d and ‘31DEC1989’d;
 	model y = x;
 	by id;
run; &lt;/CODE&gt;&lt;/PRE&gt;
&lt;PRE&gt; 
 1          OPTIONS NONOTES NOSTIMER NOSOURCE NOSYNTAXCHECK;
 72         
 73         /** Import an XLSX file.  **/
 74         
 75         PROC IMPORT DATAFILE="H:\Paper 3 SAS - Single Share Class\SAS MAcro help docs\DSET1.xlsx\"
 76             OUT=WORK.dset1
 77             DBMS=XLSX
 78             REPLACE;
 79         RUN;
 
 NOTE: One or more variables were converted because the data type is not supported by the V9 engine. 
       For more details, run with options MSGLEVEL=I.
 NOTE: The import data set has 1800 observations and 4 variables.
 NOTE: WORK.DSET1 data set was successfully created.
 NOTE: PROCEDURE IMPORT used (Total process time):
       real time           0.11 seconds
       cpu time            0.06 seconds
       

 
 80         
 81         /** Print the results. **/
 82         
 83         PROC PRINT DATA=WORK.dset1; RUN;
 
 NOTE: There were 1800 observations read from the data set WORK.DSET1.
 NOTE: PROCEDURE PRINT used (Total process time):
       real time           2.22 seconds
       cpu time            2.18 seconds
       

 
 84         
 85         /** SortedIn ID &amp;amp; Date order**/
 86         proc sort data=WORK.dset1 out=temp;
 87         by id date;
 88         run;
 
 NOTE: There were 1800 observations read from the data set WORK.DSET1.
 NOTE: The data set WORK.TEMP has 1800 observations and 4 variables.
 NOTE: PROCEDURE SORT used (Total process time):
       real time           0.00 seconds
       cpu time            0.01 seconds
       
 
 89         
 90         /** Estimate coefficients for each ID **/
 91         
 92         proc reg noprint data=temp outest=regout1 edf ;
 93         where Date between ‘01JAN2020’d and ‘31DEC1989’d;
                                _
                                22
                                76
 ERROR: Syntax error while parsing WHERE clause.
 ERROR 22-322: Syntax error, expecting one of the following: a name, a quoted string, 
               a numeric constant, a datetime constant, a missing value, (, +, -, INPUT, NOT, PUT, ^, 
               ~.  
 ERROR 76-322: Syntax error, statement will be ignored.
 94          model y = x;
 95          by id;
 96         run;
 
 WARNING: RUN statement ignored due to previous errors. Submit QUIT; to terminate the procedure.
 NOTE: PROCEDURE REG used (Total process time):
       real time           0.00 seconds
       cpu time            0.00 seconds
       
 NOTE: The SAS System stopped processing this step because of errors.
 WARNING: The data set WORK.REGOUT1 may be incomplete.  When this step was stopped there were 0 
          observations and 0 variables.
 WARNING: Data set WORK.REGOUT1 was not replaced because this step was stopped.
 97         
 98         OPTIONS NONOTES NOSTIMER NOSOURCE NOSYNTAXCHECK;
 111        


&lt;/PRE&gt;
&lt;P&gt;Regards,&lt;/P&gt;
&lt;P&gt;Sara&lt;/P&gt;</description>
      <pubDate>Fri, 26 Jun 2020 01:23:49 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Error-in-quot-where-clause-quot-for-date-in-PROC-REG/m-p/665217#M19362</guid>
      <dc:creator>saraphdnz</dc:creator>
      <dc:date>2020-06-26T01:23:49Z</dc:date>
    </item>
    <item>
      <title>Conditionally execute precode and postcode in SAS DI studio</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Conditionally-execute-precode-and-postcode-in-SAS-DI-studio/m-p/663734#M19342</link>
      <description>&lt;P&gt;I've the ETL Jobs in DI Studio where it has precode and postcode as below. Main purpose of this code to get the Job Status and&amp;nbsp;write in the SQL table.However I don't want this precode and postcode to execute when any of the ETL Job needs an update or enhancement. I don't wish to remove the precode and postcode either.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;SAS program for the macro %Job_status is called in autoexec file.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I'm looking for an way to control the precode and postcode macro execution by macro variable. E.g.&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;%let run_macro_pre_post_code=Y;&lt;/EM&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;when I need to run precode and postcode and when&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;EM&gt;%let run_macro_pre_post_code=N;&lt;/EM&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;then precode and postcode should not run.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;/*Precode*/
%job_status(_status=STARTED,                                      
                _jobName=&amp;amp;etls_jobName. 
               );  

/*Postcode*/
%job_status(_status=FINISHED,                                     
                _jobName=&amp;amp;etls_jobName., 
                _job_return_code=&amp;amp;syscc.
   ); &lt;/PRE&gt;
&lt;P&gt;&lt;SPAN&gt;I would be thankful if someone of you&amp;nbsp;help me with the logic or the program to accomplish it.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 20 Jun 2020 17:55:17 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Conditionally-execute-precode-and-postcode-in-SAS-DI-studio/m-p/663734#M19342</guid>
      <dc:creator>David_Billa</dc:creator>
      <dc:date>2020-06-20T17:55:17Z</dc:date>
    </item>
    <item>
      <title>SAS/ACCESS interface to Teradata cannot be loaded</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SAS-ACCESS-interface-to-Teradata-cannot-be-loaded/m-p/663347#M19336</link>
      <description>&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm trying to access Teradata through sas and got this error:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;ERROR: The SAS/ACCESS Interface to Teradata cannot be loaded. The SASTRA code appendage could not be loaded.&lt;/P&gt;&lt;P&gt;ERROR: A Connection to the teradata DBMS is not currently supported, or is not installed at your site.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;performed&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;proc&lt;/STRONG&gt; &lt;STRONG&gt;product_status&lt;/STRONG&gt;&lt;SPAN&gt;;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;run&lt;/STRONG&gt;&lt;SPAN&gt;;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;output:&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;For SAS/ACCESS Interface to Teradata ...&lt;/P&gt;&lt;P&gt;&lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp;&amp;nbsp; &lt;/SPAN&gt;Custom version information: 9.4_M1&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;and&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;proc&lt;/STRONG&gt; &lt;STRONG&gt;setinit&lt;/STRONG&gt;&lt;SPAN&gt;;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;output:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;---SAS/ACCESS Interface to Teradata &lt;SPAN class="Apple-converted-space"&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &lt;/SPAN&gt;14JUL2020&lt;/P&gt;&lt;P&gt;I also check the binaries file .dll and I have both of them. Why do I still have this error then?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 18 Jun 2020 23:32:35 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SAS-ACCESS-interface-to-Teradata-cannot-be-loaded/m-p/663347#M19336</guid>
      <dc:creator>knn1888</dc:creator>
      <dc:date>2020-06-18T23:32:35Z</dc:date>
    </item>
    <item>
      <title>Connecting to Databases using Data Flux ODBC Drivers</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Connecting-to-Databases-using-Data-Flux-ODBC-Drivers/m-p/656628#M19323</link>
      <description>&lt;P&gt;Hi&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am looking for advice how to connect to databases from Data Flux management Studio/Server using Data Flux ODBC Drivers. I could connect using drivers provided by applications like SQL Server or Teradata driver. I am trying to figure out what steps need to be taken to use Data Flux ODBC drivers.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have followed below documentation link, as per step 5 seems like I need to update on server side which about to connect using the driver.&lt;BR /&gt;&lt;A href="https://support.sas.com/documentation/onlinedoc/dfdmstudio/2.6/dmpdmsug/Content/dfU_T_DataConnODBC.html" target="_blank"&gt;https://support.sas.com/documentation/onlinedoc/dfdmstudio/2.6/dmpdmsug/Content/dfU_T_DataConnODBC.html&lt;/A&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;For test, I am using SQLSERVER Express. but I had connectivity issues.&lt;BR /&gt;&lt;BR /&gt;Could any one please guide me what steps to be taken at least for SQL Server connectivity?&lt;/P&gt;
&lt;P&gt;Thanks,&lt;/P&gt;
&lt;P&gt;Rama&lt;/P&gt;</description>
      <pubDate>Wed, 10 Jun 2020 23:54:55 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Connecting-to-Databases-using-Data-Flux-ODBC-Drivers/m-p/656628#M19323</guid>
      <dc:creator>Rama_V</dc:creator>
      <dc:date>2020-06-10T23:54:55Z</dc:date>
    </item>
    <item>
      <title>Transpose in DI Studio</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Transpose-in-DI-Studio/m-p/652956#M19316</link>
      <description>&lt;P&gt;Hi, I am trying to transpose daily data using UserWrittenCode in DI Studio&lt;/P&gt;
&lt;P&gt;which I will append each day to a master table.&lt;/P&gt;
&lt;P&gt;HAVE (data):&lt;/P&gt;
&lt;P&gt;DATE &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; STATUS&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; COUNT_1&lt;/P&gt;
&lt;P&gt;06/01/2020&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Done&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 4&lt;/P&gt;
&lt;P&gt;06/01/2020&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Start &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 2&lt;/P&gt;
&lt;P&gt;06/01/2020&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; IP &amp;nbsp; &amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&lt;/P&gt;
&lt;P&gt;06/01/2020&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; FAIL &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 7&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;WANT (test1):&lt;/P&gt;
&lt;P&gt;Date&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Done&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Start&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; IP&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; FAIL&lt;/P&gt;
&lt;P&gt;06/01/2020&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 4&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 2&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 7&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Can you see an error in this code? Thanks!&lt;/P&gt;
&lt;P&gt;%let etls_tableExists = %eval(%sysfunc(exist(&amp;amp;_OUTPUT, DATA)));&lt;BR /&gt;%if (&amp;amp;etls_tableExists) %then %do;&lt;BR /&gt;proc sql noprint;&lt;BR /&gt;drop table &amp;amp;_OUTPUT.;&lt;BR /&gt;quit;&lt;BR /&gt;%end;&lt;/P&gt;
&lt;P&gt;&lt;FONT color="#0000FF"&gt;PROC TRANSPOSE data=&amp;amp;_INPUT out=test1;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;by DATE;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;id STATUS;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;var COUNT_1;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT color="#0000FF"&gt;RUN;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&lt;FONT color="#000000"&gt;DATA &amp;amp;_OUTPUT. ;&lt;BR /&gt;SET test1;&lt;BR /&gt;run;&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 10 Jun 2020 12:54:59 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Transpose-in-DI-Studio/m-p/652956#M19316</guid>
      <dc:creator>crawfe</dc:creator>
      <dc:date>2020-06-10T12:54:59Z</dc:date>
    </item>
    <item>
      <title>Sas Viya Data Explorer</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Sas-Viya-Data-Explorer/m-p/654037#M19314</link>
      <description>Does anyone no what server configuration files sas viya data explorer use? Apparently it does not read the same files as studio. Data explorer is unable to find my ODBC drivers while studio connects. It is a new install and im running a linux environment</description>
      <pubDate>Sun, 07 Jun 2020 16:47:50 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Sas-Viya-Data-Explorer/m-p/654037#M19314</guid>
      <dc:creator>kcoley</dc:creator>
      <dc:date>2020-06-07T16:47:50Z</dc:date>
    </item>
    <item>
      <title>How can I condense variables into one row?</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-can-I-condense-variables-into-one-row/m-p/653816#M19306</link>
      <description>&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="sas.png" style="width: 999px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/40565i9508BF25D1240250/image-size/large?v=1.0&amp;amp;px=999" title="sas.png" alt="sas.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Here is what I have to do:&lt;/P&gt;&lt;P&gt;Using the “schools1” data, create a new dataset called "schools2" which includes&lt;BR /&gt;average and median sat scores for sat1, sat2, sat3, and overall satavg with school as the&lt;BR /&gt;level of analysis. If you do this correctly, there should be 30 observations and 11&lt;BR /&gt;variables (including 8 different SAT variables) in your resulting dataset (use PROC&lt;BR /&gt;MEANS to do this).&lt;/P&gt;&lt;P&gt;So I think I have to basically get rid of the student variable and just have one row per school, but im really not sure how to do that using proc means, I've been trying for a couple hours but don't even really know where to start.&lt;/P&gt;&lt;P&gt;I added some of the code to make the data and a picture of the schools1 dataset.&lt;/P&gt;</description>
      <pubDate>Fri, 05 Jun 2020 22:48:54 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-can-I-condense-variables-into-one-row/m-p/653816#M19306</guid>
      <dc:creator>Ross123123</dc:creator>
      <dc:date>2020-06-05T22:48:54Z</dc:date>
    </item>
    <item>
      <title>Data is both character and numeric for large number of variables</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Data-is-both-character-and-numeric-for-large-number-of-variables/m-p/653798#M19300</link>
      <description>&lt;P&gt;Hi all, it's my first time posting here.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have 8 datasets each with 800+ variables that I am trying to stack. I am getting "ERROR: Variable X has been defined as both character and numeric."&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I know that this error comes from some of the datasets defining this variable as character and some as numeric.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, with such a large number of variables (the error occurs on at least 400 of them) it's impractical to switch each variable using put/input.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am wondering if anyone has a solution that fixes a large number of these errors relatively quickly.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you!!&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 05 Jun 2020 20:35:02 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Data-is-both-character-and-numeric-for-large-number-of-variables/m-p/653798#M19300</guid>
      <dc:creator>Betsy</dc:creator>
      <dc:date>2020-06-05T20:35:02Z</dc:date>
    </item>
    <item>
      <title>Unable to understand the SAS code node working inside a process flow</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Unable-to-understand-the-SAS-code-node-working-inside-a-process/m-p/653487#M19294</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;I am unable to access the worktables which has my data for profiling through SAS code node.&lt;/P&gt;&lt;P&gt;I have added that table through setting of source bidding.&lt;/P&gt;&lt;P&gt;So my data step contains a file reader and work table writer.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShikhaAgarwal_0-1591318947167.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/40434i2324D30CE03B7F23/image-size/medium?v=1.0&amp;amp;px=400" title="ShikhaAgarwal_0-1591318947167.png" alt="ShikhaAgarwal_0-1591318947167.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;i reference this in process flow as i need to execute some SAS codes.&lt;/P&gt;&lt;P&gt;But SAS code sas the work table doesnt exits&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShikhaAgarwal_0-1591323372705.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/40450iA10B2D58FA234CBC/image-size/medium?v=1.0&amp;amp;px=400" title="ShikhaAgarwal_0-1591323372705.png" alt="ShikhaAgarwal_0-1591323372705.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ShikhaAgarwal_1-1591323392675.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/40451i77BA00FA4B74AF8D/image-size/medium?v=1.0&amp;amp;px=400" title="ShikhaAgarwal_1-1591323392675.png" alt="ShikhaAgarwal_1-1591323392675.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;I cant see the work table as active tab while filling the source binding details in input.&lt;/P&gt;&lt;P&gt;Where can i see my SAS libraries in dataflux?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 05 Jun 2020 02:18:15 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Unable-to-understand-the-SAS-code-node-working-inside-a-process/m-p/653487#M19294</guid>
      <dc:creator>ShikhaAgarwal</dc:creator>
      <dc:date>2020-06-05T02:18:15Z</dc:date>
    </item>
    <item>
      <title>Do loop</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Do-loop/m-p/651715#M19278</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;I'm trying to generate a variable (named "NEW") that is set to i=1 when first.VAR is observed and then is incremented by one until last.VAR.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;But i'd like that each time a new first.VAR is met, NEW is set to&amp;nbsp; i+1.&lt;/P&gt;
&lt;P&gt;And as a last requirement, i'd like that when NEW=12, then NEW+1 is set back to 1.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I tried something like this (In the attached table VAR is named PUIS_CLASS) :&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data WAANT;
Set HAVE;
Retain NEW;
	Do i=1 to 12;
	By PUIS_CLASS;
	if first.PUIS_CLASS then NEW=i;
	else NEW=i+1;
	if last.PUIS_CLASS then output;
	end;
run;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;But it's not the solution...&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you again for your help &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 29 May 2020 13:42:01 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Do-loop/m-p/651715#M19278</guid>
      <dc:creator>Mathis1</dc:creator>
      <dc:date>2020-05-29T13:42:01Z</dc:date>
    </item>
    <item>
      <title>Looking to reorganize data around time based events</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Looking-to-reorganize-data-around-time-based-events/m-p/651618#M19274</link>
      <description>&lt;P&gt;Hello all,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have data on companies observed each year 2000-2015 (16 years) where I am looking to reorganize it based on a variable. This variable is an event-based variable that takes on values from 0-1 (1 if it happened). If (and/or when) the variable takes on a 1, I would like to collapse the rows and sum the values on other relevant variables. I would also like to create different indicators based on this: (1) one variable indicates when time starts (2) one variable that indicates when the time ends, and (3) a variable that indicates 0/1 whether an event took place at all. Essentially, each observation of a case becomes a start-stop as to when the event-based variable was observed.&lt;/P&gt;&lt;P&gt;I Illustrate below what I have and then want.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Have:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;Time&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ID&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;County&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;FocalEvent&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;X1mva&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;X2police&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;15&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;15&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;7&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;15&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;15&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;7&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;15&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;15&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;7&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Want:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;ID&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;County&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;X1mva&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;X2police&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;Time1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;Time2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;Event&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;15&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;27&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;34&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;34&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;34&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;15&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;40&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;40&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;25&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;25&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;21&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;21&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;25&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;25&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for your help.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;R&lt;/P&gt;</description>
      <pubDate>Fri, 29 May 2020 02:52:21 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Looking-to-reorganize-data-around-time-based-events/m-p/651618#M19274</guid>
      <dc:creator>r4321</dc:creator>
      <dc:date>2020-05-29T02:52:21Z</dc:date>
    </item>
    <item>
      <title>Average Difference between two date columns</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Average-Difference-between-two-date-columns/m-p/651560#M19270</link>
      <description>&lt;P&gt;Hi everyone,&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am a first time user of sas enterprise guide and I am trying to complete a simple calculation using this system. I have two date columns called "start date" and "end date". The formats of the date columns are in, for example, "08JUN2019". I calculated the difference between the two columns in order to produce the column "difference" which gives me the number of days that have elapsed. How would I calculate the average of the subtraction between "end date" and "start date" in sas enterprise guide?&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you in advance,&lt;/P&gt;</description>
      <pubDate>Thu, 28 May 2020 21:39:26 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Average-Difference-between-two-date-columns/m-p/651560#M19270</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-05-28T21:39:26Z</dc:date>
    </item>
    <item>
      <title>Converting R to SAS Code.</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Converting-R-to-SAS-Code/m-p/651484#M19267</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am trying to run correlation analysis in SAS that was previously done in R. I need to subset the dataset before the analysis, and I am having trouble understanding the R code for subsetting the dataset.&amp;nbsp;Could anyone please help me to interpret the code and suggest equivalent code in SAS?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have a knowledge of Base SAS and very little knowledge of R. Also, I am not very good at DO loop.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I need to reproduce the following result from R to SAS:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;for(i in unique(data$Site)){&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;sum1_alternate &amp;lt;- sum1 &amp;lt;- sum2 &amp;lt;- sum3_alternate &amp;lt;- 0&lt;BR /&gt;for(j in unique(data[data$Site==i,]$id)){&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;if(unique(data[data$site==i &amp;amp; data$id==j,]$ind)==1){&lt;BR /&gt;ind1_alternate &amp;lt;- ifelse(j %in% temp$id, 1, 0)&amp;nbsp;&lt;BR /&gt;sum1_alternate&amp;lt;- sum1_alternate + ind1_alternate&lt;BR /&gt;&lt;BR /&gt;if(ind1_alternate == 1){&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;ind1 &amp;lt;- ifelse(min(temp[temp$id==j,]$age)&amp;lt;21, 1, 0)&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;sum1 &amp;lt;- sum1+ind1&lt;BR /&gt;&lt;BR /&gt;ind2 &amp;lt;- ifelse(min(temp[temp$id==j,]$age)&amp;gt;21, 1, 0)&amp;nbsp;&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;sum2 &amp;lt;- sum2+ind2&lt;BR /&gt;}&lt;BR /&gt;}&lt;BR /&gt;i&lt;/P&gt;&lt;P&gt;f(unique(data[data$site==i &amp;amp; data$id==j,]$ind)==0){&lt;BR /&gt;sum3_alternate &amp;lt;- sum3_alternate+1&lt;BR /&gt;#sum_info &amp;lt;- summary(data[data$site==i &amp;amp; data$id==j,]$age)&lt;BR /&gt;}&lt;BR /&gt;}&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;sum1_alternate_site &amp;lt;- c(sum1_alternate_site, sum1_alternate)&lt;BR /&gt;sum1_site &amp;lt;- c(sum1_site, sum1)&lt;BR /&gt;sum2_site &amp;lt;- c(sum2_site, sum2)&lt;BR /&gt;sum3_alternate_site &amp;lt;- c(sum3_alternate_site, sum3_alternate)&lt;BR /&gt;#sum3_site &amp;lt;- c(sum3_site, sum3)&lt;BR /&gt;}&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;percent1 &amp;lt;- sum1_alternate_site/(sum1_alternate_site+sum3_alternate_site)&amp;nbsp;&lt;BR /&gt;percent2 &amp;lt;- sum1_site/(sum1_site+sum2_site+sum3_alternate_site)&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;plot(site_mean$x, percent1, xlab="Mean Age", ylab="Percentage of surgery", main="mean age vs percentage of surgery")&lt;/P&gt;&lt;P&gt;abline(lm(percent1 ~ site_mean$x))&lt;/P&gt;&lt;P&gt;cor.test(percent1, site_mean$x)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;plot(site_mean$x, percent2, xlab="Mean Age", ylab="Percentage of surgery", main="mean age vs percentage of surgery &amp;lt;21 year-old")&lt;/P&gt;&lt;P&gt;abline(lm(percent2 ~ site_mean$x))&lt;BR /&gt;cor.test(percent2, site_mean$x)&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 28 May 2020 17:24:33 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Converting-R-to-SAS-Code/m-p/651484#M19267</guid>
      <dc:creator>sandyzman1</dc:creator>
      <dc:date>2020-05-28T17:24:33Z</dc:date>
    </item>
    <item>
      <title>Concatenate values by other values</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Concatenate-values-by-other-values/m-p/651426#M19257</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;Please find attached the table that will illustrate my problem. I'd like that for each value of my variable PUIS_CLASS, I have the variable NEW which is the concatenation of all the values of&amp;nbsp;GROUPEACT occuring when the&amp;nbsp; PUIS_CLASS value is observed.&lt;BR /&gt;&lt;BR /&gt;For instance, the two first lines would be :&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;PUISS_CLASS&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;GROUPEACT&lt;/P&gt;
&lt;P&gt;1_K&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "20_24_25_26_27_28"&lt;/P&gt;
&lt;P&gt;2_L&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;"24_25_27"&amp;nbsp;&lt;/P&gt;
&lt;P&gt;etc...&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you for your help &lt;span class="lia-unicode-emoji" title=":grinning_face_with_smiling_eyes:"&gt;😄&lt;/span&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 28 May 2020 15:30:24 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Concatenate-values-by-other-values/m-p/651426#M19257</guid>
      <dc:creator>Mathis1</dc:creator>
      <dc:date>2020-05-28T15:30:24Z</dc:date>
    </item>
    <item>
      <title>How to practice SAS Data Integration Studio online (SAS Academy courses)?</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-practice-SAS-Data-Integration-Studio-online-SAS-Academy/m-p/651292#M19255</link>
      <description>&lt;P&gt;Hi, very basic question.&amp;nbsp; I am following the SAS Academy courses for Data Science.&lt;/P&gt;&lt;P&gt;In the Data Curation part, SAS Data Integration Studio is used.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I was wondering whether there is a possibility to practice online.&lt;/P&gt;&lt;P&gt;For example, in the Coursera SAS course one cat get access to the Viya platform and practice there.&lt;/P&gt;&lt;P&gt;Is there anything of the kind for the SAS Academy courses?&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 28 May 2020 05:57:47 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-practice-SAS-Data-Integration-Studio-online-SAS-Academy/m-p/651292#M19255</guid>
      <dc:creator>andreazanetti</dc:creator>
      <dc:date>2020-05-28T05:57:47Z</dc:date>
    </item>
    <item>
      <title>Looking to create indicator variables that help me organize a matched sample</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Looking-to-create-indicator-variables-that-help-me-organize-a/m-p/651229#M19254</link>
      <description>&lt;P&gt;Hello all,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Updating the description of the code I am hoping to get help with:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have a dataset where I am hoping to indicator variables that help me better organize a matched sample from my data. I have data on counties observed each week over the course of a year, some of them experienced a focal event (i.e., if P =1) and some did not and will serve as controls. Additionally, some counties experienced the focal event more than once. I want to add an indicator variable that identifies before and after these focal events at different durations for both the treatments and the non-treated counties. I&amp;nbsp;need at 1 week, 2 weeks, 3 weeks, 4 weeks, and 5 weeks, but only illustrated up to 2 weeks in the example below.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In addition, would be ideal if once a county experienced a focal event they can no longer be in the control set from there forward, but it’s okay if they stay in the dataset and experience a focal event again.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I believe I need some variables such as an &lt;U&gt;eventweek variable&lt;/U&gt;&lt;SPAN&gt;&amp;nbsp;that identifies the week of the focal event, a&amp;nbsp;&lt;/SPAN&gt;&lt;U&gt;before/after variable&lt;/U&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;(0/1 if before after the focal event), a&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;U&gt;duration variable&lt;/U&gt;&lt;SPAN&gt;&amp;nbsp;that identifies the number of weeks before and after&lt;/SPAN&gt;&amp;nbsp;(i.e., at 1, 2, 3, 4, and 5), and a variable (treated) that indicates that a county has been treated (after it has been treated since a county should be able to stay in the pool of controls to serve as a control until it gets treated if at all).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Essentially, I think I need to duplicate sets of observations to create windows of treated and non-treated counties around the focal events I am interested in examining. I also like to add a few other variables: (1) a variable that indicates which treatment number this is for a county (i.e., first treatment, second treatment, …) (2) a variable that indicates whether a county was treated within a short time window of a previous time it was treated (=1 if treated within &amp;lt;=5 time periods, 0 otherwise) and (3) a variable that indicates the degree of treatment (I want focal event variable to be binary in the wantdata but in the have data it is a actually &amp;gt;1 sometimes since it is a count variable), so if focal event was 2, for example, this variable would = 2. See below for example.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Also, I should note that is a focal occurred before week 6, the durations for those observations may not be able to get up to the full 5 weeks before and after. For example, if a focal event occurred in a county in week 2, I'll only have the 1 week duration for that focal event.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Have:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;Time&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ID&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;County&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;FocalEvent&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;7&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;11&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;12&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;13&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;7&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;11&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;12&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;13&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;7&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;11&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;12&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;13&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;7&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Want:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;Time&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ID&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;County&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;FocalEvent&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;EventWeek&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;Duration&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BeforeAfter&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;Treated&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;Tnumber&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;Trecent&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;Tdegree&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;222&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;ABC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;4&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;555&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;DDD&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;11&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;444&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;CCC&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;11&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;333&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;BBB&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;0&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 04 Jun 2020 04:26:32 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Looking-to-create-indicator-variables-that-help-me-organize-a/m-p/651229#M19254</guid>
      <dc:creator>r4321</dc:creator>
      <dc:date>2020-06-04T04:26:32Z</dc:date>
    </item>
    <item>
      <title>SAS Data Integration Studio will always return &amp;JOB_RC = 0 even where there is Warning and Error</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SAS-Data-Integration-Studio-will-always-return-amp-JOB-RC-0-even/m-p/650961#M19252</link>
      <description>&lt;P&gt;I am using Status Handling tab in SAS DI Studio to handle different for Job with Success, Warning, and Error status.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I will then pass this status to a autocall macro script to update to a table accordingly. By right, when job success (JOB_RC = 0), it will pass in the value of success to my macro script via autocall. I did the same for warning and error. I also put&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;%print Job_rc is &amp;amp;JOB_RC;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;in the status handling to see what is the job_rc as every job_rc indicates different job status.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I tried with success run, run with warning, and error, all returned &amp;amp;job_rc = 0.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now I am confused. Why is this happening? SAS DI Studio first assign job_rc =0 as standard in the job but there is no assignment again when the job ends? I am sure there must be a way for assigning the right job_rc value so that we can make use of this value under "Status Handling".&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 27 May 2020 02:19:00 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SAS-Data-Integration-Studio-will-always-return-amp-JOB-RC-0-even/m-p/650961#M19252</guid>
      <dc:creator>WorkingMan</dc:creator>
      <dc:date>2020-05-27T02:19:00Z</dc:date>
    </item>
    <item>
      <title>Import asc file question: Why does it work with a file reference, but not with (the same!) path?</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Import-asc-file-question-Why-does-it-work-with-a-file-reference/m-p/649824#M19245</link>
      <description>&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm having trouble to understand why this does work:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;FILENAME asc0808 FILESRVC&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; FOLDERPATH='/07. Team/02.Development/Case1'&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; FILENAME='2019.08.08_mutations.asc';&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;data a1;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; infile asc0808 delimiter = '|' dsd lrecl=1000 firstobs=1;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; informat onbewerkt $1000.;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; format onbewerkt $1000.;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; input onbewerkt;&lt;/P&gt;&lt;P&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;And this does not&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;data a1;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; infile '/07. Team/02.Development/Case1/2019.08.08_mutations.asc' delimiter = '|' dsd lrecl=1000 firstobs=1;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; informat onbewerkt $1000.;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; format onbewerkt $1000.;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; input onbewerkt;&lt;/P&gt;&lt;P&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The error message is: Physical file does not exist,&amp;nbsp;'/07. Team/02.Development/Case1/2019.08.08_mutations.asc'&lt;/P&gt;&lt;P&gt;This is the exact location I see when I look at the properties of the file reference of asc0808&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Can anybody explain? To me I'm doing exactly the same, but apparently not...&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Fri, 22 May 2020 10:07:23 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Import-asc-file-question-Why-does-it-work-with-a-file-reference/m-p/649824#M19245</guid>
      <dc:creator>AnnaBaukje</dc:creator>
      <dc:date>2020-05-22T10:07:23Z</dc:date>
    </item>
    <item>
      <title>SAS EG Data Export Fails</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SAS-EG-Data-Export-Fails/m-p/649597#M19242</link>
      <description>&lt;P&gt;A co-worker is having a new problem with SAS EG. After successfully importing an Excel file, when using the Export tab in hopes of saving the data to an external SAS data set, the SAS EG interface does nothing. The usual Save As dialog box never opens. The Windows blue circle icon spins momentarily and then nothing happens.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any ideas?&lt;/P&gt;</description>
      <pubDate>Thu, 21 May 2020 14:58:54 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SAS-EG-Data-Export-Fails/m-p/649597#M19242</guid>
      <dc:creator>CurtisSmithDCAA</dc:creator>
      <dc:date>2020-05-21T14:58:54Z</dc:date>
    </item>
    <item>
      <title>identify IDs who have received both medications</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/identify-IDs-who-have-received-both-medications/m-p/649568#M19235</link>
      <description>&lt;P&gt;Hello Everyone,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have a dataset for multiple records/ patient. Each record indicates a separate clinic visit for a certain patient identified by a unique id.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;patient.&amp;nbsp; &amp;nbsp; &amp;nbsp;unique id&lt;/P&gt;
&lt;P&gt;patient 1-&amp;nbsp; abcd&lt;/P&gt;
&lt;P&gt;patient 2-&amp;nbsp; efgh&lt;/P&gt;
&lt;P&gt;patient 3.&amp;nbsp; &amp;nbsp;ijkl&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;the data looks as follows&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;unique id&amp;nbsp; &amp;nbsp;clinic visit.&amp;nbsp; Aspirin&amp;nbsp; &amp;nbsp;statin&lt;/P&gt;
&lt;P&gt;abcd&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;1&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;1&lt;/P&gt;
&lt;P&gt;abcd&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;2&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;1&lt;/P&gt;
&lt;P&gt;abcd.&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 3&lt;/P&gt;
&lt;P&gt;abcd&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;4&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 1&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;efgh&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 1&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 1&lt;/P&gt;
&lt;P&gt;efgh.&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;2&lt;/P&gt;
&lt;P&gt;efgh.&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 3&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;1&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;ijkl&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 1&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 1.&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 1&lt;/P&gt;
&lt;P&gt;ijkl&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 2.&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;ijkl&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 3&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 1&lt;/P&gt;
&lt;P&gt;ijkl.&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;4&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;So looking at this, patients were either prescribed aspirin and statin on the same day (patient ijkl) or were only prescribe statin (patient efgh) or were prescribed both aspirin and statin but at different visits (patient abcd). My goal is to identify&amp;nbsp;&lt;SPAN style="font-family: inherit;"&gt;patients who have been prescribed aspirin and statin regardless of the visit. For clarification, I would like to identify patients abcd and ijkl in this instance. My understanding is that if I use the "if aspirin=1 AND statin=1 then var1=1..." command it will look across each record individually and not consider that those records belong to a single patient.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN style="font-family: inherit;"&gt;Happy to provide more details&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN style="font-family: inherit;"&gt;appreciate your help&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 21 May 2020 13:59:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/identify-IDs-who-have-received-both-medications/m-p/649568#M19235</guid>
      <dc:creator>GreenTree1</dc:creator>
      <dc:date>2020-05-21T13:59:05Z</dc:date>
    </item>
    <item>
      <title>Can write out ó or ø but it's read in as Ã³ or Ã¸</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Can-write-out-%C3%B3-or-%C3%B8-but-it-s-read-in-as-%C3%83-or-%C3%83/m-p/649302#M19231</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;I'm writing out a delimited file and one of the fields contains the value&amp;nbsp;ó, however when I read the file back in it's being read in as&amp;nbsp;Ã³, I've also tried ø which comes in as&amp;nbsp;Ã¸ any ideas? Simplified code:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data &amp;amp;_OUTPUT.;
	length UNIQUE_REFERENCE $250.;
infile "&amp;amp;filename." dlm=',' dsd eov=eov truncover firstobs=2;
input UNIQUE_REFERENCE $;
eov=0;
run;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;Data is basically&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;ID56453óHELPME&lt;/PRE&gt;
&lt;P&gt;It's reading everything in correctly except that character.&lt;BR /&gt;Essentially my issue is that I need to concatenate two ID fields into the&amp;nbsp;UNIQUE_REFERENCE field,&amp;nbsp;it's&amp;nbsp;then&amp;nbsp;sent&amp;nbsp;off&amp;nbsp;to another&amp;nbsp;system&amp;nbsp;(with&amp;nbsp;loads&amp;nbsp;of&amp;nbsp;other&amp;nbsp;fields)&amp;nbsp;that&amp;nbsp;does&amp;nbsp;some&amp;nbsp;stuff&amp;nbsp;then&amp;nbsp;sends&amp;nbsp;me&amp;nbsp;it&amp;nbsp;back,&amp;nbsp;I&amp;nbsp;then&amp;nbsp;split&amp;nbsp;my&amp;nbsp;UNIQUE_REFERENCE back into my two ID fields. The lengths can vary so I can't use a scan or anything like that, the other system also strips out any special characters which limits my option for a delimiter. I also can't use anything common that might appear in a user entered ID. That leaves me with limited options,&amp;nbsp;&amp;nbsp;ó and&amp;nbsp;ø is an example of a character that it can pass back to me, unfortunately it seems SAS can happily write that out to a file, but reading it in it is converting it to&amp;nbsp;Ã³ and Ã¸ using the code above, any ideas? Thanks&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 20 May 2020 17:08:20 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Can-write-out-%C3%B3-or-%C3%B8-but-it-s-read-in-as-%C3%83-or-%C3%83/m-p/649302#M19231</guid>
      <dc:creator>MRDM</dc:creator>
      <dc:date>2020-05-20T17:08:20Z</dc:date>
    </item>
    <item>
      <title>Convert a multiple length character price variable to numeric</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Convert-a-multiple-length-character-price-variable-to-numeric/m-p/649131#M19226</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;I have a character variable under this form :&lt;/P&gt;
&lt;P&gt;61.39€&lt;/P&gt;
&lt;P&gt;47.32€&lt;/P&gt;
&lt;P&gt;124.45€&lt;/P&gt;
&lt;P&gt;12.3€&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;and i'd like to convert it into a numeric variable under that form :&lt;/P&gt;
&lt;P&gt;61.39&lt;/P&gt;
&lt;P&gt;47.32&lt;/P&gt;
&lt;P&gt;124.45&lt;/P&gt;
&lt;P&gt;12.3&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I tried this code :&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;Data WANT;
Set HAVE;
NEW=input(strip(scan(VAR,-1,"€")), 5);
run;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;but i don't get what i want. I only get the price with three digits before the comma. Moreover those prices are only displayed with one decimal instead of 2.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;thank you for your help &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 20 May 2020 09:19:26 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Convert-a-multiple-length-character-price-variable-to-numeric/m-p/649131#M19226</guid>
      <dc:creator>Mathis1</dc:creator>
      <dc:date>2020-05-20T09:19:26Z</dc:date>
    </item>
    <item>
      <title>How to create a variable based on another variables per ID</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-create-a-variable-based-on-another-variables-per-ID/m-p/648500#M19216</link>
      <description>&lt;P&gt;I have a data set containing ID, Visit Type, and Date.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE border="0" cellspacing="0" cellpadding="0"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;ID&lt;/TD&gt;&lt;TD&gt;Visit&lt;/TD&gt;&lt;TD&gt;Date&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060505&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20060506&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20061217&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060301&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060305&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20070503&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20070506&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20100505&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20061112&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20080103&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20081012&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20081227&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20050325&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20050412&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20070510&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20061010&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20061231&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20070125&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060718&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060817&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20070918&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now based on this, I want to define "Index Date."&lt;/P&gt;&lt;P&gt;For patients who visited both hospital and clinic, I want the index date to be the first visit of hospital.&lt;/P&gt;&lt;P&gt;For patients who only visited clinics, I want the index date to be the first visit of clinic.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So the new table I want would look something like this&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE border="0" cellspacing="0" cellpadding="0"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;ID&lt;/TD&gt;&lt;TD&gt;Visit&lt;/TD&gt;&lt;TD&gt;Date&lt;/TD&gt;&lt;TD&gt;Index Date&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060505&lt;/TD&gt;&lt;TD&gt;20060506&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20060506&lt;/TD&gt;&lt;TD&gt;20060506&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20061217&lt;/TD&gt;&lt;TD&gt;20060506&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060301&lt;/TD&gt;&lt;TD&gt;20060301&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060305&lt;/TD&gt;&lt;TD&gt;20060301&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20070503&lt;/TD&gt;&lt;TD&gt;20060301&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20070506&lt;/TD&gt;&lt;TD&gt;20060301&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20100505&lt;/TD&gt;&lt;TD&gt;20060301&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20061112&lt;/TD&gt;&lt;TD&gt;20061112&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20080103&lt;/TD&gt;&lt;TD&gt;20081227&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20081012&lt;/TD&gt;&lt;TD&gt;20081227&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20081227&lt;/TD&gt;&lt;TD&gt;20081227&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20050325&lt;/TD&gt;&lt;TD&gt;20050412&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20050412&lt;/TD&gt;&lt;TD&gt;20050412&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;hospital&lt;/TD&gt;&lt;TD&gt;20070510&lt;/TD&gt;&lt;TD&gt;20050412&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20061010&lt;/TD&gt;&lt;TD&gt;20050412&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20061231&lt;/TD&gt;&lt;TD&gt;20050412&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;5&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20070125&lt;/TD&gt;&lt;TD&gt;20050412&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060718&lt;/TD&gt;&lt;TD&gt;20060718&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20060817&lt;/TD&gt;&lt;TD&gt;20060718&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;6&lt;/TD&gt;&lt;TD&gt;clinic&lt;/TD&gt;&lt;TD&gt;20070918&lt;/TD&gt;&lt;TD&gt;20060718&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I tried using DO loop but seems like I just can't get it right.&lt;/P&gt;&lt;P&gt;How to solve this?!&lt;/P&gt;</description>
      <pubDate>Wed, 20 May 2020 06:17:18 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-create-a-variable-based-on-another-variables-per-ID/m-p/648500#M19216</guid>
      <dc:creator>lizwarr</dc:creator>
      <dc:date>2020-05-20T06:17:18Z</dc:date>
    </item>
    <item>
      <title>Data profiling by referencing the patterns and datatype from a lookup metadata table in dataflux?</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Data-profiling-by-referencing-the-patterns-and-datatype-from-a/m-p/648495#M19215</link>
      <description>&lt;P&gt;So the requirement is not just profiling and exploring the data but to be able to highlight the incorrect values by referencing a metadata table which has pattern , datatype, length everything mentioned for each field.&lt;/P&gt;&lt;P&gt;I could find any way to do it in dataflux other than writing my own SAS codes.&lt;/P&gt;&lt;P&gt;Let me know the feasibility please'.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 18 May 2020 08:12:55 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Data-profiling-by-referencing-the-patterns-and-datatype-from-a/m-p/648495#M19215</guid>
      <dc:creator>ShikhaAgarwal</dc:creator>
      <dc:date>2020-05-18T08:12:55Z</dc:date>
    </item>
    <item>
      <title>How do I configure SCD Type 2 Loader to have VALID END = Next Valid Start -1 Second?</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-do-I-configure-SCD-Type-2-Loader-to-have-VALID-END-Next/m-p/647859#M19214</link>
      <description>&lt;P&gt;I am using SAS Data Integration Studio and for my jobs with SCD Type 2 loader, I want my VALID END DTTM to be 1 sec before the next VALID START DTTM.&lt;/P&gt;&lt;P&gt;Example as below:&lt;/P&gt;&lt;P&gt;ID&amp;nbsp; &amp;nbsp;Name&amp;nbsp; &amp;nbsp;VALID_START_DTTM&amp;nbsp; &amp;nbsp; VALID_END_DTTM&lt;/P&gt;&lt;P&gt;1&amp;nbsp; &amp;nbsp; BEN&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;01JAN2011:00:00:00&amp;nbsp; &amp;nbsp; 01JAN5999:00:00:00&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Ben changed his name to Ali. Therefore, we will expire his name and append a new record as below (This is what I want to achieve)&lt;/P&gt;&lt;P&gt;ID&amp;nbsp; &amp;nbsp;Name&amp;nbsp; &amp;nbsp;VALID_START_DTTM&amp;nbsp; &amp;nbsp; VALID_END_DTTM&lt;/P&gt;&lt;P&gt;1&amp;nbsp; &amp;nbsp; BEN&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;01JAN2011:00:00:00&amp;nbsp; &amp;nbsp; &amp;nbsp;31JAN2011:23:59:59&lt;/P&gt;&lt;P&gt;1&amp;nbsp; &amp;nbsp; ALI&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;01FEB2011:00:00:00&amp;nbsp; &amp;nbsp; &amp;nbsp;01JAN5999:00:00:00&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As we can see, VALID_START_DTTM of ALI is just 1 sec after BEN's VALID_END_DTTM. This is what i want to achieve.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The default SCD Type 2 Loader in SAS DI Studio will set VALID_START_DTTM of ALI and&amp;nbsp;BEN's VALID_END_DTTM to be at the same exact datetime when the previous/existing record expires.&lt;/P&gt;&lt;P&gt;Example for the existing type 2 loader(that i do not want)&lt;/P&gt;&lt;P&gt;ID&amp;nbsp; &amp;nbsp;Name&amp;nbsp; &amp;nbsp;VALID_START_DTTM&amp;nbsp; &amp;nbsp; VALID_END_DTTM&lt;/P&gt;&lt;P&gt;1&amp;nbsp; &amp;nbsp; BEN&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;01JAN2011:00:00:00&amp;nbsp; &amp;nbsp; 01FEB2011:00:00:00&lt;/P&gt;&lt;P&gt;1&amp;nbsp; &amp;nbsp; ALI&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;01FEB2011:00:00:00&amp;nbsp; &amp;nbsp; &amp;nbsp;01JAN5999:00:00:00&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is there anyway to do so?&lt;/P&gt;</description>
      <pubDate>Thu, 14 May 2020 15:55:08 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-do-I-configure-SCD-Type-2-Loader-to-have-VALID-END-Next/m-p/647859#M19214</guid>
      <dc:creator>WorkingMan</dc:creator>
      <dc:date>2020-05-14T15:55:08Z</dc:date>
    </item>
    <item>
      <title>Import CSV : Make Slash information as a new lilne</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Import-CSV-Make-Slash-information-as-a-new-lilne/m-p/647734#M19208</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;I have a csv file such as some rows are like this (always on the second columns) :&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE style="border-collapse: collapse; width: 300pt;" border="0" width="400" cellspacing="0" cellpadding="0"&gt;
&lt;TBODY&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="46.4px" height="20" align="right" style="height: 15.0pt; width: 60pt;"&gt;6088&lt;/TD&gt;
&lt;TD width="202.4px" style="width: 60pt;"&gt;06000/06100/06200/06300&lt;/TD&gt;
&lt;TD width="72px" style="width: 60pt;"&gt;BLABLA&lt;/TD&gt;
&lt;TD width="59.2px" style="width: 60pt;"&gt;BLIBLI&lt;/TD&gt;
&lt;TD width="75.2px" style="width: 60pt;"&gt;BLOBLO&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;And i'd like that when I import it on SAS, I obtain (for this piece of information) :&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE style="border-collapse: collapse;" border="0" width="519px" cellspacing="0" cellpadding="0"&gt;
&lt;TBODY&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="78px" height="20" align="right" style="height: 15.0pt; width: 60pt;"&gt;
&lt;P&gt;6088&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="202px" style="width: 60pt;"&gt;06000&lt;/TD&gt;
&lt;TD width="80px" style="width: 60pt;"&gt;BLABLA&lt;/TD&gt;
&lt;TD width="79px" style="width: 60pt;"&gt;BLIBLI&lt;/TD&gt;
&lt;TD width="80px" style="width: 60pt;"&gt;
&lt;P&gt;BLOBLO&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="78px" height="20" align="right" style="height: 15.0pt; width: 60pt;"&gt;6088&lt;/TD&gt;
&lt;TD width="202px" style="width: 60pt;"&gt;06100&lt;/TD&gt;
&lt;TD width="80px" style="width: 60pt;"&gt;BLABLA&lt;/TD&gt;
&lt;TD width="79px" style="width: 60pt;"&gt;BLIBLI&lt;/TD&gt;
&lt;TD width="80px" style="width: 60pt;"&gt;BLOBLO&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="78px" height="20" align="right" style="height: 15.0pt; width: 60pt;"&gt;6088&lt;/TD&gt;
&lt;TD width="202px" style="width: 60pt;"&gt;06200&lt;/TD&gt;
&lt;TD width="80px" style="width: 60pt;"&gt;BLABLA&lt;/TD&gt;
&lt;TD width="79px" style="width: 60pt;"&gt;BLIBLI&lt;/TD&gt;
&lt;TD width="80px" style="width: 60pt;"&gt;BLOBLO&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="78px" height="20" align="right" style="height: 15.0pt; width: 60pt;"&gt;6088&lt;/TD&gt;
&lt;TD width="202px" style="width: 60pt;"&gt;06300&lt;/TD&gt;
&lt;TD width="80px" style="width: 60pt;"&gt;BLABLA&lt;/TD&gt;
&lt;TD width="79px" style="width: 60pt;"&gt;BLIBLI&lt;/TD&gt;
&lt;TD width="80px" style="width: 60pt;"&gt;BLOBLO&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Any idea ?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 14 May 2020 09:34:02 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Import-CSV-Make-Slash-information-as-a-new-lilne/m-p/647734#M19208</guid>
      <dc:creator>Mathis1</dc:creator>
      <dc:date>2020-05-14T09:34:02Z</dc:date>
    </item>
    <item>
      <title>read CSV with infile and with comma as decimal point</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/read-CSV-with-infile-and-with-comma-as-decimal-point/m-p/646718#M19203</link>
      <description>&lt;P&gt;Hi everybody,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I would like to read a CSV file into SAS (SAS 9.4) which has commas as decimal points. The number of decimal places are different. Here is an example of my csv file:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE style="border-collapse: collapse; width: 240pt;" border="0" width="141px" cellspacing="0" cellpadding="0"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD&gt;ID&lt;/TD&gt;
&lt;TD&gt;pop&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5119&lt;/TD&gt;
&lt;TD width="84px"&gt;77,09&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5120&lt;/TD&gt;
&lt;TD width="84px"&gt;74,52&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5122&lt;/TD&gt;
&lt;TD width="84px"&gt;89,54&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5124&lt;/TD&gt;
&lt;TD width="84px"&gt;168,39&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5154&lt;/TD&gt;
&lt;TD width="84px" align="right" class="lia-align-left"&gt;1233&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5158&lt;/TD&gt;
&lt;TD width="84px"&gt;407,21&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5162&lt;/TD&gt;
&lt;TD width="84px"&gt;576,42&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5166&lt;/TD&gt;
&lt;TD width="84px"&gt;563,2&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="57px" height="20" align="right" style="height: 15.0pt;"&gt;5170&lt;/TD&gt;
&lt;TD width="84px"&gt;1042,81&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am looking for a format like BEST12. but using commas instead of periods as decimal points. Does that exist? If not, is there another easy way to read these numbers properly without transforming the numbers to character variables and exchange "," to "."? (I have 200 of these variables, so i need something more "practical").&lt;/P&gt;
&lt;P&gt;Thank you very much in advance!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 11 May 2020 14:29:16 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/read-CSV-with-infile-and-with-comma-as-decimal-point/m-p/646718#M19203</guid>
      <dc:creator>marieK</dc:creator>
      <dc:date>2020-05-11T14:29:16Z</dc:date>
    </item>
    <item>
      <title>FCOPY in scheduler copies file as read only, but read/write when run manually</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/FCOPY-in-scheduler-copies-file-as-read-only-but-read-write-when/m-p/646251#M19194</link>
      <description>&lt;P&gt;Hello!!&lt;/P&gt;&lt;P&gt;I have a SAS program which when I run manually copies file with read/write access but when run in scheduler in SAS Server copies file only with read access.&lt;/P&gt;&lt;P&gt;How can I resolve it to copy files with read/write access even when run in SAS server scheduler?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;data _null_;&lt;/P&gt;&lt;P&gt;rc=fcopy('src', 'dest');&lt;/P&gt;&lt;P&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 08 May 2020 15:45:31 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/FCOPY-in-scheduler-copies-file-as-read-only-but-read-write-when/m-p/646251#M19194</guid>
      <dc:creator>AviS</dc:creator>
      <dc:date>2020-05-08T15:45:31Z</dc:date>
    </item>
    <item>
      <title>Uploading and copying file to and from SFTP site</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Uploading-and-copying-file-to-and-from-SFTP-site/m-p/646081#M19192</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;
&lt;P&gt;I am trying to upload as well as retrieve a file from my pc to the SFTP site. Can someone please tell me what will be the sas code for both of these.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks,&lt;/P&gt;</description>
      <pubDate>Fri, 08 May 2020 02:53:33 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Uploading-and-copying-file-to-and-from-SFTP-site/m-p/646081#M19192</guid>
      <dc:creator>mlogan</dc:creator>
      <dc:date>2020-05-08T02:53:33Z</dc:date>
    </item>
    <item>
      <title>"Merge" several variables</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/quot-Merge-quot-several-variables/m-p/645883#M19185</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;I would like to merge into one variable, the values of several variables provided that only one variable appears for each observation.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I will be more clear with the picture below.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Guillaume.PNG" style="width: 403px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/39164i400A9ACEABEE8A1C/image-size/large?v=1.0&amp;amp;px=999" title="Guillaume.PNG" alt="Guillaume.PNG" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I would like to put the values of Modele, Energie and Puissance, in one variable where the values follow each other. For instance :&lt;/P&gt;
&lt;P&gt;Modele 1&lt;/P&gt;
&lt;P&gt;Modele 2&lt;/P&gt;
&lt;P&gt;Modele 3&lt;/P&gt;
&lt;P&gt;Puissance 1&lt;/P&gt;
&lt;P&gt;Puissance 2&lt;/P&gt;
&lt;P&gt;Puissance 3&lt;/P&gt;
&lt;P&gt;Energie 1&lt;/P&gt;
&lt;P&gt;Energie 2&lt;/P&gt;
&lt;P&gt;Energie 3&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 07 May 2020 14:12:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/quot-Merge-quot-several-variables/m-p/645883#M19185</guid>
      <dc:creator>Mathis1</dc:creator>
      <dc:date>2020-05-07T14:12:05Z</dc:date>
    </item>
    <item>
      <title>Dynamic Data Quality Job to handle any type of Source File</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Dynamic-Data-Quality-Job-to-handle-any-type-of-Source-File/m-p/644676#M19180</link>
      <description>&lt;P&gt;Dear Users,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have a business scenario, where I will have to process lot of tables from source storage (files are in form of CSV) for standardization i.e. Demographic standardization etc.&lt;/P&gt;
&lt;P&gt;The DQ job has to understand the Columns and then pick the column which has demographic values (identification) and further standardize those columns.&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The challenge are as below:&lt;/P&gt;
&lt;P&gt;1. The job has to identify the number of files in the folder location to be processed and the file names&lt;/P&gt;
&lt;P&gt;2. The job has to identify what columns are there in individual tables and which columns refer to demographic values&lt;/P&gt;
&lt;P&gt;3. Standardize the identified columns&lt;/P&gt;
&lt;P&gt;4. Loop the entire process.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The major challenge is that all the files in the source can have any number of columns and can consist of different columns as compared to the previous or next file in the source folder.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Any ideas on this will really be helpful.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;
&lt;P&gt;Abhishek Pathak&amp;nbsp;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sat, 02 May 2020 09:50:08 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Dynamic-Data-Quality-Job-to-handle-any-type-of-Source-File/m-p/644676#M19180</guid>
      <dc:creator>avvy</dc:creator>
      <dc:date>2020-05-02T09:50:08Z</dc:date>
    </item>
    <item>
      <title>Merging SAS datasets</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Merging-SAS-datasets/m-p/643754#M19172</link>
      <description>&lt;P&gt;Hello Everyone&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have two SAS datasets&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;dataset 1.&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; dataset 2&lt;/P&gt;
&lt;P&gt;1.&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;4&lt;/P&gt;
&lt;P&gt;2&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 5&lt;/P&gt;
&lt;P&gt;3&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 6&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I want to stack these on top of each other so that the final dataset looks as follows.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Final dataset&lt;/P&gt;
&lt;P&gt;1&lt;/P&gt;
&lt;P&gt;2&lt;/P&gt;
&lt;P&gt;3&lt;/P&gt;
&lt;P&gt;4&lt;/P&gt;
&lt;P&gt;5&lt;/P&gt;
&lt;P&gt;6&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I appreciate your help.&lt;/P&gt;</description>
      <pubDate>Wed, 29 Apr 2020 03:21:21 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Merging-SAS-datasets/m-p/643754#M19172</guid>
      <dc:creator>GreenTree1</dc:creator>
      <dc:date>2020-04-29T03:21:21Z</dc:date>
    </item>
    <item>
      <title>Business Data Network - Importing data structure already existing in Metadata</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Business-Data-Network-Importing-data-structure-already-existing/m-p/643653#M19170</link>
      <description>&lt;P&gt;Question from the newbie in SAS:&lt;/P&gt;&lt;P&gt;- SAS9.4M6&lt;/P&gt;&lt;P&gt;- SAS BDN 3.3&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here my business requirement:&amp;nbsp;&lt;/P&gt;&lt;P&gt;I need to load an enterprise data model in SAS BDN (MSSQL, Oracle, mySQL Hadoop). BDN would become the main platform for data architect to manage the comapny datamodel.&lt;/P&gt;&lt;P&gt;To do so, I need to import all my tables already loaded in Mgt Console:&amp;nbsp;&lt;/P&gt;&lt;P&gt;- Tables: Table Name, term:table, Associated Items(table in meta)&lt;/P&gt;&lt;P&gt;- Columns: Column names, term column, Parent(tables) and associated items (from SAS meta)&lt;/P&gt;&lt;P&gt;All additional attributes will be manage manually by the business (data Steward)&lt;/P&gt;&lt;P&gt;(e.g. Description, Requirement, from term: Data Owner, Data Steward, Data Custodian, ...)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Base on my reseach, I start to believe there is no existing method proposed by SAS to perfor this load. I would need to build my own XML builder to prepare my data model import.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Question:&amp;nbsp;&lt;/P&gt;&lt;P&gt;1- Is there a more mature method to load this data if it already include in SAS metadata?&lt;/P&gt;&lt;P&gt;2- if not, what would be the best way to extract the metadata (with IDs) from a shared data if I would like to develop a script to "stitch" my schema to my SAS meta elements.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 28 Apr 2020 15:29:55 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Business-Data-Network-Importing-data-structure-already-existing/m-p/643653#M19170</guid>
      <dc:creator>Sommeit</dc:creator>
      <dc:date>2020-04-28T15:29:55Z</dc:date>
    </item>
    <item>
      <title>How to drop specific part of the text</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-drop-specific-part-of-the-text/m-p/643624#M19166</link>
      <description>&lt;P&gt;Hello,&amp;nbsp;&lt;SPAN style="font-family: inherit;"&gt;I came across the following problem in the example data:&lt;/SPAN&gt;&lt;/P&gt;&lt;DIV class="tlid-results-container results-container"&gt;&lt;DIV class="tlid-result result-dict-wrapper"&gt;&lt;DIV class="result tlid-copy-target"&gt;&lt;DIV class="text-wrap tlid-copy-target"&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;ID&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;003&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;0101&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;150&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;00070&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;My case is to get ID without any '0' only at the beginning, so my result should be:&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;ID&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;3&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;101&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;150&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;70&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV class="result-shield-container tlid-copy-target"&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;How can I achieve that?&amp;nbsp;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN class="tlid-translation translation"&gt;&lt;SPAN class=""&gt;Have a nice day &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;/DIV&gt;&lt;/DIV&gt;&lt;/DIV&gt;&lt;/DIV&gt;&lt;/DIV&gt;</description>
      <pubDate>Tue, 28 Apr 2020 14:52:01 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-drop-specific-part-of-the-text/m-p/643624#M19166</guid>
      <dc:creator>PatrykSAS</dc:creator>
      <dc:date>2020-04-28T14:52:01Z</dc:date>
    </item>
    <item>
      <title>multivaraite data to univariate with varaible names as count</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/multivaraite-data-to-univariate-with-varaible-names-as-count/m-p/642959#M19159</link>
      <description>&lt;P&gt;Hello everyone,&lt;/P&gt;&lt;P&gt;I have a data structured as&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="sapasupu_0-1587852503609.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/38709iC3D04DC381C3198F/image-size/medium?v=1.0&amp;amp;px=400" title="sapasupu_0-1587852503609.png" alt="sapasupu_0-1587852503609.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I want to convert it into univariate with a new variable as below&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="sapasupu_1-1587852607398.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/38710i4FED23DCED6AB506/image-size/medium?v=1.0&amp;amp;px=400" title="sapasupu_1-1587852607398.png" alt="sapasupu_1-1587852607398.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;How can I able to achieve that? Tried MV to UV conversion but the Item variable displays only the counts of the Items but not the variable names. Also, PROC TRANSPOSE didn't work. Any clues or help will be appreciated.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sat, 25 Apr 2020 22:12:38 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/multivaraite-data-to-univariate-with-varaible-names-as-count/m-p/642959#M19159</guid>
      <dc:creator>sapasupu</dc:creator>
      <dc:date>2020-04-25T22:12:38Z</dc:date>
    </item>
    <item>
      <title>Recode daily date data</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Recode-daily-date-data/m-p/642774#M19155</link>
      <description>&lt;P&gt;Dear SAS users,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am trying to analyze daily data from a longitudinal study. I would like to code the calendar dates into study week and days (i.e., # of weeks and days they've been enrolled in the study since the first visit). All participants have varying start dates and varying number of days they were in the study. Currently the data I have look like this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE border="1"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;StudyID&lt;/TD&gt;&lt;TD&gt;Date&lt;/TD&gt;&lt;TD&gt;Steps&lt;/TD&gt;&lt;TD&gt;Goal&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;4/20/2020&lt;/TD&gt;&lt;TD&gt;3001&lt;/TD&gt;&lt;TD&gt;3000&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;4/21/2020&lt;/TD&gt;&lt;TD&gt;2987&lt;/TD&gt;&lt;TD&gt;3000&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;4/22/2020&lt;/TD&gt;&lt;TD&gt;4313&lt;/TD&gt;&lt;TD&gt;3500&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;4/23/2020&lt;/TD&gt;&lt;TD&gt;4380&lt;/TD&gt;&lt;TD&gt;3500&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3/13/2020&lt;/TD&gt;&lt;TD&gt;6098&lt;/TD&gt;&lt;TD&gt;6200&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3/14/2020&lt;/TD&gt;&lt;TD&gt;7022&lt;/TD&gt;&lt;TD&gt;6200&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3/15/2020&lt;/TD&gt;&lt;TD&gt;5980&lt;/TD&gt;&lt;TD&gt;6400&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;3/16/2020&lt;/TD&gt;&lt;TD&gt;5750&lt;/TD&gt;&lt;TD&gt;6400&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would like for it to look like this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE border="1"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;StudyID&lt;/TD&gt;&lt;TD&gt;StudyWeek&lt;/TD&gt;&lt;TD&gt;StudyDay&lt;/TD&gt;&lt;TD&gt;Steps&lt;/TD&gt;&lt;TD&gt;Goal&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;3001&lt;/TD&gt;&lt;TD&gt;3000&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2987&lt;/TD&gt;&lt;TD&gt;3500&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;4313&lt;/TD&gt;&lt;TD&gt;3500&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;4380&lt;/TD&gt;&lt;TD&gt;3000&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;6098&lt;/TD&gt;&lt;TD&gt;6200&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;7022&lt;/TD&gt;&lt;TD&gt;6200&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;5980&lt;/TD&gt;&lt;TD&gt;6400&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;4&lt;/TD&gt;&lt;TD&gt;5750&lt;/TD&gt;&lt;TD&gt;6400&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Typically, I would post code I have that I need help with, but I don't know where to even start to do this. Any help is very much appreciated!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Best,&lt;/P&gt;&lt;P&gt;Stephanie&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 24 Apr 2020 20:04:18 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Recode-daily-date-data/m-p/642774#M19155</guid>
      <dc:creator>srobinson5</dc:creator>
      <dc:date>2020-04-24T20:04:18Z</dc:date>
    </item>
    <item>
      <title>Document Conversion (.xlsx to .txt) in Data Management Studio 2.5</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Document-Conversion-xlsx-to-txt-in-Data-Management-Studio-2-5/m-p/642668#M19154</link>
      <description>&lt;P&gt;I am trying to write a quick Data Job that will convert an .xlxs file to a delimited .txt file.&lt;/P&gt;&lt;P&gt;I am using the&amp;nbsp;Document Conversion Node for my input file and Document Extraction node and then output to a text file.&lt;/P&gt;&lt;DIV class="mceNonEditable lia-copypaste-placeholder"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;P&gt;Attached is a sample of the output that has been converted to Excel.&amp;nbsp; I cannot figure out why the unique lines continuously repeat.&amp;nbsp; A file of 300,000 records explodes to several million as a result.&amp;nbsp; I reviewed the SAS documentation for this process and am doing everything as it says to.&amp;nbsp; Unfortunately, I cannot find help anywhere else.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would appreciate any guidance on making this perform as I expect.&amp;nbsp; Thank you.&lt;/P&gt;&lt;DIV class="mceNonEditable lia-copypaste-placeholder"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 24 Apr 2020 16:47:24 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Document-Conversion-xlsx-to-txt-in-Data-Management-Studio-2-5/m-p/642668#M19154</guid>
      <dc:creator>CAHarbison</dc:creator>
      <dc:date>2020-04-24T16:47:24Z</dc:date>
    </item>
    <item>
      <title>The data specification could not be created</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/The-data-specification-could-not-be-created/m-p/642543#M19150</link>
      <description>&lt;P&gt;Hi I'm trying to import csv file from my local computer to model studio and create a forecasting project, but unfortunately an error message occur stated that my&amp;nbsp;data specification could not be created: The input data has no valid candidates for the time variable. Specify a table with a column that has a valid date format. Kindly assist me on how to solve the issues. Im sure that my csv file got the date column but somehow it was not recognized by SAS. Below is the picture of my dataset and error message.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="error.PNG" style="width: 999px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/38634i45E326073A2FD687/image-size/large?v=1.0&amp;amp;px=999" title="error.PNG" alt="error message" /&gt;&lt;span class="lia-inline-image-caption" onclick="event.preventDefault();"&gt;error message&lt;/span&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Capture.PNG" style="width: 399px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/38635i3C0FF81C6AC8DE2B/image-size/large?v=1.0&amp;amp;px=999" title="Capture.PNG" alt="dataset from my csv" /&gt;&lt;span class="lia-inline-image-caption" onclick="event.preventDefault();"&gt;dataset from my csv&lt;/span&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 24 Apr 2020 12:05:54 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/The-data-specification-could-not-be-created/m-p/642543#M19150</guid>
      <dc:creator>Aiman</dc:creator>
      <dc:date>2020-04-24T12:05:54Z</dc:date>
    </item>
    <item>
      <title>Output both matching and non-matching rows in a single table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Output-both-matching-and-non-matching-rows-in-a-single-table/m-p/642004#M19147</link>
      <description>&lt;P&gt;How can I adjust this working code so the observations in column "patient" from table "enroll" that don't match the observations in column "patientID" from table "calendar" are output into another table? In addition, would it be possible to output both matching observations and non-matching observations in the same table? The current working code only returns a table of matching observations based on "patient" and patientID", and "dateassigned" and "datetracked".&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data enroll;
    input patient status :$12. dateassigned &amp;amp;:anydtdte.;
    format date yymmdd10.;
    datalines;

500-001   enrolled    01-jan-2019      
500-002   enrolled    15-jan-2019     
500-003   removed     23-Jan-2019     
500-004   enrolled    05-feb-2019     
500-005   enrolled    17-feb-2019     
587-001   enrolled    20-feb-2019
587-002   enrolled    25-feb-2019
587-003   enrolled    03-mar-2019
594-001   enrolled    04-feb-2018
594-002   enrolled    09-feb-2018
648-001   enrolled    15-mar-2019
648-002   enrolled    22-mar-2019
648-003   enrolled    27-mar-2019
648-004   enrolled    30-mar-2019
;

data calendar;
    input visitnumber patientID :$12. datetracked &amp;amp;:anydtdte.;
    format date yymmdd10.;
    datalines;

500 500-001-rdf   01-jan-2019      
500 500-002-fgh   15-jan-2019     
500 500-003-ehd   23-Jan-2019     
500 500-004-ern   05-feb-2019     
500 500-005-qmd   17-feb-2019     
587 587-001-wcs   20-feb-2019
587 587-002-qlc   25-feb-2019
587 587-003-qhr   03-mar-2019
594 594-001-qwn   04-feb-2018
594 594-002-agj   09-feb-2018
648 648-001-wuf   15-mar-2019
648 648-002-qbf   22-mar-2019
648 648-003-olr   27-mar-2019
648 648-004-wmf   30-mar-2019
;

proc sql;
	create table want as
		select 
			enrl.*
			, cal.*
			from 
				enroll enrl
				, calendar cal
				where
					enrl.patient = substr(cal.patientID,1,8) and
					enrl.dateassigned = cal.datetracked
	;
quit;&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Wed, 22 Apr 2020 15:52:04 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Output-both-matching-and-non-matching-rows-in-a-single-table/m-p/642004#M19147</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-04-22T15:52:04Z</dc:date>
    </item>
    <item>
      <title>Set unmapped fields to null/different fields on input to output?</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Set-unmapped-fields-to-null-different-fields-on-input-to-output/m-p/641732#M19142</link>
      <description>&lt;P&gt;Hi, So I'm doing this in DI Studio although as User Written Code. Hopefully this makes sense, there's an example at the end to hopefully help.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The situation is each supplier needs different fields (from a selection of ~50) but all suppliers get the same output delimited file, just the fields they don't need are blank. I had it working when each supplier just needed their fields but we now need everyone to get the same template, with just their fields populated. For example one supplier may need everyones Name, Address, DoB, another might just need Names and DoBs etc. It'll be a different combination for each and even the order may change (field names will match though) but the final output will be a fixed order and field list.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I've setup the 1st part, so I have a UWC transformation with my full table as an input and a variable of the fields they need, it applies this (using a data step and keep) and gives me an output table of just the fields they are entitled to.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The second part I'm struggling with. So far I've tried a data step that sets the output with the full field list and all attribs which is fine, after that I use the below data step hoping it would just map the ones needed into my table and leave the rest as null, it didn't, it errors due to them being missing on the input:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data &amp;amp;_output.;
set &amp;amp;_input.;
run;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;However trying to open the output table fails due to all the missing columns, what's needed is for it to do that and set all the unmapped fields to nulls. Essentially the process is:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Full Table &amp;gt; UWC - Filtered view of customer required fields &amp;gt; UWC - Map filtered list back, set non required to null &amp;gt; Full table (which will be exported to a delimited file), we can go straight to that and skip the final table if needed.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Any ideas thanks? Hopefully this helps, the filtered table will be different everytime:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;In Table (50 fields):&lt;/STRONG&gt;&lt;/P&gt;
&lt;TABLE&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD&gt;Name&lt;/TD&gt;
&lt;TD&gt;Address&lt;/TD&gt;
&lt;TD&gt;Postcode&lt;/TD&gt;
&lt;TD&gt;DoB&lt;/TD&gt;
&lt;TD&gt;Colour&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;John Smith&lt;/TD&gt;
&lt;TD&gt;1 made up street&lt;/TD&gt;
&lt;TD&gt;1MDUS&lt;/TD&gt;
&lt;TD&gt;01/01/2000&lt;/TD&gt;
&lt;TD&gt;Blue&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;Jane Doe&lt;/TD&gt;
&lt;TD&gt;2 Fake Place&lt;/TD&gt;
&lt;TD&gt;2FAPL&lt;/TD&gt;
&lt;TD&gt;31/12/2005&lt;/TD&gt;
&lt;TD&gt;Red&amp;nbsp;&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Filtered view (diff for each supplier and field order can change but names will always match, 1-50 fields)&lt;/STRONG&gt;&lt;/P&gt;
&lt;TABLE&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD&gt;Postcode&lt;/TD&gt;
&lt;TD&gt;DoB&lt;/TD&gt;
&lt;TD&gt;Name&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;1MDUS&lt;/TD&gt;
&lt;TD&gt;01/01/2000&lt;/TD&gt;
&lt;TD&gt;John Smith&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;2FAPL&lt;/TD&gt;
&lt;TD&gt;31/12/2005&lt;/TD&gt;
&lt;TD&gt;Jane Doe&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Output (50 fields), can be sas table, ultimately will be a delimited file output):&lt;/STRONG&gt;&lt;/P&gt;
&lt;TABLE&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD&gt;Name&lt;/TD&gt;
&lt;TD&gt;Address&lt;/TD&gt;
&lt;TD&gt;Postcode&lt;/TD&gt;
&lt;TD&gt;DoB&lt;/TD&gt;
&lt;TD&gt;Colour&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;John Smith&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;1MDUS&lt;/TD&gt;
&lt;TD&gt;01/01/2000&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;Jane Doe&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;2FAPL&lt;/TD&gt;
&lt;TD&gt;31/12/2005&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;</description>
      <pubDate>Tue, 21 Apr 2020 19:56:45 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Set-unmapped-fields-to-null-different-fields-on-input-to-output/m-p/641732#M19142</guid>
      <dc:creator>MRDM</dc:creator>
      <dc:date>2020-04-21T19:56:45Z</dc:date>
    </item>
    <item>
      <title>Join two tables together based on similar but not equal columns in SAS</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Join-two-tables-together-based-on-similar-but-not-equal-columns/m-p/641632#M19137</link>
      <description>&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data enroll;
    input patient status :$12. dateassigned &amp;amp;:anydtdte.;
    format date yymmdd10.;
    datalines;

500-001   enrolled    01-jan-2019      
500-002   enrolled    15-jan-2019     
500-003   removed     23-Jan-2019     
500-004   enrolled    05-feb-2019     
500-005   enrolled    17-feb-2019     
587-001   enrolled    20-feb-2019
587-002   enrolled    25-feb-2019
587-003   enrolled    03-mar-2019
594-001   enrolled    04-feb-2018
594-002   enrolled    09-feb-2018
648-001   enrolled    15-mar-2019
648-002   enrolled    22-mar-2019
648-003   enrolled    27-mar-2019
648-004   enrolled    30-mar-2019
;

data calendar;
    input visitnumber patientID :$12. datetracked &amp;amp;:anydtdte.;
    format date yymmdd10.;
    datalines;

500 500-001-rdf   01-jan-2019      
500 500-002-fgh   15-jan-2019     
500 500-003-ehd   23-Jan-2019     
500 500-004-ern   05-feb-2019     
500 500-005-qmd   17-feb-2019     
587 587-001-wcs   20-feb-2019
587 587-002-qlc   25-feb-2019
587 587-003-qhr   03-mar-2019
594 594-001-qwn   04-feb-2018
594 594-002-agj   09-feb-2018
648 648-001-wuf   15-mar-2019
648 648-002-qbf   22-mar-2019
648 648-003-olr   27-mar-2019
648 648-004-wmf   30-mar-2019
;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;Hi all,&amp;nbsp;&lt;SPAN&gt;I am trying to join these two these tables together. I want the columns to align where the column “patient” in table “enroll” matches the column “patientID” in table “calendar”. "patient" and "patientID" are similar but not equal. In addition, the column “dateassigned” in table “enroll” should match the column “datetracked” in table “calendar”. Since this is only a snippet of a larger dataset, the columns should align together. However, this may not be the case in the larger dataset, so I am trying to highlight where the columns from both tables do not align. Thank you in advance.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 21 Apr 2020 14:26:12 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Join-two-tables-together-based-on-similar-but-not-equal-columns/m-p/641632#M19137</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-04-21T14:26:12Z</dc:date>
    </item>
    <item>
      <title>SAS Viya Data Explorer</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SAS-Viya-Data-Explorer/m-p/639275#M19127</link>
      <description>&lt;P&gt;Can anyone suggest how to increase the size of data explorer? I have attached the&amp;nbsp; document for it&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="dx.png" style="width: 999px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/38238i2E673BFA3EC840B2/image-size/large?v=1.0&amp;amp;px=999" title="dx.png" alt="dx.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Sun, 12 Apr 2020 15:25:03 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SAS-Viya-Data-Explorer/m-p/639275#M19127</guid>
      <dc:creator>suyashucd</dc:creator>
      <dc:date>2020-04-12T15:25:03Z</dc:date>
    </item>
    <item>
      <title>Importing multiple CSV files using generic SAS code</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Importing-multiple-CSV-files-using-generic-SAS-code/m-p/638583#M19115</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I'm trying to import multiple csv files in SAS DI Studio 4.9. Each csv file has different layout and these contain a trailer record at the end of each file. The layout of the trailer record is common across each file which contain the information like number of records in the file, time of generation etc. Preference is to create only 1 job to for this purpose and not have multiple branches of transformation in the job for each type of csv file. I believe it is feasible to read a csv with unknown column layout using PROC IMPORT but the trailer record is causing issues.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is it feasible to have a generic code to read multiple csv files with trailer records?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Jinender&lt;/P&gt;</description>
      <pubDate>Thu, 09 Apr 2020 05:04:02 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Importing-multiple-CSV-files-using-generic-SAS-code/m-p/638583#M19115</guid>
      <dc:creator>jinendergulati</dc:creator>
      <dc:date>2020-04-09T05:04:02Z</dc:date>
    </item>
    <item>
      <title>Merging datasets</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Merging-datasets/m-p/638488#M19111</link>
      <description>&lt;P&gt;How can I set these data sets with names: Fit_1, Fit_2, Fit_5, ..., Fit_1000.&lt;/P&gt;&lt;P&gt;I can not run with this:&lt;/P&gt;&lt;P&gt;##############################&lt;/P&gt;&lt;P&gt;data fit;&lt;/P&gt;&lt;P&gt;set Fit_1 - Fit_100;&lt;/P&gt;&lt;P&gt;output;&lt;/P&gt;&lt;P&gt;run;&lt;/P&gt;&lt;P&gt;##############################&lt;/P&gt;&lt;P&gt;Error: work.fit_3 does not exist.&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2020 20:41:57 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Merging-datasets/m-p/638488#M19111</guid>
      <dc:creator>mezerji</dc:creator>
      <dc:date>2020-04-08T20:41:57Z</dc:date>
    </item>
    <item>
      <title>Return a table based on simply query</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Return-a-table-based-on-simply-query/m-p/638444#M19109</link>
      <description>&lt;P&gt;Hi, I am trying to perform a query that returns the dates less than/prior to the current date in the column "date". This should form a new column called "past".&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data have;
input Subject Type :$12. Date &amp;amp;:anydtdte.;
format date yymmdd10.;
datalines;

500   Initial    15 AUG 2017      
500   Initial    15 AUG 2017    
500   Followup   15 AUG 2018    
428   Followup    15 AUG 2018     
765   Seventh     3 AUG 2018      
500   Followup    3 JUL 2018      
428   Initial     3 JUL 2017    
765   Initial     20 JUL 2019   
610   Third       20 AUG 2019    
610   Initial     17 Mar 2018    
;

&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Wed, 08 Apr 2020 19:06:41 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Return-a-table-based-on-simply-query/m-p/638444#M19109</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-04-08T19:06:41Z</dc:date>
    </item>
    <item>
      <title>Filtering values prior to query</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Filtering-values-prior-to-query/m-p/638412#M19108</link>
      <description>&lt;P&gt;Hi, I want to calculate the current value divided by the minimum value in column "measurement" from visits prior to the current visit in column "type". However, first I need to use the column "jsw" to filter&lt;SPAN&gt;&amp;nbsp;for visits prior to the current visit value, and then calculate the minimum from those prior visits. This would require first searching for the minimum" jsw" value, then looking for records specific to the where "jsw" is less than the minimum "jsw" value found and then from that subset of records, pull the minimum "measurement" value for the division calculation. However, I cannot figure out to implement this in my current working code. Any solutions to this matter would be appreciated.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data have;
	input Subject Type :$12. Date &amp;amp;:anydtdte. jsw procedure :$12. measurement;
	format date yymmdd10.;
	datalines;

Subject Type      Date                jsw                procedure     measurement      
500   Initial    15 AUG 2017            6                   Invasive        20        
500   Initial    15 AUG 2017            9                   Surface         35        
500   Followup   15 AUG 2018            8                   Invasive        54       
428   Followup    15 AUG 2018          56                   Outer           29        
765   Seventh     3 AUG 2018           12                   Other           13     
500   Followup    3 JUL 2018           23                   surface         98    
428   Initial     3 JUL 2017           34                   Outer           10    
765   Initial     20 JUL 2019          4                    Other           19     
610   Third       20 AUG 2019         58                    Invasive        66   
610   Initial     17 Mar 2018         25                    Invasive        17    

PROC SQL;
    create table want as
    select a.*,
        ((select measurement as lastmeasurement
         from have
         where Subject = a.Subject and procedure= a.procedure 
         having jsw = max(jsw)) -min(a.measurement))/ min(a.measurement)*100 as WantedPercent
    from have as a
    group by Subject, type
    order by Subject, type, date;
QUIT;&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Wed, 08 Apr 2020 17:40:49 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Filtering-values-prior-to-query/m-p/638412#M19108</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-04-08T17:40:49Z</dc:date>
    </item>
    <item>
      <title>Dividing within a column based on other columns matching</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Dividing-within-a-column-based-on-other-columns-matching/m-p/638192#M19099</link>
      <description>&lt;P&gt;Hi all, I am trying to calculate the current measurement in column “Total" minus the lowest measurement previously recorded in column "Total" where the current measurement in column “Total” corresponding to the value in column "Trade" is less than(&amp;lt;) the minimum measurement in column “Total” &amp;nbsp;corresponding to the value in column "trade", and if &amp;nbsp;two values in the “SUBJECT” column match and two values in the “PROCEDURE” column match. To emphasize, the minimum value must be a previously recorded. If the measurement is less than the current measurement but was not recorded previously (according to the “date” column), it does not quality to be subtracted from the current measurement. An example of the output is provided below.&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data Have;
	input Subject Type :$12. Date &amp;amp;:anydtdte. Trade Procedure :$12. Measurement;
	format date yymmdd10.;
	datalines;

Subject Type      Date                Trade                Procedure     Total      
500   Initial    15 AUG 2017            6                   Invasive        20        
500   Initial    15 AUG 2017            9                   Surface         35        
500   Followup   15 AUG 2018            8                   Invasive        54       
428   Followup    15 AUG 2018          56                   Outer           29        
765   Seventh     3 AUG 2018           12                   Other           13     
500   Followup    3 JUL 2018           23                   surface         98    
428   Initial     3 JUL 2017           34                   Outer           10    
765   Initial     20 JUL 2019          4                    Other           19     
610   Third       20 AUG 2019         58                    Invasive        66   
610   Initial     17 Mar 2018         25                    Invasive        17     

*Example of Output;

Subject Type      Date                Trade                Procedure     Total        Output
500   Initial    15 AUG 2017            6                   Invasive        20        20/20
500   Initial    15 AUG 2017            9                   Surface         35        35/35
500   Followup   15 AUG 2018            8                   Invasive        54        54/20
428   Followup    15 AUG 2018          56                   Outer           29       29/10
765   Seventh     3 AUG 2018           12                   Other           13       13/19
500   Followup    3 JUL 2018           23                   surface         98       98/35
428   Initial     3 JUL 2017           34                   Outer           10       10/10
765   Initial     20 JUL 2019           4                   Other           19       19/19
610   Third       20 AUG 2019          58                   Invasive        66       66/17
610   Initial     17 Mar 2018          25                   Invasive        17       17/17
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Tue, 07 Apr 2020 23:39:18 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Dividing-within-a-column-based-on-other-columns-matching/m-p/638192#M19099</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-04-07T23:39:18Z</dc:date>
    </item>
    <item>
      <title>Conditional PROC SQL</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Conditional-PROC-SQL/m-p/638110#M19093</link>
      <description>&lt;P&gt;Hi there!&lt;/P&gt;&lt;P&gt;I am trying to call a PROC SQL or another based on an input variable.&lt;/P&gt;&lt;P&gt;My input variable is like YYYYMM, and if MM is equal to 01 then I do a new table else I need to merge with the table from the month before.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is there anyway to do this?&lt;/P&gt;&lt;P&gt;I tried but I keep failing it doesn't create a table.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks in advance.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The code I tried was something like:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;%let month=%substr(&amp;amp;delta,5,2);
%macro january;
proc sql;
	create table WORK.RH_SOLV_RECEITA_EMITIDA_&amp;amp;delta as 
 	select *
		from WORK.SOLV_RECEITA_EMITIDA_MES;
quit;
%mend janeiro; 
%macro outros;
proc sql;
	create table WORK.RH_SOLV_RECEITA_EMITIDA_&amp;amp;delta as 
	select * FROM WORK.SOLV_RECEITA_EMITIDA_OLD_MES
	outer union corr 
	select * FROM WORK.SOLV_RECEITA_EMITIDA_MES;
quit;
%mend outros;
data _null_;
	if &amp;amp;month = '01' then call execute('%janeiro');
	if &amp;amp;delta &amp;lt;&amp;gt; '01' then call execute('%outros');
run;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 07 Apr 2020 15:34:00 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Conditional-PROC-SQL/m-p/638110#M19093</guid>
      <dc:creator>RicHen</dc:creator>
      <dc:date>2020-04-07T15:34:00Z</dc:date>
    </item>
    <item>
      <title>How to join two columns from one table to a different table based matching criteria</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-join-two-columns-from-one-table-to-a-different-table/m-p/637964#M19089</link>
      <description>&lt;P&gt;Hi, I am trying to join the columns "Type2" and "Measurement2" from table "Update" to the table "Have". I want the columns to align where column "Subject1" in table "Have" matches column "Subject2" in table "update", and column "Procedure1" in table "Have" matches column "Procedure2" in table "Update". Thank you in advance.&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data Have;
	input Subject1 Type1 :$12. Date1 &amp;amp;:anydtdte. Procedure1 :$12. Measurement1;
	format date yymmdd10.;
	datalines;

500   Initial    15 AUG 2017      Invasive    20 
500   Initial    15 AUG 2017     Surface      35   
428   Initial     3 JUL 2017     Outer        10 
765   Initial     20 JUL 2019     Other       19  
610   Initial     17 Mar 2018     Invasive    17 
;

data Update;
	input Subject2 Type2 :$12. Date2 &amp;amp;:anydtdte. Procedure2 :$12. Measurement2;
	format date yymmdd10.;
	datalines;

500   Followup   15 AUG 2018     Invasive     54 
428   Followup    15 AUG 2018      Outer      29 
765   Seventh     3 AUG 2018      Other       13 
500   Followup    3 JUL 2018      Surface     98 
610   Third       20 AUG 2019     Invasive    66  
;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 07 Apr 2020 00:15:04 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-join-two-columns-from-one-table-to-a-different-table/m-p/637964#M19089</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-04-07T00:15:04Z</dc:date>
    </item>
    <item>
      <title>Dataflux Conditional Expression with ODBC connection</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Dataflux-Conditional-Expression-with-ODBC-connection/m-p/637856#M19087</link>
      <description>&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am new in Dataflux and I am trying to figure out how to do a conditional logic with an ODBC connection using multiple columns from the external data source. There are multiple columns in the RULES table and I need to use them in a conditional logic.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This is what I have so far.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;static dbconnection db1
static dbstatement stmt1
dbcursor curs1
boolean retval1

 if isnull(db1) then
   db1 =  dbconnect('DSN=MS Access Database;DFXTYPE=ODBC')
if isnull(stmt1) then
begin
   stmt1 = db1.prepare("SELECT * FROM "RULES" WHERE ISALPHA = ?")
   stmt1.setparaminfo(0, 'string', 5)
end

 
&amp;lt;&amp;lt;missing comparison&amp;gt;&amp;gt;

 
 
if retval1 == true then
return true
else
return false&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 06 Apr 2020 14:43:53 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Dataflux-Conditional-Expression-with-ODBC-connection/m-p/637856#M19087</guid>
      <dc:creator>Vinz867</dc:creator>
      <dc:date>2020-04-06T14:43:53Z</dc:date>
    </item>
    <item>
      <title>Update/insert the record and maintain the history</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Update-insert-the-record-and-maintain-the-history/m-p/637658#M19080</link>
      <description>&lt;P&gt;We've a process where clients will update the file (say sales details) when there is a need and they intimate us about the change. So our job is to load those updated records to SQL DB table by maintaing the history.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;For the first time while loading the data to SQL we just did a 'proc append' to SQL table after reading the file via data step infile Statement. However this method will not work to track the history in SQL when there is a update in the file.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I'm not Aware of SCDs as well.&amp;nbsp; We are not using any Primary/foreign key stuff in SQL.I would like to understand how I can tackle the situation to update the history. I tried with 'Proc SQL update' and 'If first. and last.' but it didn't helped either. Appericiate your help here. I'm OK with any approach to implement this exercise.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Assume I've a file like this.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE border="0" width="576" cellspacing="0" cellpadding="0"&gt;&lt;COLGROUP&gt;&lt;COL span="8" width="72" /&gt;&lt;/COLGROUP&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="72" height="20"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Id&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;&amp;nbsp;Name&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Sales&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Transaction&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Example Values&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Act_Ind&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Valid_From&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Valid_To&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD height="19" align="right"&gt;&lt;FONT color="#000000"&gt;1&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Unit&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;True&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;6&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;58001&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Y&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;19000101T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;99991231T235959&lt;/FONT&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD height="19" align="right"&gt;&lt;FONT color="#000000"&gt;2&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Key&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;True&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;6&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;121216&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Y&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;19000101T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;99991231T235959&lt;/FONT&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD height="19" align="right"&gt;&lt;FONT color="#000000"&gt;3&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Value&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;True&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;18&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;820595,2&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Y&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;19000101T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;99991231T235959&lt;/FONT&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Assume I've received an update to the file like this. If you notice Transaction value has changed for Key from 6 to 8.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE border="0" width="576" cellspacing="0" cellpadding="0"&gt;&lt;COLGROUP&gt;&lt;COL span="8" width="72" /&gt;&lt;/COLGROUP&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="72px" height="20"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Id&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="70.4px"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;&amp;nbsp;Name&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="47.2px"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Sales&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="64.8px"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Transaction&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="76.8px"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Example Values&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="69.6px"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Act_Ind&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="145.6px"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Valid_From&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="145.6px"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Valid_To&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="72px" height="19" align="right"&gt;&lt;FONT color="#000000"&gt;2&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD width="70.4px"&gt;&lt;FONT color="#000000"&gt;Key&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD width="47.2px"&gt;&lt;FONT color="#000000"&gt;True&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD width="64.8px" align="right"&gt;&lt;FONT color="#000000"&gt;8&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD width="76.8px" align="right"&gt;&lt;FONT color="#000000"&gt;121216&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD width="69.6px"&gt;&lt;FONT color="#000000"&gt;Y&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD width="145.6px"&gt;&lt;FONT color="#000000"&gt;20200404T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD width="145.6px"&gt;&lt;FONT color="#000000"&gt;99991231T235959&lt;/FONT&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;So now I want data like this SQL table.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE border="0" width="576" cellspacing="0" cellpadding="0"&gt;&lt;COLGROUP&gt;&lt;COL span="8" width="72" /&gt;&lt;/COLGROUP&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="72" height="20"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Id&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;&amp;nbsp;Name&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Sales&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Transaction&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Example Values&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Act_Ind&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Valid_From&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="72"&gt;&lt;STRONG&gt;&lt;FONT color="#000000"&gt;Valid_To&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD height="19" align="right"&gt;&lt;FONT color="#000000"&gt;1&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Unit&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;True&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;6&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;58001&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Y&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;19000101T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;99991231T235959&lt;/FONT&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD height="19" align="right"&gt;&lt;FONT color="#000000"&gt;2&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Key&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;True&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;6&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;121216&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;N&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;19000101T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;20200403T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD height="19" align="right"&gt;&lt;FONT color="#000000"&gt;2&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Key&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;True&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;8&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;121216&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Y&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;20200404T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;99991231T235959&lt;/FONT&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD height="19" align="right"&gt;&lt;FONT color="#000000"&gt;3&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Value&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;True&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;18&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD align="right"&gt;&lt;FONT color="#000000"&gt;820595,2&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;Y&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;19000101T000000&lt;/FONT&gt;&lt;/TD&gt;
&lt;TD&gt;&lt;FONT color="#000000"&gt;99991231T235959&lt;/FONT&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;</description>
      <pubDate>Sun, 05 Apr 2020 08:20:17 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Update-insert-the-record-and-maintain-the-history/m-p/637658#M19080</guid>
      <dc:creator>David_Billa</dc:creator>
      <dc:date>2020-04-05T08:20:17Z</dc:date>
    </item>
    <item>
      <title>Calculation of values that rely on a date variable</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Calculation-of-values-that-rely-on-a-date-variable/m-p/637309#M19075</link>
      <description>&lt;P&gt;Hi, I am trying to calculate t&lt;SPAN&gt;he value of the last&amp;nbsp;measurement&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;taken (according to the date column) divided by the&amp;nbsp;lowest&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;value recorded (according to the measurement column)&amp;nbsp;&amp;nbsp;if two values in the “SUBJECT” column match and two values in the “PROCEDURE” column match. The the calculation would be produced in a new column. I am having trouble with this and I would appreciate a solution to this matter.&lt;/SPAN&gt;&lt;/P&gt;&lt;PRE class="language-sas"&gt;&lt;CODE&gt;data Have;
	input Subject Type :$12. Date &amp;amp;:anydtdte. Procedure :$12. Measurement;
	format date yymmdd10.;
	datalines;

500   Initial    15 AUG 2017      Invasive    20 
500   Initial    15 AUG 2017     Surface      35 
500   Followup   15 AUG 2018     Invasive     54 
428   Followup    15 AUG 2018      Outer      29 
765   Seventh     3 AUG 2018      Other       13 
500   Followup    3 JUL 2018      Surface     98 
428   Initial     3 JUL 2017     Outer        10 
765   Initial     20 JUL 2019     Other       19 
610   Third       20 AUG 2019     Invasive    66 
610   Initial     17 Mar 2018     Invasive    17 
;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 03 Apr 2020 17:12:42 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Calculation-of-values-that-rely-on-a-date-variable/m-p/637309#M19075</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-04-03T17:12:42Z</dc:date>
    </item>
    <item>
      <title>VLOOKUP dataset with file</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/VLOOKUP-dataset-with-file/m-p/637154#M19072</link>
      <description>&lt;P&gt;I have been trying to solve this for awhile and I really need help with DF.&lt;/P&gt;&lt;P&gt;I have file with 5 columns, I need to validate this file against a whitelist file using an expression.&lt;/P&gt;&lt;P&gt;I cannot do a left or right join because it will create additional rows that are not in the original file.&lt;/P&gt;&lt;P&gt;Inner join will remove rows that I need. I also tried a seekbegin() to the whitelist file but I don't think I'm doing it right. Any help will be appreciated!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Pre-expression:&lt;/P&gt;&lt;P&gt;string VALIDITY&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;file f&lt;BR /&gt;string curr_line&lt;BR /&gt;string COLUMNA&lt;BR /&gt;string COLUMNB&lt;BR /&gt;integer COLUMNC&lt;/P&gt;&lt;P&gt;f.open("\\whitelist.csv","r")&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here is the expression:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;begin&lt;BR /&gt;f.seekbegin(0)&lt;BR /&gt;curr_line= f.readline()&lt;BR /&gt;while (curr_line)&lt;BR /&gt;parse(COLUMNA,",",COLUMNB,",",COLUMNC)&lt;BR /&gt;begin&lt;BR /&gt;curr_line= f.readline()&lt;BR /&gt;while isnull(VALIDITY)&lt;BR /&gt;begin&lt;BR /&gt;&amp;lt;conditional logic&amp;gt;&lt;BR /&gt;then VALIDITY = 'VALID'&lt;BR /&gt;else VALIDITY = 'INVALID'&lt;BR /&gt;end&amp;nbsp;&lt;BR /&gt;end&lt;BR /&gt;end&lt;/P&gt;</description>
      <pubDate>Fri, 03 Apr 2020 13:07:48 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/VLOOKUP-dataset-with-file/m-p/637154#M19072</guid>
      <dc:creator>Vinz867</dc:creator>
      <dc:date>2020-04-03T13:07:48Z</dc:date>
    </item>
    <item>
      <title>Creating a new variable using two dataset with different var names and observations</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Creating-a-new-variable-using-two-dataset-with-different-var/m-p/636533#M19057</link>
      <description>&lt;P&gt;Dear all,&lt;/P&gt;&lt;P&gt;Assume there are two datasets as follow:&lt;/P&gt;&lt;P&gt;data one;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; input A B;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; cards;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; 4 6&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; 3 2&lt;/P&gt;&lt;P&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;data two;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; input C D;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; cards;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; 1 3&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; 2 5&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; 3 8&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; 4 7&lt;/P&gt;&lt;P&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As you see neither variables nor obs are equal.&lt;/P&gt;&lt;P&gt;I would like to create another variable E in dataset one which is equal to var D where A=C. Finally, I am looking for a dataset for example like this:&lt;/P&gt;&lt;P&gt;A B C D E&lt;/P&gt;&lt;P&gt;4 6 . . 7&lt;/P&gt;&lt;P&gt;3 2 . . 8&lt;/P&gt;&lt;P&gt;. .&amp;nbsp; 1 3 .&lt;/P&gt;&lt;P&gt;. .&amp;nbsp; 2 5 .&lt;/P&gt;&lt;P&gt;. .&amp;nbsp; 3 8 .&lt;/P&gt;&lt;P&gt;&amp;nbsp;. . 4 7 .&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So I can do some analyses on A and E.&lt;/P&gt;&lt;P&gt;Please let me know if I can use SAS for this reason.&lt;/P&gt;&lt;P&gt;Thank you!&lt;/P&gt;</description>
      <pubDate>Wed, 01 Apr 2020 13:26:38 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Creating-a-new-variable-using-two-dataset-with-different-var/m-p/636533#M19057</guid>
      <dc:creator>mghamari63</dc:creator>
      <dc:date>2020-04-01T13:26:38Z</dc:date>
    </item>
    <item>
      <title>SQL to Dataflux (NOOB)</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SQL-to-Dataflux-NOOB/m-p/636299#M19055</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I need some help converting this SQL to a Dataflux expression. I am trying to do a comparison between 2 files and old code in SQL used this to compare it.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;LEFT(LTRIM(RTRIM(cast(COLUMNA as varchar(1000)))),LEN(COLUMNB)) = COLUMNA&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;LEFT(LTRIM(RTRIM(SUBSTRING( LTRIM(RTRIM(CAST(COLUMNA as varchar(1000)))) , LEN(COLUMNC) + 1&amp;nbsp; ,LEN(cast(COLUMNA as varchar(1000)))-&amp;nbsp; (LEN(@COLUMNC) + 1)&amp;nbsp; ))),LEN(COLUMNB)&amp;nbsp;= COLUMNA&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Column A is from&amp;nbsp; File A&lt;/P&gt;&lt;P&gt;Column B is from File B.&lt;/P&gt;&lt;P&gt;Column C is from File B.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please help!&lt;/P&gt;</description>
      <pubDate>Tue, 31 Mar 2020 17:09:34 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SQL-to-Dataflux-NOOB/m-p/636299#M19055</guid>
      <dc:creator>Vinz867</dc:creator>
      <dc:date>2020-03-31T17:09:34Z</dc:date>
    </item>
    <item>
      <title>SAS Data Integration Studio - parametrizing job using a .csv file</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SAS-Data-Integration-Studio-parametrizing-job-using-a-csv-file/m-p/636110#M19050</link>
      <description>&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;How can we create parameters for SAS DI jobs by reading a .csv file. For example, I would like to select all rows with where clause with parameters from the .csv file.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards&lt;/P&gt;</description>
      <pubDate>Tue, 31 Mar 2020 09:15:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SAS-Data-Integration-Studio-parametrizing-job-using-a-csv-file/m-p/636110#M19050</guid>
      <dc:creator>TomekQ</dc:creator>
      <dc:date>2020-03-31T09:15:05Z</dc:date>
    </item>
    <item>
      <title>SAS Data Integration Studio - transforming all tables in a library</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SAS-Data-Integration-Studio-transforming-all-tables-in-a-library/m-p/636108#M19049</link>
      <description>&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I need to perform a transformation on all tables in a library in SAS DIS. How can I do it? Let's say that I want to delete rows based on a certain condition in every table.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards&lt;/P&gt;</description>
      <pubDate>Tue, 31 Mar 2020 07:25:03 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SAS-Data-Integration-Studio-transforming-all-tables-in-a-library/m-p/636108#M19049</guid>
      <dc:creator>TomekQ</dc:creator>
      <dc:date>2020-03-31T07:25:03Z</dc:date>
    </item>
    <item>
      <title>Convert SAS v 6.12 VAX files to windows</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Convert-SAS-v-6-12-VAX-files-to-windows/m-p/635680#M19039</link>
      <description>&lt;P&gt;I have a clinical trial data from the 1980s that is in SAS v6.12 and was created on the old DEC VAX system (DEC model 4000 MODEL 610) and I want to convert it to be able to be read in Windows. SAS technical support said they don't have any versions of SAS like this anymore. Any suggestions? Thanks.&lt;/P&gt;&lt;P&gt;RL Macdonald, MD&lt;/P&gt;&lt;P&gt;Department of Neurological Surgery&lt;/P&gt;&lt;P&gt;UCSF Fresno&lt;/P&gt;</description>
      <pubDate>Sun, 29 Mar 2020 18:12:46 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Convert-SAS-v-6-12-VAX-files-to-windows/m-p/635680#M19039</guid>
      <dc:creator>rlmacdonald</dc:creator>
      <dc:date>2020-03-29T18:12:46Z</dc:date>
    </item>
    <item>
      <title>Convert Character Date to Numeric</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Convert-Character-Date-to-Numeric/m-p/634447#M19024</link>
      <description>&lt;P&gt;Hi everyone, I am trying to convert a date variable that is stored as a character variable into a numeric variable. However, the date variable is in the format "02 AUG 2017". I want the result outputted as a new date variable that is numeric.&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data want;
  set have;
  date=input(character_date,date10.);
  format date date10.;
run;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Nothing that I have tried has worked. I would appreciate any assistance. Thank you in advance.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 24 Mar 2020 14:56:09 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Convert-Character-Date-to-Numeric/m-p/634447#M19024</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-03-24T14:56:09Z</dc:date>
    </item>
    <item>
      <title>Calculating percent change based on matching columns</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Calculating-percent-change-based-on-matching-columns/m-p/634177#M19015</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data have;
input Subject Type :$12. Date &amp;amp;:anydtdte. Procedure :$12. Measurement;
format date yymmdd10.;
datalines;

500   Initial    15 AUG 2017      Invasive    20 
500   Initial    15 AUG 2017     Surface      35 
500   Followup   15 AUG 2018     Invasive     54 
428   Followup    15 AUG 2018      Outer      29 
765   Seventh     3 AUG 2018      Other       13 
500   Followup    3 JUL 2018      Surface     98 
428   Initial     3 JUL 2017     Outer        10 
765   Initial     20 JUL 2019     Other       19 
610   Third       20 AUG 2019     Invasive    66 
610   Initial     17 Mar 2018     Invasive    17 
;

data want (drop=rc _Measurement);
   if _N_ = 1 then do;
      declare hash h (dataset : "have (rename=(Measurement=_Measurement) where=(Type='Initial'))");
      h.definekey ('Subject');
      h.definedata ('_Measurement');
      h.definedone();
   end;

   set have;
   _Measurement=.;

   if Type ne 'Initial' then rc = h.find();
   NewMeasurement = ifn(Measurement=., ., sum (Measurement, -_Measurement));
run;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;This code currently subtracts the initial (Type column) measurement from the other measurements if two values in the “SUBJECT” column match and two values in the “PROCEDURE” column match. It produces a new column with the calculations. This is the absolute change in measurements.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I want to add to this code so that the value of the latest (date column) measurement taken is divided by the lowest measurement recorded if two values in the “SUBJECT” column match and two values in the “PROCEDURE” column match and the calculation is produced in a new column. This is also known as the percent change from nadir. Thank you in advance!&lt;/P&gt;</description>
      <pubDate>Mon, 23 Mar 2020 17:20:30 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Calculating-percent-change-based-on-matching-columns/m-p/634177#M19015</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-03-23T17:20:30Z</dc:date>
    </item>
    <item>
      <title>SAS Error: Subquery evaluated to more than one row</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SAS-Error-Subquery-evaluated-to-more-than-one-row/m-p/633416#M19005</link>
      <description>&lt;P&gt;&lt;SPAN&gt;I am getting the message "ERROR: Subquery evaluated to more than one row. I have posted the working code below. I would like to know how this error can be resolved while still having the program run as intended&lt;/SPAN&gt;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data have;
input Subject Type :$12. Date &amp;amp;:anydtdte. Procedure :$12. Measurement;
format date yymmdd10.;
datalines;

500   Initial    15 AUG 2017      Invasive     20 
500   Initial    18 SEPT 2018     Surface      35 
500   Followup   12 SEPT 2018     Invasive     54 
428   Followup    2 JUL 2019      Outer        29 
765   Seventh     3 JUL 2018      Other        13 
500   Followup    6 NOV 2018      Surface      98 
428   Initial     23 FEB 2018     Outer        10 
765   Initial     20 AUG 2019     Other        19 
610   Third       21 AUG 2018     Invasive     66 
610   Initial     27 Mar 2018     Invasive     17 
999   Dummy       17 mar 2020     Some          1
999   Dummy       18 mar 2020     Some          2
999   Dummy       19 mar 2020     Some          3
;

proc sql;
create table want as
select *,
    (select max(measurement) 
     from have 
     where subject=a.subject and type=a.type and procedure=a.procedure 
     having date = max(date)) / min(measurement) as ratio
from have as a
group by subject, type, procedure
order by subject, date;
quit;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&lt;SPAN&gt;. Thank you in advance.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 19 Mar 2020 20:16:49 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SAS-Error-Subquery-evaluated-to-more-than-one-row/m-p/633416#M19005</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-03-19T20:16:49Z</dc:date>
    </item>
    <item>
      <title>Working with a date column corresponding to values in another</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Working-with-a-date-column-corresponding-to-values-in-another/m-p/633328#M19002</link>
      <description>&lt;P&gt;Hi, I would like to add on to this code so that t&lt;SPAN&gt;he value of the last measurement according to the "date" column&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;in the "measurement" column is divided by the&amp;nbsp;lowest&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;value recorded in the measurement column and the result forms a new column. The current working code adds a column that subtracts the initial measurements from the other measurements in the "measurement" column if the "subject", "type" and "procedure" columns match.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data have;
input Subject Type Date $ 5-12 Procedure $ 15-22 Measurement;
datalines;
500   Initial    15 AUG 2017      Invasive     20 
500   Initial    18 SEPT 2018     Surface      35 
500   Followup   12 SEPT 2018   Invasive   54 
428   Followup    2 JUL 2019      Outer        29 
765   Seventh     3 JUL 2018      Other        13 
500   Followup    6 NOV 2018     Surface     98 
428   Initial     23 FEB 2018        Outer        10 
765   Initial     20 AUG 2019        Other        19 
610   Third       21 AUG 2018      Invasive    66 
610 Initial       27 Mar   2018       Invasive     17 
;
data want (drop=rc _Measurement);
   if _N_ = 1 then do;
      declare hash h (dataset : "have (rename=(Measurement=_Measurement) where=(Type='Initial'))");
      h.definekey ('Subject');
      h.definedata ('_Measurement');
      h.definedone();
   end;

   set have;
   _Measurement=.;

   if Type ne 'Initial' then rc = h.find();
   NewMeasurement = ifn(Measurement=., ., sum (Measurement, -_Measurement));
run;&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Thu, 19 Mar 2020 15:52:14 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Working-with-a-date-column-corresponding-to-values-in-another/m-p/633328#M19002</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-03-19T15:52:14Z</dc:date>
    </item>
    <item>
      <title>Add new column with updates reflected without affecting missing data</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Add-new-column-with-updates-reflected-without-affecting-missing/m-p/633088#M18998</link>
      <description>&lt;PRE class="lang-sql prettyprint prettyprinted"&gt;&lt;CODE&gt;&lt;SPAN class="pln"&gt;Hi everyone, below is my working program that I would like to update.&lt;BR /&gt;&lt;BR /&gt;data have&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;;&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;input Subject Type &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;$&lt;/SPAN&gt; &lt;SPAN class="lit"&gt;5-12&lt;/SPAN&gt; &lt;SPAN class="kwd"&gt;Procedure&lt;/SPAN&gt; &lt;SPAN class="pun"&gt;$&lt;/SPAN&gt; &lt;SPAN class="lit"&gt;15-22&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Measurement&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;;&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;datalines&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;;&lt;/SPAN&gt;
&lt;SPAN class="lit"&gt;500&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Initial   Invasive  &lt;/SPAN&gt;&lt;SPAN class="lit"&gt;20&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;500&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Initial   Surface   &lt;/SPAN&gt;&lt;SPAN class="lit"&gt;35&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;500&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Followup  Invasive  &lt;/SPAN&gt;&lt;SPAN class="lit"&gt;54&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;428&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Followup  &lt;/SPAN&gt;&lt;SPAN class="kwd"&gt;Outer&lt;/SPAN&gt;     &lt;SPAN class="lit"&gt;29&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;765&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Seventh   Other     &lt;/SPAN&gt;&lt;SPAN class="lit"&gt;13&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;500&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Followup  Surface   &lt;/SPAN&gt;&lt;SPAN class="lit"&gt;98&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;428&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Initial   &lt;/SPAN&gt;&lt;SPAN class="kwd"&gt;Outer&lt;/SPAN&gt;     &lt;SPAN class="lit"&gt;10&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;765&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Initial   Other     &lt;/SPAN&gt;&lt;SPAN class="lit"&gt;19&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;610&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Third     Invasive  &lt;/SPAN&gt;&lt;SPAN class="lit"&gt;66&lt;/SPAN&gt; 
&lt;SPAN class="lit"&gt;610&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Initial   Invasive  &lt;/SPAN&gt;&lt;SPAN class="lit"&gt;17&lt;/SPAN&gt; 
&lt;SPAN class="pun"&gt;;&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;
data want &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;(&lt;/SPAN&gt;&lt;SPAN class="kwd"&gt;drop&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;=&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;rc _Measurement&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;);&lt;/SPAN&gt;
   &lt;SPAN class="kwd"&gt;if&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; _N_ &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;=&lt;/SPAN&gt; &lt;SPAN class="lit"&gt;1&lt;/SPAN&gt; &lt;SPAN class="kwd"&gt;then&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; do&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;;&lt;/SPAN&gt;
   &lt;SPAN class="kwd"&gt;declare&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; hash h &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;(&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;dataset &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;:&lt;/SPAN&gt; &lt;SPAN class="str"&gt;"have (rename=(Measurement=_Measurement) where=(Type='Initial'))"&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;);&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;      h&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;.&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;definekey &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;(&lt;/SPAN&gt;&lt;SPAN class="str"&gt;'Subject'&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;);&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;      h&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;.&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;definedata &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;(&lt;/SPAN&gt;&lt;SPAN class="str"&gt;'_Measurement'&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;);&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;      h&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;.&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;definedone&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;();&lt;/SPAN&gt;
   &lt;SPAN class="kwd"&gt;end&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;;&lt;/SPAN&gt;

   &lt;SPAN class="kwd"&gt;set&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; have&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;;&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;   _Measurement&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;=.;&lt;/SPAN&gt;

   &lt;SPAN class="kwd"&gt;if&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; Type ne &lt;/SPAN&gt;&lt;SPAN class="str"&gt;'Initial'&lt;/SPAN&gt; &lt;SPAN class="kwd"&gt;then&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; rc &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;=&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; h&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;.&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;find&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;();&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;   Measurement &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;=&lt;/SPAN&gt;&lt;SPAN class="pln"&gt; sum &lt;/SPAN&gt;&lt;SPAN class="pun"&gt;(&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;Measurement&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;,&lt;/SPAN&gt; &lt;SPAN class="pun"&gt;-&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;_Measurement&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;);&lt;/SPAN&gt;&lt;SPAN class="pln"&gt;run&lt;/SPAN&gt;&lt;SPAN class="pun"&gt;;&lt;/SPAN&gt;&lt;/CODE&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;The goal is to subtract the numbers in the "MEASUREMENT" column based on the "SUBJECT", &lt;BR /&gt;"TYPE" and “PROCEDURE” columns. If two values in the “SUBJECT” column match and two values &lt;BR /&gt;in the “PROCEDURE” column match, then the initial measurement should be subtracted from the &lt;BR /&gt;other measurement. For example, the initial measurement in row 1 (20) should be subtracted from &lt;BR /&gt;the followup measurement in row 3 (54) because the subject (500) and procedure (Invasive) match.&lt;BR /&gt;Furthermore, the initial measurement in row 8 (19) should be subtracted from the seventh &lt;BR /&gt;measurement in row 5 (13) because the subject (765) and procedure (Other) match. &lt;BR /&gt;The result should form the "OUTPUT" column. &lt;BR /&gt;&lt;BR /&gt;How can I account for missing values in the "measurement" column and &lt;BR /&gt;keep them without being affected in the "output" column?? &lt;BR /&gt;In addition, I noticed that this program updates the "measurement" column. &lt;BR /&gt;How can I have the updates be created in a new column called "output" while keeping the "measurement" column?&lt;BR /&gt;&lt;/SPAN&gt;&lt;/PRE&gt;</description>
      <pubDate>Wed, 18 Mar 2020 21:22:12 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Add-new-column-with-updates-reflected-without-affecting-missing/m-p/633088#M18998</guid>
      <dc:creator>AshJuri</dc:creator>
      <dc:date>2020-03-18T21:22:12Z</dc:date>
    </item>
    <item>
      <title>Connecting to two databases on same SQL server via UNIX SAS</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Connecting-to-two-databases-on-same-SQL-server-via-UNIX-SAS/m-p/633027#M18996</link>
      <description>&lt;P&gt;Is it possible to define one DSN in the odbc.ini file on SAS UNIX and use it for multiple databases on that host?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We have two databases that are on the same host. One of the databases was previously defined in the odbc.ini file on our SAS UNIX server and was working fine, but now there is a second database on the same server and someone wants to connect to it via SAS. I was hoping that we could simply change the DSN definition to change the DATABASE=abc to DATABASE=&amp;nbsp; &amp;nbsp;and then use the same DSN in LIBNAME statements for both databases by adding DATABASE=whatever to the libname statement, rather than having to define two different DSNs for the same host.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;FONT color="#0000FF"&gt;Works:&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;The DSN in the odbc.ini file is IP and the following LIBNAME statement &lt;FONT color="#000000"&gt;works&lt;/FONT&gt; if we have DATABASE=IP in the DSN definition in the odbc.ini file:&amp;nbsp;&amp;nbsp;&lt;FONT color="#0000FF"&gt;LIBNAME IP SQLSVR&amp;nbsp; &lt;EM&gt;Datasrc=IP&lt;/EM&gt; USER=IPME&amp;nbsp; PASSWORD=XXXX ;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;FONT color="#FF0000"&gt;Fails:&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;If we change to DATABASE=&amp;nbsp; in the DSN definition and try to connect to the two databases by adding DATABASE=[db] in the LIBNAME statement, both LIBNAME statements fail. These are the &lt;FONT color="#000000"&gt;errors &lt;/FONT&gt;we get - different error for each LIBNAME statement:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;LIBNAME IP SQLSVR&amp;nbsp; &lt;EM&gt;Datasrc=IP&lt;/EM&gt; &lt;EM&gt;Database=IP&lt;/EM&gt; USER=IPME&amp;nbsp;PASSWORD=XXXX ;&lt;/P&gt;
&lt;P&gt;&lt;FONT color="#FF0000"&gt;ERROR: CLI error trying to establish connection: [SAS][ODBC 20101 driver][20101]Login failed for user 'IPME'.&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;LIBNAME OCM SQLSVR&amp;nbsp; &lt;EM&gt;Datasrc=IP&lt;/EM&gt; &lt;EM&gt;Database=OCM&lt;/EM&gt; SCHEMA=dbo USER=ocmme&amp;nbsp; PASSWORD=XXXX;&lt;/P&gt;
&lt;P&gt;&lt;FONT color="#FF0000"&gt;ERROR: CLI error trying to establish connection: [DataDirect][ODBC lib] Driver Manager Message file not found. Please check for the&amp;nbsp;value of InstallDir in your odbc.ini.&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;It seems like it should be possible to not have to create a new DSN entry for every database if they are on the same host, but perhaps not, since I haven't been able to find much helpful online about this and our testing thus far has failed. I'm hoping someone has some sparkling insight into making this work with only one DSN entry. If not, or it's just not possible on UNIX, then we will simply create a second DSN entry for the second database.&lt;/P&gt;</description>
      <pubDate>Wed, 18 Mar 2020 17:21:11 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Connecting-to-two-databases-on-same-SQL-server-via-UNIX-SAS/m-p/633027#M18996</guid>
      <dc:creator>TBarker</dc:creator>
      <dc:date>2020-03-18T17:21:11Z</dc:date>
    </item>
    <item>
      <title>Vendor reconciliation</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Vendor-reconciliation/m-p/632327#M18973</link>
      <description>Can anyone help me with vendor reconciliation listings in clinical trail data management . With code especially.</description>
      <pubDate>Mon, 16 Mar 2020 04:59:15 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Vendor-reconciliation/m-p/632327#M18973</guid>
      <dc:creator>Samyuktha1</dc:creator>
      <dc:date>2020-03-16T04:59:15Z</dc:date>
    </item>
    <item>
      <title>Need help in crrecting  a job</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Need-help-in-crrecting-a-job/m-p/632206#M18967</link>
      <description>&lt;P&gt;Hi ,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am bit new to sas DI Studio.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In our company , we develop jobs in sas DI and promote to MainFrame.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have to change a job in sas DI , When I compared the SAS DI&amp;nbsp;code and Mainframes code ,it is not matching at all.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Seems like somebody has messed up the job in DI(Mainframe code is not messed up)&amp;nbsp;.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As the job is not reliable, I created a new job.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;When I compared the new job developed in sas DI(leaving the change)&amp;nbsp;and the existing job ( in Mainframes)&amp;nbsp;they both are not matching at all.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Need help in understanding the changes between the 2 codes.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Psrskwm3.sas is the ---Mainframe code&lt;/P&gt;&lt;P&gt;the later is the new sas DI job(in the new job the only change should be&lt;/P&gt;&lt;P&gt;&lt;FONT color="#0000ff" face="Courier New" size="2"&gt;LIBNAME&lt;/FONT&gt; &lt;FONT color="#0000ff" face="Courier New" size="2"&gt;access&lt;/FONT&gt; &lt;FONT color="#0000ff" face="Courier New" size="2"&gt;V8&lt;/FONT&gt; &lt;FONT color="#800080" face="Courier New" size="2"&gt;"dwgip.dwh.views.full"&lt;/FONT&gt; &lt;FONT color="#0000ff" face="Courier New" size="2"&gt;disp&lt;/FONT&gt;&lt;FONT face="Courier New" size="2"&gt;=shr ;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT face="Courier New" size="2"&gt;where as in Mainframe it shoul be &lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#0000ff" face="Courier New" size="2"&gt;LIBNAME&lt;/FONT&gt; &lt;FONT color="#0000ff" face="Courier New" size="2"&gt;access&lt;/FONT&gt; &lt;FONT color="#0000ff" face="Courier New" size="2"&gt;V8&lt;/FONT&gt; &lt;FONT color="#800080" face="Courier New" size="2"&gt;"dwgip.dwh.views"&lt;/FONT&gt; &lt;FONT color="#0000ff" face="Courier New" size="2"&gt;disp&lt;/FONT&gt;&lt;FONT face="Courier New" size="2"&gt;=shr ;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 15 Mar 2020 00:28:42 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Need-help-in-crrecting-a-job/m-p/632206#M18967</guid>
      <dc:creator>swathiprasad</dc:creator>
      <dc:date>2020-03-15T00:28:42Z</dc:date>
    </item>
    <item>
      <title>Conditional Logic in Data Integration Studio</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Conditional-Logic-in-Data-Integration-Studio/m-p/632069#M18960</link>
      <description>&lt;P&gt;Is there a way to conditionally run a transformation in Data Integration Studio? I am working on automating overnight edit checks. However, I do not want to spam recipients if there are no edit records in a table. I am using the Publish to Email transformation to send data in a table. I see some objects in Data Integration Studio that look like there for conditional logic but I do not know how they work. I am using Data Integration Studio Version=4.902.&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2020 20:46:31 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Conditional-Logic-in-Data-Integration-Studio/m-p/632069#M18960</guid>
      <dc:creator>DavidPhillips2</dc:creator>
      <dc:date>2020-03-13T20:46:31Z</dc:date>
    </item>
    <item>
      <title>Indicator variable that tracks the year before a variables changes and the year it changes</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Indicator-variable-that-tracks-the-year-before-a-variables/m-p/631994#M18958</link>
      <description>&lt;P&gt;Trying to create an indicator variable that tracks the year before a different variable changes and the year it changes by creating a dummy = 1. See example.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE border="0" cellspacing="0" cellpadding="0"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;1999&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2000&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2001&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2002&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2003&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2004&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2005&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2006&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Want&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE border="0" cellspacing="0" cellpadding="0"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;1999&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2000&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2001&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2002&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2003&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2004&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2005&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;0&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2006&lt;/TD&gt;&lt;TD&gt;222&lt;/TD&gt;&lt;TD&gt;ABC CO&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Your help would be very much appreciated!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2020 17:45:41 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Indicator-variable-that-tracks-the-year-before-a-variables/m-p/631994#M18958</guid>
      <dc:creator>r4321</dc:creator>
      <dc:date>2020-03-13T17:45:41Z</dc:date>
    </item>
    <item>
      <title>Count months from date to the end of the year</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Count-months-from-date-to-the-end-of-the-year/m-p/631937#M18954</link>
      <description>&lt;P&gt;Hi, I'd like to see a number of months left from the date to the end of the year.&lt;/P&gt;&lt;P&gt;I know to count month I can use intck('month', 'start', 'end', 'D') but how to get the "end" date when it's not given. I dont know how to pass the 'end' argument. Only what I got is the start date.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would like to see:&lt;/P&gt;&lt;P&gt;date&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;months&lt;/P&gt;&lt;P&gt;01JUN2019&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 7&lt;/P&gt;&lt;P&gt;01APR2020&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 9&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Have a good day &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2020 15:52:22 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Count-months-from-date-to-the-end-of-the-year/m-p/631937#M18954</guid>
      <dc:creator>PatrykSAS</dc:creator>
      <dc:date>2020-03-13T15:52:22Z</dc:date>
    </item>
    <item>
      <title>SQL Passthrough DB2 - WITH clause problem</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/SQL-Passthrough-DB2-WITH-clause-problem/m-p/631678#M18943</link>
      <description>&lt;P&gt;I cannot get this query to run.&amp;nbsp; I've tried a lot of variations on this query - mostly trying either the WITH clause or an inline query.&amp;nbsp; I can use the SQL passthrough to DB2 with simpler queries, so I've left it out of the examples.&amp;nbsp; I've googled the error message, DB2 "WITH Clause", "common table expression", "CTE", and various combinations, and none of the dozens of results had anything that helps me resolve the issue.&amp;nbsp; I've also tried the inner query alone - that works.&amp;nbsp; I've tried these variants in another query tool, and have gotten the same results.&amp;nbsp; I know that means it's not a SAS problem, but I'm hoping someone can help, or point me to any useful documentation.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;WITH clause VERSION:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp; WITH inner_query AS&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; (SELECT&lt;FONT&gt; oc.clm_cd, p.prod_ser_no, p.bld_date, p.vin,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; p.in_srvc_date, p.in_srvc_trk_mlg,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; year(p.in_srvc_date) as in_service_year,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; case when p.prod_type_cd = 'ENGINE'&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; then p.prod_ser_no else '' end as eng_serial_number,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; case when p.prod_type_cd = 'ENGINE'&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; then p.prod_mdl_cd else '' end as eng_model,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; p.prod_mdl_cd, p.voc_cd,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; p.prod_make_cd, p.prod_mdl_cd as fllc_internal_model,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; p.prod_fam_cd, p.chass_mdl_cd, oc.flt_cd,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; oc.flt_name, oc.fail_date, oc.trk_mlg_amt,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; case when cfa.new_maj_comp_ser is not null&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; then oc.trk_mlg_amt else 0 end as eng_repl_mileage,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; case when cfa.new_maj_comp_ser is not null&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; then oc.fail_date else '12-31-2999' end as eng_repl_date&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; FROM wty_ddc.claims_fact_ddc_v oc&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; INNER JOIN wty_ddc.product_dim_v p&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; ON oc.prod_id = p.prod_id&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; INNER JOIN wty_ddc.claims_fact_attr_ddc_v cfa&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; ON cfa.clm_id = oc.clm_id&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; WHERE trk_mlg_amt is not null and trk_mlg_amt &amp;gt; 0&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; and prod_type_cd in ('ENGINE','CHASSIS') &lt;/FONT&gt;)&lt;/P&gt;&lt;P&gt;&lt;FONT&gt;&amp;nbsp;&amp;nbsp; SELECT&amp;nbsp; clm_cd, prod_ser_no, bld_date, vin,&amp;nbsp; /*POST EDITED TO REMOVE TABLE ALIAS Q. from clm_cd */&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; in_srvc_date, in_srvc_trk_mlg,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; in_service_year,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; prod_mdl_cd, voc_cd,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; prod_make_cd, fllc_internal_model,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; prod_fam_cd, chass_mdl_cd, flt_cd,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; flt_name, fail_date, trk_mlg_amt,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; eng_repl_mileage,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; eng_repl_date,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; max(eng_serial_number) OVER PARTITION BY (vin) as eng_serial_number,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; max(eng_model) OVER PARTITION BY (vin) as eng_model&lt;BR /&gt;&amp;nbsp; &amp;nbsp;&amp;nbsp; FROM inner_query&lt;BR /&gt;&amp;nbsp; &amp;nbsp;&amp;nbsp; ORDER BY vin, fail_date desc;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT&gt;ERROR: CLI describe error: [IBM][CLI Driver][DB2/AIX64] SQL0104N&amp;nbsp; An unexpected token "clm_cd, prod_ser_no,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; bld_date, vin, in_sr" was found following "il_date desc) select".&amp;nbsp; Expected tokens may include:&amp;nbsp;&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; "&amp;lt;space&amp;gt;".&amp;nbsp; SQLSTATE=42601&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;(My interpretation is that it objects to the outer query.&amp;nbsp; If I remove the columns list with "select * from inner_query" leaving the order by clause, THIS DOES WORK.&amp;nbsp; But there's no need for the WITH clause in that case - it's superfluous.&amp;nbsp; Anyway, part of the work I want to do here requires that PARTITION OVER feature.)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;INLINE QUERY VERSION:&lt;/P&gt;&lt;P&gt;&lt;FONT&gt;&amp;nbsp; &amp;nbsp; SELECT clm_cd, prod_ser_no, bld_date, vin, &amp;nbsp;&amp;nbsp;&lt;SPAN style="display: inline !important; float: none; background-color: #ffffff; color: #333333; font-family: 'HelevticaNeue-light','Helvetica Neue',Helvetica,Arial,sans-serif; font-size: 14px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; line-height: normal; orphans: 2; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; -webkit-text-stroke-width: 0px; white-space: normal; word-spacing: 0px;"&gt;/*POST EDITED TO REMOVE TABLE ALIAS Q. from clm_cd */&lt;/SPAN&gt;&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; in_srvc_date, in_srvc_trk_mlg,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; in_service_year,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; prod_mdl_cd, voc_cd,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; prod_make_cd, fllc_internal_model,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; prod_fam_cd, chass_mdl_cd, flt_cd,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; flt_name, fail_date, trk_mlg_amt,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; eng_repl_mileage,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; eng_repl_date,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; max(eng_serial_number) OVER PARTITION BY (vin) as eng_serial_number,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; max(eng_model) OVER PARTITION BY (vin) as eng_model&lt;BR /&gt;&amp;nbsp; &amp;nbsp;&amp;nbsp; FROM (SELECT oc.clm_cd, p.prod_ser_no, p.bld_date, p.vin,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; p.in_srvc_date, p.in_srvc_trk_mlg,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; year(p.in_srvc_date) as in_service_year,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; case when p.prod_type_cd = 'ENGINE'&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; then p.prod_ser_no else '' end as eng_serial_number,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&lt;FONT&gt; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; &lt;/FONT&gt;case when p.prod_type_cd = 'ENGINE'&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&lt;FONT&gt; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp;&lt;/FONT&gt; then p.prod_mdl_cd else '' end as eng_model,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&lt;FONT&gt; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; &lt;/FONT&gt;p.prod_mdl_cd, p.voc_cd,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&lt;FONT&gt; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; &lt;/FONT&gt;p.prod_make_cd, p.prod_mdl_cd as fllc_internal_model,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; p.prod_fam_cd, p.chass_mdl_cd, oc.flt_cd,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; oc.flt_name, oc.fail_date, oc.trk_mlg_amt,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; case when cfa.new_maj_comp_ser is not null&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; then oc.trk_mlg_amt else 0 end as eng_repl_mileage,&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; case when cfa.new_maj_comp_ser is not null&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; then oc.fail_date else '12-31-2999' end as eng_repl_date&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; FROM wty_ddc.claims_fact_ddc_v oc&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; INNER JOIN wty_ddc.product_dim_v p&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; ON oc.prod_id = p.prod_id&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; INNER JOIN wty_ddc.claims_fact_attr_ddc_v cfa&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; ON cfa.clm_id = oc.clm_id&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; WHERE trk_mlg_amt is not null and trk_mlg_amt &amp;gt; 0&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; and prod_type_cd in ('ENGINE','CHASSIS')&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp; ORDER BY vin, fail_date desc)&amp;nbsp;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT&gt;ERROR: CLI describe error: [IBM][CLI Driver][DB2/AIX64] SQL0104N&amp;nbsp; An unexpected token "q.clm_cd, prod_ser_no,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; bld_date, vin, in_srvc_date, in" was found following "SELECT ".&amp;nbsp; Expected tokens may include:&amp;nbsp;&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; "&amp;lt;space&amp;gt;".&amp;nbsp; SQLSTATE=42601&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;(Same interpretation here - I think the query would work if the outer select statement was just "select * from....", but that's useless.)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have tried the same thing leaving out the two column specs using PARTITION BY, in case that was ultimately the source of the syntax problem.&amp;nbsp; However, this error code ultimately means the same thing as the earlier one:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT&gt;ERROR: CLI describe error: [IBM][CLI Driver][DB2/AIX64] SQL0206N&amp;nbsp; "Q.CLM_CD" is not valid in the context&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; where it is used.&amp;nbsp; SQLSTATE=42703&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT&gt;&amp;gt;&amp;gt;&amp;gt;&amp;nbsp; I think I could get the work done with several PROC SQL steps creating local temporary tables, but that will make this run much, much slower.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT&gt;TIA!!&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT&gt;Steve&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 12 Mar 2020 19:45:48 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/SQL-Passthrough-DB2-WITH-clause-problem/m-p/631678#M18943</guid>
      <dc:creator>saraimi</dc:creator>
      <dc:date>2020-03-12T19:45:48Z</dc:date>
    </item>
    <item>
      <title>ODBC Connection Materialized vs Table in Library</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/ODBC-Connection-Materialized-vs-Table-in-Library/m-p/631532#M18938</link>
      <description>&lt;P&gt;I am connecting to an Oracle DB via tan ODBC driver in SAS.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;LIBNAME tables
ODBC
USER=USER
PASSWORD=*****
DSN="Name"
SCHEMA=SCEMA;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;The Oracle database has some tables replicated as a materialized view to speed up processing for some users. When I connect with SAS does the library show a copy of the table, or the materialized view of the table? I would prefer not to use the materialized version as it updates less frequently than I need it.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 12 Mar 2020 12:27:58 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/ODBC-Connection-Materialized-vs-Table-in-Library/m-p/631532#M18938</guid>
      <dc:creator>MB_Analyst</dc:creator>
      <dc:date>2020-03-12T12:27:58Z</dc:date>
    </item>
    <item>
      <title>Set to missing after a specific value</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Set-to-missing-after-a-specific-value/m-p/631419#M18927</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I want to create a dataset that looks like data "want" below, for id 1-100. This is for an exercise so the values don't matter, but the data structure needs to follow these rules:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;6 rows (time 0-5) for every id.&lt;/LI&gt;&lt;LI&gt;For each id, once y=1, then next rows will be missing.&amp;nbsp;&lt;/LI&gt;&lt;LI&gt;For each id, y=0 for rows before y=1.&amp;nbsp;&lt;/LI&gt;&lt;LI&gt;20% of the ids have y=1.&amp;nbsp;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;I have tried a few approaches, but I am running into issues. For example, this attempt goes from wide to long. Within each id, how do I set values after 1 to missing?&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks in advance!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data sim1; 
	do id=1 to 100;
	year0=0; 
	year1=1;
	year2=2;
	year3=3;
	year4=4;
	year5=5;
	output; end;
run;

proc transpose data=sim1 out=sim2; 
	by id;  
run;

data sim3; 
	set sim2 (rename=(COL1=year)); 
		y=rand("BINOMIAL", 0.2, 1);
	drop _name_;
run; &lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data want; 
	input id y time; 
	datalines; 
		1 1 0
		1 . 1
		1 . 2
		1 . 3
		1 . 4
		1 . 5
		2 0 0
		2 0 1
		2 0 2
		2 1 3
		2 . 4
		2 . 5
		;
run;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 11 Mar 2020 23:54:10 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Set-to-missing-after-a-specific-value/m-p/631419#M18927</guid>
      <dc:creator>silango</dc:creator>
      <dc:date>2020-03-11T23:54:10Z</dc:date>
    </item>
    <item>
      <title>How to flag observations if all values of a variable within a by group are the same</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-flag-observations-if-all-values-of-a-variable-within-a-by/m-p/631383#M18921</link>
      <description>&lt;P&gt;Hi there,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;I have a dataset that looks like this&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;ID&amp;nbsp;&amp;nbsp;&amp;nbsp; varA&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/P&gt;&lt;P&gt;1&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&lt;/P&gt;&lt;P&gt;1&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&lt;/P&gt;&lt;P&gt;2&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&lt;/P&gt;&lt;P&gt;2&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&lt;/P&gt;&lt;P&gt;3&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&lt;/P&gt;&lt;P&gt;3&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I want it to look like this (i.e. flag=1 if all values of varA are the same within ID by group):&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;ID&amp;nbsp;&amp;nbsp;&amp;nbsp; varA&amp;nbsp;&amp;nbsp; flag&lt;/P&gt;&lt;P&gt;1&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&lt;/P&gt;&lt;P&gt;1&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&lt;/P&gt;&lt;P&gt;2&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&lt;/P&gt;&lt;P&gt;2&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&lt;/P&gt;&lt;P&gt;3&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&lt;/P&gt;&lt;P&gt;3&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 1&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;How would I do this? Thanks&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 11 Mar 2020 21:25:21 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-flag-observations-if-all-values-of-a-variable-within-a-by/m-p/631383#M18921</guid>
      <dc:creator>KPCklebspn</dc:creator>
      <dc:date>2020-03-11T21:25:21Z</dc:date>
    </item>
    <item>
      <title>Data Process runs in different times</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Data-Process-runs-in-different-times/m-p/631210#M18919</link>
      <description>&lt;P&gt;Hi, Please could somebody help me... I need to do a process job (in Data Flux) with three data jobs linked, so that when I start the Process job (manually), the 1 and 2 data jobs run and the third data job wait 10 minutes to start since the process starts... is possible?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="example_time_DF.png" style="width: 600px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/36756iDB2689A101C65D58/image-size/large?v=1.0&amp;amp;px=999" title="example_time_DF.png" alt="example_time_DF.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 11 Mar 2020 14:09:01 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Data-Process-runs-in-different-times/m-p/631210#M18919</guid>
      <dc:creator>MBVilela</dc:creator>
      <dc:date>2020-03-11T14:09:01Z</dc:date>
    </item>
    <item>
      <title>How to sum up the valid responses from some variables ignore the missing values</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-sum-up-the-valid-responses-from-some-variables-ignore-the/m-p/631060#M18907</link>
      <description>&lt;P&gt;Hi SAS Pros,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am have a data like this:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Have:&lt;/P&gt;
&lt;TABLE style="border-collapse: collapse; width: 288pt;" border="0" width="384" cellspacing="0" cellpadding="0"&gt;
&lt;TBODY&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="64" height="20" style="height: 15.0pt; width: 48pt;"&gt;Var1&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var2&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var3&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var4&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var5&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var6&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD height="20" align="right" style="height: 15.0pt;"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;0&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD height="20" align="right" style="height: 15.0pt;"&gt;0&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD height="20" align="right" style="height: 15.0pt;"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;0&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD height="20" align="right" style="height: 15.0pt;"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;4&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I want to&amp;nbsp;sum up the valid responses from these variables and ignore the missing values.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Want:&lt;/P&gt;
&lt;TABLE style="border-collapse: collapse; width: 336pt;" border="0" width="448" cellspacing="0" cellpadding="0"&gt;
&lt;TBODY&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD width="64" height="20" style="height: 15.0pt; width: 48pt;"&gt;Var1&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var2&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var3&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var4&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var5&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Var6&lt;/TD&gt;
&lt;TD width="64" style="width: 48pt;"&gt;Total&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD height="20" align="right" style="height: 15.0pt;"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;0&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD align="right"&gt;3&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD height="20" align="right" style="height: 15.0pt;"&gt;0&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;4&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD height="20" align="right" style="height: 15.0pt;"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;0&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD&gt;.&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;6&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 15.0pt;"&gt;
&lt;TD height="20" align="right" style="height: 15.0pt;"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;1&lt;/TD&gt;
&lt;TD align="right"&gt;2&lt;/TD&gt;
&lt;TD align="right"&gt;4&lt;/TD&gt;
&lt;TD align="right"&gt;11&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you in advance for any help!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;C&lt;/P&gt;</description>
      <pubDate>Tue, 10 Mar 2020 18:46:46 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-sum-up-the-valid-responses-from-some-variables-ignore-the/m-p/631060#M18907</guid>
      <dc:creator>CynthiaWei</dc:creator>
      <dc:date>2020-03-10T18:46:46Z</dc:date>
    </item>
    <item>
      <title>How to read SAS DIS Jobs Metadata</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-read-SAS-DIS-Jobs-Metadata/m-p/630787#M18903</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have several DI jobs each taking in different source tables.&lt;/P&gt;&lt;P&gt;e.g:&lt;/P&gt;&lt;P&gt;1) job 1 - table 1, table 2&lt;/P&gt;&lt;P&gt;2) job 2 - table 1, table 3, table 4&lt;/P&gt;&lt;P&gt;3) job 3 - table 4, table 5&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is there anyway i can extract these information into a SAS dataset in the following format?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;e.g.:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;job_names source_tables&lt;BR /&gt;job 1&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;table 1&lt;BR /&gt;job 1&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;table 2&lt;BR /&gt;job 2 &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; table 1&lt;BR /&gt;job 2 &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; table 3&lt;BR /&gt;job 2 &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; table 4&lt;BR /&gt;job 3&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;table 4&lt;BR /&gt;job 3&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;table 5&lt;/P&gt;</description>
      <pubDate>Tue, 10 Mar 2020 01:54:22 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-read-SAS-DIS-Jobs-Metadata/m-p/630787#M18903</guid>
      <dc:creator>asd3</dc:creator>
      <dc:date>2020-03-10T01:54:22Z</dc:date>
    </item>
    <item>
      <title>datepart() returns zero</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/datepart-returns-zero/m-p/630737#M18898</link>
      <description>&lt;P&gt;I'm working with a datetime variable, but I want to reduce it to at least a day/month/year variable, or preferably monthyear, so that I can then select only distinct cases of the reduced format.&amp;nbsp; Basically, I want to recode the dates into season/year variables for many observations.&amp;nbsp; However, the datepart() function doesn't seem to be working for me.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The code I'm using is this, where raisedate is the original date variable:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;data want; data have;&lt;/P&gt;&lt;P&gt;datetime = raisedate;&lt;BR /&gt;format datetime dateampm.;&lt;BR /&gt;datejulian = raisedate;&lt;BR /&gt;format datejulian julian.;&lt;BR /&gt;date = datepart(raisedate);&lt;BR /&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Produces this:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;id raisedate datetime datejulian datepart&lt;/P&gt;&lt;P&gt;1 21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;2 27APR2017 01JAN60:05:48:56 AM 17117 0&lt;BR /&gt;3 21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;4 21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;5 21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;6 21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;7 21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;8 21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;9 12MAR2019 01JAN60:06:00:20 AM 19071 0&lt;BR /&gt;10 26APR2019 01JAN60:06:01:05 AM 19116 0&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So, for some reason I can reformat the date time, but the datepart() function isn't working.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Perhaps related to this issue, is that if I select distinct on the raisedate variable (i.e. without an id variable), I still get multiple indistinct entries, such as from above, even with the various formats broken out:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;BR /&gt;21AUG2014 01JAN60:05:32:36 AM 14233 0&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Clearly, they are not obviously distinct depending on format.&amp;nbsp; I've ran into issues with lack of rounding hidden from user view before in SAS, but I'm not sure if that's the issue here, or how I would "round" a date to resolve it.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any recommendations for what to do?&amp;nbsp; Many thanks!&lt;/P&gt;</description>
      <pubDate>Mon, 09 Mar 2020 20:26:33 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/datepart-returns-zero/m-p/630737#M18898</guid>
      <dc:creator>tellmeaboutityo</dc:creator>
      <dc:date>2020-03-09T20:26:33Z</dc:date>
    </item>
    <item>
      <title>Remove duplicates with higher values within a variable</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Remove-duplicates-with-higher-values-within-a-variable/m-p/630625#M18890</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I want to remove duplicates within same by group which have higher value of time difference variable.&lt;/P&gt;&lt;P&gt;The current data:&lt;/P&gt;&lt;P&gt;Obs ID&amp;nbsp; &amp;nbsp; date1&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;time1&amp;nbsp; &amp;nbsp;date2&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;time2&amp;nbsp; &amp;nbsp; &amp;nbsp;TimeDifference&lt;/P&gt;&lt;P&gt;1&amp;nbsp; &amp;nbsp; &amp;nbsp; 1&amp;nbsp; &amp;nbsp; 03/08/20&amp;nbsp; &amp;nbsp; &amp;nbsp;11:00&amp;nbsp; &amp;nbsp;03/08/20&amp;nbsp; &amp;nbsp;10:55&amp;nbsp; &amp;nbsp; &amp;nbsp; 5&lt;/P&gt;&lt;P&gt;2&amp;nbsp; &amp;nbsp; &amp;nbsp; 1&amp;nbsp;&amp;nbsp;&amp;nbsp; 03/08/20&amp;nbsp; &amp;nbsp; &amp;nbsp;11:00&amp;nbsp; &amp;nbsp;03/08/20&amp;nbsp; &amp;nbsp;10:15&amp;nbsp; &amp;nbsp; &amp;nbsp;45&lt;/P&gt;&lt;P&gt;3&amp;nbsp; &amp;nbsp; &amp;nbsp; 1&amp;nbsp;&amp;nbsp;&amp;nbsp; 03/08/20&amp;nbsp; &amp;nbsp; &amp;nbsp;11:00&amp;nbsp; &amp;nbsp;03/08/20&amp;nbsp; &amp;nbsp;9:30&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;90&lt;/P&gt;&lt;P&gt;i want to delete last two observations. I want to do this for multiple IDs with multiple such rows.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks.&lt;/P&gt;</description>
      <pubDate>Mon, 09 Mar 2020 13:31:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Remove-duplicates-with-higher-values-within-a-variable/m-p/630625#M18890</guid>
      <dc:creator>kp19</dc:creator>
      <dc:date>2020-03-09T13:31:05Z</dc:date>
    </item>
    <item>
      <title>How to print out the observations from the set variables</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-print-out-the-observations-from-the-set-variables/m-p/630509#M18884</link>
      <description>&lt;P&gt;Hi SAS Pros,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have a data with a set of variables: drug1, drug2, drug3, ... until drug80. I am listing some of them right now.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Have:&lt;/P&gt;
&lt;TABLE style="border-collapse: collapse; width: 408pt;" border="0" width="544" cellspacing="0" cellpadding="0"&gt;
&lt;TBODY&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD width="68" height="19" style="height: 14.25pt; width: 51pt;"&gt;drug1&lt;/TD&gt;
&lt;TD width="68" style="width: 51pt;"&gt;drug2&lt;/TD&gt;
&lt;TD width="68" style="width: 51pt;"&gt;drug3&lt;/TD&gt;
&lt;TD width="68" style="width: 51pt;"&gt;drug4&lt;/TD&gt;
&lt;TD width="68" style="width: 51pt;"&gt;drug5&lt;/TD&gt;
&lt;TD width="68" style="width: 51pt;"&gt;drug6&lt;/TD&gt;
&lt;TD width="68" style="width: 51pt;"&gt;drug7&lt;/TD&gt;
&lt;TD width="68" style="width: 51pt;"&gt;drug8&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;a&lt;/TD&gt;
&lt;TD&gt;c&lt;/TD&gt;
&lt;TD&gt;e&lt;/TD&gt;
&lt;TD&gt;i&lt;/TD&gt;
&lt;TD&gt;b&lt;/TD&gt;
&lt;TD&gt;d&lt;/TD&gt;
&lt;TD&gt;h&lt;/TD&gt;
&lt;TD&gt;n&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;b&lt;/TD&gt;
&lt;TD&gt;c&lt;/TD&gt;
&lt;TD&gt;e&lt;/TD&gt;
&lt;TD&gt;g&lt;/TD&gt;
&lt;TD&gt;h&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;a&lt;/TD&gt;
&lt;TD&gt;b&lt;/TD&gt;
&lt;TD&gt;d&lt;/TD&gt;
&lt;TD&gt;h&lt;/TD&gt;
&lt;TD&gt;j&lt;/TD&gt;
&lt;TD&gt;k&lt;/TD&gt;
&lt;TD&gt;l&lt;/TD&gt;
&lt;TD&gt;o&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;b&lt;/TD&gt;
&lt;TD&gt;d&lt;/TD&gt;
&lt;TD&gt;h&lt;/TD&gt;
&lt;TD&gt;j&lt;/TD&gt;
&lt;TD&gt;k&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;h&lt;/TD&gt;
&lt;TD&gt;i&lt;/TD&gt;
&lt;TD&gt;j&lt;/TD&gt;
&lt;TD&gt;k&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;u&lt;/TD&gt;
&lt;TD&gt;v&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;x&lt;/TD&gt;
&lt;TD&gt;w&lt;/TD&gt;
&lt;TD&gt;y&lt;/TD&gt;
&lt;TD&gt;z&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;TD&gt;&amp;nbsp;&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Want: I want to print out all the drugs that in the above, like a drug list.&lt;/P&gt;
&lt;TABLE style="border-collapse: collapse; width: 51pt;" border="0" width="68" cellspacing="0" cellpadding="0"&gt;
&lt;TBODY&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD width="68" height="19" style="height: 14.25pt; width: 51pt;"&gt;drug_list&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;a&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;b&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;c&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;d&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;e&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;g&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;h&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;i&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;j&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;k&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;l&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;n&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;o&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;u&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;v&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;w&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;x&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;y&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR style="height: 14.25pt;"&gt;
&lt;TD height="19" style="height: 14.25pt;"&gt;z&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you in advance for any help!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;C&lt;/P&gt;</description>
      <pubDate>Sun, 08 Mar 2020 19:04:18 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-print-out-the-observations-from-the-set-variables/m-p/630509#M18884</guid>
      <dc:creator>CynthiaWei</dc:creator>
      <dc:date>2020-03-08T19:04:18Z</dc:date>
    </item>
    <item>
      <title>Find Observations where string starts with specific Characters</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Find-Observations-where-string-starts-with-specific-Characters/m-p/630367#M18880</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;I want to get all the observations where first name starts with Ro, Ay, Su OR Last name starts with Che, Ro.&lt;/P&gt;
&lt;P&gt;I know it's possible to code with Where, IF etc, but can someone help with the coding with Perl, please.&amp;nbsp; Thanks.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data have;
infile datalines;
input id First_name$8. Last_name&amp;amp;$8.;
datalines;
101 Roy     Rose
102 Yao 	Chen
103 Sushan  Hash
104 Robin   Blue
105 Susan   Robert
106 Susan   Jane
107 Ayesha  Hasan
;
run;&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Sat, 07 Mar 2020 15:37:13 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Find-Observations-where-string-starts-with-specific-Characters/m-p/630367#M18880</guid>
      <dc:creator>mlogan</dc:creator>
      <dc:date>2020-03-07T15:37:13Z</dc:date>
    </item>
    <item>
      <title>How to use ENC_keys in amazon redshift bulk loading in SAS 9.4 M6</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-ENC-keys-in-amazon-redshift-bulk-loading-in-SAS-9-4/m-p/629860#M18877</link>
      <description>&lt;P&gt;HI Team,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;we are trying to encrypt our amazon s3 bucket&amp;nbsp; ACCESS ID &amp;amp; SECRET Keys. so i had followed below process but still getting no access key ID found error.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Steps followed:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;1. created&amp;nbsp;&lt;SPAN class="xisDoc-keyword"&gt;ENCKEY&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN class="xisDoc-choice"&gt;&lt;A title="Description of syntax: “ADD”" href="https://documentation.sas.com/?docsetId=proc&amp;amp;docsetTarget=p1jazfgokrxoy5n1w0occsip58kp.htm&amp;amp;docsetVersion=9.4&amp;amp;locale=en#n0aixhet1qn24un1jlfaczvavga8"&gt;ADD&lt;/A&gt;&amp;nbsp;statment (Xname)&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN class="xisDoc-choice"&gt;2. Now trying to use the same name but its not recognizing&amp;nbsp; &amp;nbsp;ENCKEY name and giving ACCESS KEY ID not&amp;nbsp; found error.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;ERROR: Message from TKS3: No access key ID specified.&lt;/P&gt;&lt;P&gt;NOTE: The DATA step has been abnormally terminated.&lt;/P&gt;</description>
      <pubDate>Thu, 05 Mar 2020 16:17:02 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-ENC-keys-in-amazon-redshift-bulk-loading-in-SAS-9-4/m-p/629860#M18877</guid>
      <dc:creator>ganesh56271</dc:creator>
      <dc:date>2020-03-05T16:17:02Z</dc:date>
    </item>
    <item>
      <title>List Hadoop Tables *inside* SAS</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/List-Hadoop-Tables-inside-SAS/m-p/629959#M18864</link>
      <description>&lt;P&gt;I'm running SAS 9.4 M6 on a 64 bit Windows Server 2016 machine.&amp;nbsp; I have both Enterprise Guide (8.1) and Display Manager available to me, but I typically use EG.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I'm connecting to a Hadoop database.&amp;nbsp; I can create a Libname, or I can access the database via PROC SQL and explicit pass through.&amp;nbsp; Both methods are using ODBC.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;ISSUE:&lt;/STRONG&gt;&amp;nbsp; I can't get a list of Hadoop tables into my SAS session.&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;If I know the table name in advance, I can successfully run a query, but it's darned inconvenient not to be able to get a list of tables.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In past shops, I've been able to see a list of the Hadoop tables after assigning a Libname in the SAS EG "Servers" panel.&amp;nbsp; In this case, I can see the Libname, but when I click on the drop down, there are no tables.&amp;nbsp; Display Manager behaves similarly.&amp;nbsp; See SAS code for Libname below.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I tried running a Proc SQL against&amp;nbsp;DICTIONARY.TABLES, but I got no rows in return.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I also tried using PROC DATASETS.&amp;nbsp; In my log, I get:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;29         PROC	DATASETS
30         	LIBRARY=HIVEDB
31         	;
ODBC: AUTOCOMMIT is YES for connection 1
ODBC: &lt;STRONG&gt;Called SQLTables with schema of NULL&lt;/STRONG&gt;
WARNING: No matching members in directory.
32         RUN;&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I can do a SHOW TABLES via an explicit pass through, but nothing gets returned to the SAS log.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Question 1&lt;/STRONG&gt;:&amp;nbsp; I notice in my log from my Proc Datasets the message, "Called SQLTables with schema of NULL".&amp;nbsp; Is there a way to tell Proc Datasets that I want a particular schema?&amp;nbsp; I did specify a schema on my Libname, but apparently it's not coded correctly or isn't getting passed to Proc Datasets.&amp;nbsp; Here's my Libname:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;libname hiveDB odbc noprompt="uid=jbarbour; pwd=XXXXXXXXX; dsn=OPSI_HIVE_STG1; 
host=dbms0502; port=10000;schema=edps; authmech=3";
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Question 2&lt;/STRONG&gt;:&amp;nbsp; Is there a way I can redirect the output of SHOW TABLES to the SAS log?&amp;nbsp; I have the following options in effect:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;options SASTRACE=",,,ds" sastraceloc=saslog nostsuffix;
options source source2 mprint fullstimer notes fmterr;
options msglevel=i; 
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you in advance,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Jim&lt;/P&gt;</description>
      <pubDate>Thu, 05 Mar 2020 21:39:58 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/List-Hadoop-Tables-inside-SAS/m-p/629959#M18864</guid>
      <dc:creator>jimbarbour</dc:creator>
      <dc:date>2020-03-05T21:39:58Z</dc:date>
    </item>
    <item>
      <title>update target table with another source table using partial variables</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/update-target-table-with-another-source-table-using-partial/m-p/629873#M18862</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would like to learn how to update a target table with values from another table. There are about 1000 variables in the target table and only about 350 variables (chat or num variables) need to be updated from a renew dataset. The rules are&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;If the renew dataset has value in the variables then use that value in the target dataset.&lt;/LI&gt;&lt;LI&gt;If the renew dataset has missing value then keep the value in the target dataset.&lt;/LI&gt;&lt;LI&gt;If both target and renew datasets have missing value then keep the value as missing.&amp;nbsp;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have generated a very simple example.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;data target;&lt;BR /&gt;input id1$ id2 a b c$ d$ e f;&lt;BR /&gt;datalines;&lt;BR /&gt;1 10 6 8 x1 x2 9 3&lt;BR /&gt;2 20 5 . y1 . 6 .&lt;BR /&gt;;&lt;BR /&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;data renew;&lt;BR /&gt;input id1$ id2 a b c$ d$ e f;&lt;BR /&gt;datalines;&lt;BR /&gt;1 . 6 8 x10 . 9 30&lt;BR /&gt;2 20 . 3 . y20 6 .&lt;BR /&gt;;&lt;BR /&gt;run;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;the data want will be&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;id1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;id2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;a&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;b&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;c&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;d&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;e&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;f&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;8&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;x10&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;x2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;9&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;30&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;&lt;P&gt;2&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;20&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;5&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;3&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;y1&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;y20&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;6&lt;/P&gt;&lt;/TD&gt;&lt;TD&gt;&lt;P&gt;.&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you so much.&lt;/P&gt;</description>
      <pubDate>Thu, 05 Mar 2020 16:43:07 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/update-target-table-with-another-source-table-using-partial/m-p/629873#M18862</guid>
      <dc:creator>CHL0320</dc:creator>
      <dc:date>2020-03-05T16:43:07Z</dc:date>
    </item>
  </channel>
</rss>

