<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic DI APPEND in SAS Data Management</title>
    <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/717952#M19782</link>
    <description>&lt;P&gt;I have an APPEND process that duplicates data if I run it twice. Is there some way to avoid data duplication using APPEND?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Tue, 09 Feb 2021 17:14:09 GMT</pubDate>
    <dc:creator>Rogerio_Alves</dc:creator>
    <dc:date>2021-02-09T17:14:09Z</dc:date>
    <item>
      <title>DI APPEND</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/717952#M19782</link>
      <description>&lt;P&gt;I have an APPEND process that duplicates data if I run it twice. Is there some way to avoid data duplication using APPEND?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 09 Feb 2021 17:14:09 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/717952#M19782</guid>
      <dc:creator>Rogerio_Alves</dc:creator>
      <dc:date>2021-02-09T17:14:09Z</dc:date>
    </item>
    <item>
      <title>Re: DI APPEND</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718207#M19785</link>
      <description>Add some code after PROC APPEND to remove the duplicated observations, like this:&lt;BR /&gt;&lt;BR /&gt;proc append base=have data=temp force;run;&lt;BR /&gt;proc sql;&lt;BR /&gt;create table want as&lt;BR /&gt;select distinct * from have;&lt;BR /&gt;quit;</description>
      <pubDate>Wed, 10 Feb 2021 12:29:11 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718207#M19785</guid>
      <dc:creator>Ksharp</dc:creator>
      <dc:date>2021-02-10T12:29:11Z</dc:date>
    </item>
    <item>
      <title>Re: DI APPEND</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718234#M19786</link>
      <description>&lt;P&gt;For production-ready code, you shouldn't be able to run the process twice with the same data.&lt;/P&gt;
&lt;P&gt;If you can't guarantee that, redesign the job to either use an Update loading strategy or, as&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/18408"&gt;@Ksharp&lt;/a&gt;&amp;nbsp;suggests, add a post-process that removes duplicates.&lt;/P&gt;
&lt;P&gt;This might be fine if your data volume is manageable, but it will certainly affect performance.&lt;/P&gt;</description>
      <pubDate>Wed, 10 Feb 2021 13:31:20 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718234#M19786</guid>
      <dc:creator>LinusH</dc:creator>
      <dc:date>2021-02-10T13:31:20Z</dc:date>
    </item>
    <item>
      <title>Re: DI APPEND</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718261#M19787</link>
      <description>Thanks a lot for the suggestion! Do you think PROC SORT could perform better?</description>
      <pubDate>Wed, 10 Feb 2021 14:44:23 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718261#M19787</guid>
      <dc:creator>ralves2</dc:creator>
      <dc:date>2021-02-10T14:44:23Z</dc:date>
    </item>
    <item>
      <title>Re: DI APPEND</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718532#M19788</link>
      <description>I think both should have about the same performance, since both require multiple passes through the data.&lt;BR /&gt;But you could test it. Please post the comparison results.</description>
      <pubDate>Thu, 11 Feb 2021 11:16:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718532#M19788</guid>
      <dc:creator>Ksharp</dc:creator>
      <dc:date>2021-02-11T11:16:05Z</dc:date>
    </item>
    <item>
      <title>Re: DI APPEND</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718552#M19789</link>
      <description>Thanks a lot for all the attention here. I will try both and let you know.&lt;BR /&gt;</description>
      <pubDate>Thu, 11 Feb 2021 12:09:33 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/718552#M19789</guid>
      <dc:creator>ralves2</dc:creator>
      <dc:date>2021-02-11T12:09:33Z</dc:date>
    </item>
    <item>
      <title>DI APPEND -</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/721810#M19809</link>
      <description>&lt;P&gt;To avoid duplicating data with an APPEND, I created an index in DI on some columns. When I run the job, it executes but writes the message below, and the job finishes with an EXIT status in Flow Manager.&lt;/P&gt;</description>
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;ERROR ON LOG:&lt;BR /&gt;&lt;BR /&gt;Add/Update failed for data set &amp;lt;TABLE TARGET&amp;gt; because data value(s) do not comply with integrity constraint.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;How could I solve that?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Thu, 25 Feb 2021 11:18:08 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/721810#M19809</guid>
      <dc:creator>Rogerio_Alves</dc:creator>
      <dc:date>2021-02-25T11:18:08Z</dc:date>
    </item>
    <item>
      <title>Re: DI APPEND -</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/723239#M19825</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/109541"&gt;@Rogerio_Alves&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I wonder how you came to get duplicate data. The DI Studio Append transformation (at least in my version, 4.9) always deletes the output&amp;nbsp;table before appending (check the transformation code under Properties -&amp;gt; Code), so it should be impossible to get duplicates unless your output table is also input to a step before the append, so that existing data goes into the append together with new data.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;It is a little bit difficult to imagine what's going on, so please post a screenshot of your job canvas.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 03 Mar 2021 18:39:11 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/DI-APPEND/m-p/723239#M19825</guid>
      <dc:creator>ErikLund_Jensen</dc:creator>
      <dc:date>2021-03-03T18:39:11Z</dc:date>
    </item>
  </channel>
</rss>