<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Joining SAS Dataset with Teradata table in SAS Data Management</title>
    <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349915#M10531</link>
    <description>&lt;P&gt;I can only read tables from Teradata. I don't have write permission.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
    <pubDate>Fri, 14 Apr 2017 00:08:30 GMT</pubDate>
    <dc:creator>SuryaKiran</dc:creator>
    <dc:date>2017-04-14T00:08:30Z</dc:date>
    <item>
      <title>Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349913#M10529</link>
      <description>&lt;P&gt;I have a SAS dataset with 5 million records and a Teradata table with 400 million records, which I left join on one key variable in PROC SQL. I need to keep 7 variables from the SAS dataset and 2 from the Teradata table. I am looking for a more efficient way to join these tables.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any suggestions!&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Environment: SAS EG 7.1, SAS 9.4 Grid on Linux.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Note: No indexes on the Teradata table, and I am not permitted to create any.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 14 Apr 2017 00:03:15 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349913#M10529</guid>
      <dc:creator>SuryaKiran</dc:creator>
      <dc:date>2017-04-14T00:03:15Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349914#M10530</link>
      <description>&lt;P&gt;Move the SAS table into Teradata and join the tables in Teradata.&lt;/P&gt;</description>
      <pubDate>Fri, 14 Apr 2017 00:06:44 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349914#M10530</guid>
      <dc:creator>kiranv_</dc:creator>
      <dc:date>2017-04-14T00:06:44Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349915#M10531</link>
      <description>&lt;P&gt;I can only read tables from Teradata. I don't have write permission.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Fri, 14 Apr 2017 00:08:30 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349915#M10531</guid>
      <dc:creator>SuryaKiran</dc:creator>
      <dc:date>2017-04-14T00:08:30Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349918#M10532</link>
      <description>&lt;P&gt;See whether you can create a volatile table, which you probably have access to do, and then you can join with the volatile table.&lt;/P&gt;</description>
      <pubDate>Fri, 14 Apr 2017 00:14:52 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349918#M10532</guid>
      <dc:creator>kiranv_</dc:creator>
      <dc:date>2017-04-14T00:14:52Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349921#M10533</link>
      <description>&lt;P&gt;Please check this link, answered by a SAS employee; it may be helpful.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://communities.sas.com/t5/Base-SAS-Programming/Subseting-Teradata-table-with-SAS-dataset/m-p/22553#U22553" target="_blank"&gt;https://communities.sas.com/t5/Base-SAS-Programming/Subseting-Teradata-table-with-SAS-dataset/m-p/22553#U22553&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 14 Apr 2017 00:29:08 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349921#M10533</guid>
      <dc:creator>kiranv_</dc:creator>
      <dc:date>2017-04-14T00:29:08Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349948#M10534</link>
      <description>You can use hash object programming to join the tables instead of PROC SQL.</description>
      <pubDate>Fri, 14 Apr 2017 02:37:04 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349948#M10534</guid>
      <dc:creator>lakshmi_74</dc:creator>
      <dc:date>2017-04-14T02:37:04Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349989#M10535</link>
      <description>A hash would not solve this particular issue, which is the risk of moving a whole TD table into SAS for the join.&lt;BR /&gt;So first look in the thread that &lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/37783"&gt;@kiranv_&lt;/a&gt; links to.&lt;BR /&gt;An additional tip is the DBKEY= data set option.</description>
      <pubDate>Fri, 14 Apr 2017 09:39:18 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/349989#M10535</guid>
      <dc:creator>LinusH</dc:creator>
      <dc:date>2017-04-14T09:39:18Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/353221#M10537</link>
      <description>&lt;P&gt;A straightforward approach that should work would be to:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;1. First, bulk load the SAS dataset into a Teradata table:&lt;/P&gt;
&lt;P&gt;&lt;A href="https://support.sas.com/documentation/cdl/en/acreldb/63647/HTML/default/viewer.htm#a001405937.htm" target="_blank"&gt;https://support.sas.com/documentation/cdl/en/acreldb/63647/HTML/default/viewer.htm#a001405937.htm&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;2. Join the two Teradata tables (you could use an implicit join with SAS PROC SQL, or an explicit SQL pass-through join).&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;3.&amp;nbsp; Then, if necessary, pull the resulting join back down to SAS as a SAS dataset.&lt;/P&gt;
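&lt;P&gt;A minimal sketch of the explicit pass-through join in step 2 (untested; the database, table, and column names here are placeholders):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc sql;
  connect to teradata(SERVER=ABCD USER=XXXX PASSWORD=XXXXXX);
  /* the join runs inside Teradata; only the result comes back */
  create table work.result as
  select * from connection to teradata
   ( select a.keyvar, a.var1, b.var2
       from mydb.sas_upload a
       left join mydb.big_table b
         on a.keyvar = b.keyvar );
  disconnect from teradata;
quit;&lt;/CODE&gt;&lt;/PRE&gt;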
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 25 Apr 2017 15:15:43 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/353221#M10537</guid>
      <dc:creator>DavidGhan</dc:creator>
      <dc:date>2017-04-25T15:15:43Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/363665#M10958</link>
      <description>&lt;P&gt;I tried to insert data into a&lt;STRONG&gt; volatile table&lt;/STRONG&gt; in Teradata, as mentioned by &lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/37783"&gt;@kiranv_&lt;/a&gt;,&amp;nbsp;and then used explicit SQL pass-through to get my data into SAS. The issue here is that I have 1 million rows of SAS data to insert into the volatile table, and this is taking a long time. I believe we cannot use FASTLOAD or MULTILOAD when inserting into a volatile table. I used TPT and MULTISTMT, but my problem is not solved. Is there a way I can improve the performance?&lt;/P&gt;&lt;P&gt;Note: I can only create volatile tables in Teradata.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;LIBNAME TD TERADATA SERVER=ABCD   USER=XXXX PASSWORD=XXXXXX CONNECTION=GLOBAL DBMSTEMP=YES;

PROC DELETE DATA=TD.temp1;
RUN;

proc sql;
  connect to teradata(SERVER=ABCD   USER=XXXX PASSWORD=XXXXXX CONNECTION=GLOBAL);
 execute
  (CREATE VOLATILE TABLE temp1(NAME CHAR(10), STATE CHAR(3)) 
		ON COMMIT PRESERVE ROWS) 	by teradata;
  execute ( COMMIT WORK ) by teradata;
 quit;

options sastrace=',,,d' sastraceloc=saslog;
proc append  base=TD.temp1(TPT=YES multistmt=yes DBCOMMIT=10000) data=WORK.HOME FORCE ;
run;&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 02 Jun 2017 01:47:42 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/363665#M10958</guid>
      <dc:creator>SuryaKiran</dc:creator>
      <dc:date>2017-06-02T01:47:42Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/363674#M10959</link>
      <description>&lt;P&gt;This could be due to the&amp;nbsp;primary index. Here NAME is your primary index: if you do not explicitly specify a primary index, your first column becomes the primary index. Is your NAME column unique? If not, it will lead to skewing, which in turn will affect your performance. If I were you, I would make a few changes.&lt;/P&gt;
&lt;P&gt;1. Create the volatile table as a multiset table with no primary index. (No PI is best for staging tables, because rows can be inserted quickly when data can be placed on the AMPs randomly. A multiset table will not check for&amp;nbsp;duplicate records,&amp;nbsp;further enhancing performance. If you do not specify SET or MULTISET in ANSI mode, a SET table is created, and SET tables check for duplicates.)&lt;/P&gt;
&lt;P&gt;2. Declare NAME as VARCHAR(10) instead of CHAR(10).&lt;/P&gt;
&lt;P&gt;please check the code below.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE class=" language-sas"&gt;&lt;CODE class="  language-sas"&gt;proc sql;
  connect to teradata(SERVER=ABCD   USER=XXXX PASSWORD=XXXXXX CONNECTION=GLOBAL);
 execute
  (CREATE MULTISET VOLATILE TABLE temp1(NAME VARCHAR(10), STATE CHAR(3))
		 NO PRIMARY INDEX ON COMMIT PRESERVE ROWS) 	by teradata;
  execute ( COMMIT WORK ) by teradata;
 quit;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 02 Jun 2017 03:35:14 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/363674#M10959</guid>
      <dc:creator>kiranv_</dc:creator>
      <dc:date>2017-06-02T03:35:14Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364307#M10975</link>
      <description>&lt;P&gt;I did as you mentioned, but it is still taking 2 hours to send 1 million rows of SAS data to the volatile table.&lt;/P&gt;</description>
      <pubDate>Mon, 05 Jun 2017 16:15:15 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364307#M10975</guid>
      <dc:creator>SuryaKiran</dc:creator>
      <dc:date>2017-06-05T16:15:15Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364317#M10976</link>
      <description>&lt;P&gt;I did some benchmarking in our own environment, and what I found was that 1 million records took approximately 2 minutes (without fastload). This table&amp;nbsp;has around 150 columns.&lt;/P&gt;
&lt;P&gt;I do not know how much spool space is available to you, or whether that has any impact on this. If possible, please give us some info, such as how many columns are in your table. Is this the only process that takes time?&lt;/P&gt;
&lt;P&gt;If possible,&amp;nbsp;try loading your table when the workload is very light and see whether that has any impact. I would ask &lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/51161"&gt;@JBailey&lt;/a&gt;&amp;nbsp;, who is an expert and has written&amp;nbsp;quite a few&amp;nbsp;articles on moving data from SAS into Teradata.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;One of the best articles by &lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/51161"&gt;@JBailey&lt;/a&gt;&amp;nbsp;on moving data from SAS to Teradata is given below; it could be of some help.&lt;/P&gt;
&lt;P&gt;&lt;A href="https://support.sas.com/resources/papers/EffectivelyMovingSASDataintoTeradata.pdf" target="_blank"&gt;https://support.sas.com/resources/papers/EffectivelyMovingSASDataintoTeradata.pdf&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 05 Jun 2017 17:10:24 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364317#M10976</guid>
      <dc:creator>kiranv_</dc:creator>
      <dc:date>2017-06-05T17:10:24Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364398#M10978</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/83078"&gt;@SuryaKiran&lt;/a&gt;&lt;/P&gt;
&lt;P&gt;Make sure you also set DBCOMMIT=0&lt;/P&gt;
&lt;P&gt;&lt;A href="http://support.sas.com/documentation/cdl/en/acreldb/69580/HTML/default/viewer.htm#p0he4t6yjfmkhpn16qrf0cdhllu6.htm" target="_blank"&gt;http://support.sas.com/documentation/cdl/en/acreldb/69580/HTML/default/viewer.htm#p0he4t6yjfmkhpn16qrf0cdhllu6.htm&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
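&lt;P&gt;As a sketch, the options go on the target table (the libref and data set names here are just examples):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/* DBCOMMIT=0 commits once at the end instead of per batch */
proc append base=TD.temp1(MULTISTMT=YES DBCOMMIT=0)
            data=WORK.HOME FORCE;
run;&lt;/CODE&gt;&lt;/PRE&gt;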
&lt;P&gt;2 hours for 1M records is seriously slow. Is reading data from Teradata also that slow?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 05 Jun 2017 22:23:46 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364398#M10978</guid>
      <dc:creator>Patrick</dc:creator>
      <dc:date>2017-06-05T22:23:46Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364540#M10986</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/83078"&gt;@SuryaKiran&lt;/a&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;It has been a while since I have played around with this. I will take some time and see what I can come up with. I will try to get back here by Friday.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Does the DDL for temp1 reflect what you are actually loading into the database?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In the meantime, one suggestion: discuss this with your DBA and see if they will create a permanent table for you. They can let you load it and delete all the rows, but not drop it. This could allow you to FASTLOAD into it and then work from there.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
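&lt;P&gt;If the DBA agrees, the load step could look roughly like this (a sketch, assuming a permanent table named STAGE1 in a database you can write to; FASTLOAD requires the target table to be empty):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;libname td teradata server=ABCD user=XXXX password=XXXXXX;

proc sql;
  delete from td.stage1;   /* empty the table but keep its definition */
quit;

proc append base=td.stage1(FASTLOAD=YES) data=work.home FORCE;
run;&lt;/CODE&gt;&lt;/PRE&gt;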
&lt;P&gt;I will let you know what I find.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best wishes,&lt;/P&gt;
&lt;P&gt;Jeff&lt;/P&gt;</description>
      <pubDate>Tue, 06 Jun 2017 11:58:39 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364540#M10986</guid>
      <dc:creator>JBailey</dc:creator>
      <dc:date>2017-06-06T11:58:39Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364541#M10987</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/37783"&gt;@kiranv_&lt;/a&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;You are way too kind!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I really need to take some time and update the doc you referenced, but the content is mostly relevant.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best wishes,&lt;/P&gt;
&lt;P&gt;Jeff&lt;/P&gt;</description>
      <pubDate>Tue, 06 Jun 2017 12:00:42 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/364541#M10987</guid>
      <dc:creator>JBailey</dc:creator>
      <dc:date>2017-06-06T12:00:42Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365223#M11010</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;- I did some tests on different Teradata servers; it is taking almost the same time on all of them. When I created a volatile&amp;nbsp;table from Teradata SQL Assistant on the same server, using data already on that server, it took a few seconds for 500,000 records.&lt;/P&gt;&lt;P&gt;-&amp;nbsp;I tested with 100,000 records with only 1 column, a fixed length of 9, and no duplicates, and it took 6 minutes in SAS.&lt;/P&gt;&lt;P&gt;- I specified DBCOMMIT=0&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/12447"&gt;@Patrick&lt;/a&gt;&amp;nbsp;and MULTISTMT=YES, but from the log I think SAS is inserting each row individually.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;SAS Log info:&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;27  LIBNAME TEST TERADATA SERVER=ABC  USER="XXXXXXXX" PASSWORD="&amp;amp;PW." DBCOMMIT=0 CONNECTION=GLOBAL DBMSTEMP=YES ;
NOTE: Libref TEST was successfully assigned as follows: 
      Engine:        TERADATA 
      Physical Name: ABC
28         PROC DELETE DATA=test.temp1;
29         RUN;

TERADATA_19: Prepared: on connection 1
SELECT * FROM "temp1"
 
TERADATA: trforc: COMMIT WORK 
WARNING: File TEST.temp1.DATA does not exist.
NOTE: PROCEDURE DELETE used (Total process time):
      real time           0.09 seconds
      user cpu time       0.01 seconds
      system cpu time     0.00 seconds
      memory              119.50k
      OS Memory           36776.00k
      Timestamp           06/07/2017 10:06:21 PM
      Step Count                        29  Switch Count  38
      Page Faults                       0
      Page Reclaims                     14
      Page Swaps                        0
      Voluntary Context Switches        97
      Involuntary Context Switches      0
      Block Input Operations            0
      Block Output Operations           0
     
30         /* Creating a volatile table in Teradata from SAS Table*/
31         proc sql;
32           connect to teradata(SERVER=ABC  USER="XXXXXXX"
32       ! PASSWORD="&amp;amp;PW." CONNECTION=GLOBAL);

33          execute
34           (CREATE MULTISET VOLATILE TABLE temp1(Var CHAR(9))
35         	 NO PRIMARY INDEX ON COMMIT PRESERVE ROWS) 	by teradata;
 
TERADATA_20: Executed: on connection 1
CREATE MULTISET VOLATILE TABLE temp1(Var CHAR(9)) NO PRIMARY INDEX ON COMMIT PRESERVE ROWS
 
36           execute ( COMMIT WORK ) by teradata;
 
TERADATA_21: Executed: on connection 1
COMMIT WORK
 
37          quit;
NOTE: PROCEDURE SQL used (Total process time):
      real time           0.29 seconds
      user cpu time       0.01 seconds
      system cpu time     0.00 seconds
      memory              96.56k
      OS Memory           36776.00k
      Timestamp           06/07/2017 10:06:21 PM
      Step Count                        30  Switch Count  60
      Page Faults                       0
      Page Reclaims                     8
      Page Swaps                        0
      Voluntary Context Switches        187
      Involuntary Context Switches      0
      Block Input Operations            0
      Block Output Operations           8
      

38         
39         options sastrace=',,,d' sastraceloc=saslog;
 
TERADATA_22: Prepared: on connection 1
SELECT * FROM "temp1"
 
TERADATA: trforc: COMMIT WORK 
 
TERADATA_23: Prepared: on connection 1
SELECT * FROM "temp1"
 
40         proc append  base=TEST.temp1 (multistmt=yes ) data=GRIDWORK.sample1 FORCE ;
41         run;

NOTE: Appending GRIDWORK.SAMPLE1 to TEST.temp1.
 
TERADATA_24: Prepared: on connection 1
USING (A0A CHAR (9),A0B CHAR (9),A0C CHAR (9),A0D CHAR (9),A0E CHAR (9),A0F CHAR (9),A0G CHAR (9),A0H CHAR (9),A0I CHAR (9),A0J 
CHAR (9),A0K CHAR (9),A0L CHAR (9),A0M CHAR (9),A0N CHAR (9),A0O
................................................................................................................................................................................................................................................................ CHAR (9),B26 CHAR (9),B27 CHAR (9),B28 CHAR (9),B29 CHAR (9),B3A CHAR (9),B3B CHAR (9),B3C CHAR (9),B3D CHAR (9),B3E CHAR (9),B3F 
CHAR (9),B3G CHAR (9),B3H CHAR (9),B3I CHAR (9),B3J CHAR (9),B3K C4M CHAR (9),C4N CHAR (9),C4O CHAR (9),C4P CHAR (9),C4Q CHAR (9),C4R CHAR (9),C4S CHAR (9),C4T 
CHAR (9),C4U CHAR (9))INSERT INTO "temp1" ("Var")VALUES (:A0A);INSERT INTO "temp1" ("Var")VALUES (:A0B);INSERT INTO "temp1" 
("Var")VALUES (:A0C);INSERT INTO "temp1" ("Var")VALUES (:A0D);INSERT INTO "temp1" ("Var")VALUES (:A0E);INSERT INTO "temp1" 
("Var")VALUES (:A0F);INSERT INTO "temp1" ("Var")VALUES (:A0G);
................................................................................................................................................................................................
("Var")VALUES (:C4R);INSERT INTO "temp1" ("Var")VALUES (:C4S);INSERT INTO "temp1" ("Var")VALUES (:C4T);INSERT INTO "temp1" 
("Var")VALUES (:C4U);
 
NOTE: There were 100000 observations read from the data set GRIDWORK.SAMPLE1.
NOTE: 100000 observations added.
NOTE: The data set TEST.temp1 has . observations and 1 variables.
TERADATA: trforc: COMMIT WORK 
NOTE: PROCEDURE APPEND used (Total process time):
      real time           6:04.39
      user cpu time       0.08 seconds
      system cpu time     0.07 seconds
      memory              1339.15k
      OS Memory           38064.00k
      Timestamp           06/07/2017 10:12:26 PM
      Step Count                        31  Switch Count  1126
      Page Faults                       3
      Page Reclaims                     159
      Page Swaps                        0
      Voluntary Context Switches        5167
      Involuntary Context Switches      2
      Block Input Operations            0
      Block Output Operations           208
      


&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 08 Jun 2017 02:51:43 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365223#M11010</guid>
      <dc:creator>SuryaKiran</dc:creator>
      <dc:date>2017-06-08T02:51:43Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365344#M11013</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/83078"&gt;@SuryaKiran&lt;/a&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Your results closely match what I experienced yesterday. I created a 2M row data set and it took 1:57 (one hour and 57 minutes) to move it into a VOLATILE table. I backed off the size to 50K rows (2 columns - $10 and $3).&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The first test I ran was a simple INSERT (no MULTISTMT=YES) and it took slightly over 4 minutes to insert the 50K rows. Next, I specified TPT=NO MULTISTMT=YES on the target table. The 50K rows were INSERTed in just under 3 minutes, which isn't great, but it is ~33% faster than doing nothing. We can't use TPT=YES because it opens its own connection and cannot see the VOLATILE table.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I used TPT=YES FASTLOAD=YES and was able to load the 2M row table in under 7 seconds. I think it is clear that you really want to use FASTLOAD.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
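&lt;P&gt;For reference, the data set options for that FASTLOAD test look like this (a sketch; the libref and table names are placeholders, and the target must be an empty permanent table):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc append base=td.perm_stage(TPT=YES FASTLOAD=YES)
            data=work.sample2m FORCE;
run;&lt;/CODE&gt;&lt;/PRE&gt;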
&lt;P&gt;Given your situation, I would describe the problem to your DBA. They could create an empty table for you and allow you to FASTLOAD into it and delete the contents. If this approach is taken, be sure to check out the link below on DELETEing data fast. They may also allow you to create a table in a playpen. Either of these approaches would be&amp;nbsp;greatly preferred to using a VOLATILE table.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;When dealing with your DBA express concern that by not being able to create a table you will be using more Teradata resources than required. Point out that if you aren't able to move the small set of data to the server then SAS may unnecessarily read large amounts of data from Teradata into SAS (SAS will perform the join when it is best performed by Teradata).&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;When I was in SAS consulting I dealt with this type of thing quite often. If you explain it as "I am trying to make efficient use of database resources" you have a good chance of having them help you.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;So, ask for permission to create the table in a playpen and if they say "no" then ask them to create the table for you. Either way it is better than the VOLATILE table strategy.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Once you have permission to create your table, you will want to ensure that it is properly indexed and has up-to-date statistics. This will help make the join processing as efficient as possible. It may be fine to load the table, then collect statistics one time. This could work because your table will be similar each time it is loaded, since the data is deleted prior to each new load. I think I discuss statistics in the Troubleshooting SAS and Teradata Query Performance Problems paper.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;You may find these helpful:&lt;/P&gt;
&lt;P&gt;&lt;A href="https://support.sas.com/resources/papers/EffectivelyMovingSASDataintoTeradata.pdf" target="_self"&gt;Effectively Moving SAS Data into Teradata&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://support.sas.com/resources/papers/TroubleshootingSASandTeradataQueryPerformanceProblems.pdf" target="_self"&gt;Troubleshooting SAS and Teradata Query Performance Problems&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://support.sas.com/resources/papers/proceedings11/142-2011.pdf" target="_self"&gt;Teradata Parallel Transporter: Loading Your SAS Data Just Became Easier&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="http://www.scsug.org/wp-content/uploads/2013/11/The-SQL-Tuning-Checklist-Jeff-Bailey-SAS.pdf" target="_self"&gt;The SQL Tuning Checklist: Making Slow Database Queries a Thing of the Past&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://communities.sas.com/t5/SAS-Communities-Library/DBIDIRECTEXEC-GOFAST-YES-for-Database-Processing/ta-p/342717" target="_self"&gt;DBIDIRECTEXEC: GOFAST=YES for Database Processing&lt;/A&gt;&amp;nbsp;(SAS Communities Article)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://communities.sas.com/t5/SAS-Communities-Library/DELETE-Your-Teradata-Data-Fast/ta-p/221055" target="_self"&gt;DELETE Your Teradata Data Fast! &lt;/A&gt;(This will help if the DBA creates a table for you)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best wishes,&lt;/P&gt;
&lt;P&gt;Jeff&lt;/P&gt;</description>
      <pubDate>Thu, 08 Jun 2017 13:01:08 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365344#M11013</guid>
      <dc:creator>JBailey</dc:creator>
      <dc:date>2017-06-08T13:01:08Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365428#M11018</link>
      <description>&lt;P&gt;That is great insight, &lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/51161"&gt;@JBailey&lt;/a&gt;. So loading from SAS into the volatile table is what&amp;nbsp;causes the&amp;nbsp;performance issues.&lt;/P&gt;</description>
      <pubDate>Thu, 08 Jun 2017 15:53:52 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365428#M11018</guid>
      <dc:creator>kiranv_</dc:creator>
      <dc:date>2017-06-08T15:53:52Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365764#M11025</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/51161"&gt;@JBailey&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-size: 10.5pt; font-family: 'Helvetica',sans-serif; color: #333333;"&gt;These are great resources and very helpful. What if there are many users trying to join a SAS table (1M) with a Teradata table (300M)? I think the DBAs will not provide each user with a table for scratch work. That would be a waste of resources and extra pain for them to manage table usage properly, especially in very big organizations where there are thousands of SAS users. If there were a way to fastload data into a volatile table, then users wouldn't need any extra resources.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-size: 10.5pt; font-family: 'Helvetica',sans-serif; color: #333333;"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-size: 10.5pt; font-family: 'Helvetica',sans-serif; color: #333333;"&gt;Even though it took me around 2.30 hours to send my SAS data into the volatile table, do my joins in Teradata, and get my result back into SAS, this is far better than doing the joins in SAS, which took me almost 9 hours. I believe that with a fastload for volatile tables this could be done in a few minutes.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-size: 10.5pt; font-family: 'Helvetica',sans-serif; color: #333333;"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-size: 10.5pt; font-family: 'Helvetica',sans-serif; color: #333333;"&gt;By any chance, is SAS Institute working on improving the performance of loading SAS data into a volatile table (FASTLOAD for volatile tables)?&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Suryakiran&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 09 Jun 2017 16:44:22 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365764#M11025</guid>
      <dc:creator>SuryaKiran</dc:creator>
      <dc:date>2017-06-09T16:44:22Z</dc:date>
    </item>
    <item>
      <title>Re: Joining SAS Dataset with Teradata table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365767#M11026</link>
      <description>&lt;P&gt;FASTLOAD is a Teradata utility, so I do not think SAS can do anything there.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;All database tables are assigned to a group rather than a person, and it is the responsibility of the group to maintain their allotted perm space. At my previous workplace, all the permanent staging tables&amp;nbsp;were dropped immediately after their usage was done; a drop-staging-table statement&amp;nbsp;was in almost all jobs.&lt;/P&gt;</description>
      <pubDate>Fri, 09 Jun 2017 17:26:30 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Joining-SAS-Dataset-with-Teradata-table/m-p/365767#M11026</guid>
      <dc:creator>kiranv_</dc:creator>
      <dc:date>2017-06-09T17:26:30Z</dc:date>
    </item>
  </channel>
</rss>

