<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Loading huge data into snowflake table in SAS Data Management</title>
    <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894460#M20843</link>
    <description>&lt;P&gt;Log details are below. SAS is reading 80k records, but why is it treating them as nulls even though the data is present?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="samanvi_1-1694778246059.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/88029i39E1F9C860C08F41/image-size/medium?v=v2&amp;amp;px=400" role="button" title="samanvi_1-1694778246059.png" alt="samanvi_1-1694778246059.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="samanvi_2-1694778274979.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/88030i34908B7533AF94D9/image-size/medium?v=v2&amp;amp;px=400" role="button" title="samanvi_2-1694778274979.png" alt="samanvi_2-1694778274979.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="samanvi_3-1694778339998.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/88031iACA60A4B1D4E6840/image-size/medium?v=v2&amp;amp;px=400" role="button" title="samanvi_3-1694778339998.png" alt="samanvi_3-1694778339998.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Fri, 15 Sep 2023 11:51:40 GMT</pubDate>
    <dc:creator>samanvi</dc:creator>
    <dc:date>2023-09-15T11:51:40Z</dc:date>
    <item>
      <title>Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894408#M20835</link>
      <description>Hi everyone, I need your help. We are pulling data from SQL Server and loading it into a Snowflake table. We are using PROC APPEND with the FORCE option, but during the process we get a CLI error and the job fails. We are dealing with 60 million records. I tried the bulk load option too, but it throws an error on the amount column when it sees the character data "ABC".</description>
      <pubDate>Fri, 15 Sep 2023 01:06:39 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894408#M20835</guid>
      <dc:creator>samanvi</dc:creator>
      <dc:date>2023-09-15T01:06:39Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894411#M20836</link>
      <description>&lt;P&gt;Sounds like a data type mismatch, but you will have to provide much more information for us to be of any help.&lt;/P&gt;
&lt;P&gt;If this is just a replication of SQL Server tables to Snowflake without any transformations, then also consider whether you could run a process that doesn't pass the data through SAS but loads directly from SQL Server into Snowflake. You could still use SAS to trigger the process. Such a direct load should perform better, but you would of course need the direct connectivity between SQL Server and Snowflake established.&lt;/P&gt;
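&lt;P&gt;One quick check for type mismatches is to compare the column metadata SAS sees on each side. A minimal sketch, assuming librefs abc (SQL Server) and def (Snowflake) and hypothetical table names:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/* List columns whose SAS type differs between source and target */
proc sql;
  select s.name, s.type as source_type, t.type as target_type
  from dictionary.columns as s
  inner join dictionary.columns as t
    on upcase(s.name) = upcase(t.name)
  where s.libname = 'ABC' and s.memname = 'TEST'
    and t.libname = 'DEF' and t.memname = 'WANT'
    and s.type ne t.type;
quit;&lt;/CODE&gt;&lt;/PRE&gt;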
&lt;P&gt;&lt;A href="https://estuary.dev/sql-server-to-snowflake/" target="_self"&gt;&amp;nbsp;4 Methods to Transfer Data from SQL Server to Snowflake&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 01:22:47 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894411#M20836</guid>
      <dc:creator>Patrick</dc:creator>
      <dc:date>2023-09-15T01:22:47Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894412#M20837</link>
      <description>&lt;P&gt;Forgot to add the code:&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;libname abc sqlsvr uid=xxx pwd=xxx;
libname def sasiosnf insertbuff=5000 server='' uid='' pwd='';

proc sql;
  create table have as
  select monotonic() as key, *
  from abc.test;
quit;

proc append base=def.want data=have force;
run;&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Fri, 15 Sep 2023 01:29:42 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894412#M20837</guid>
      <dc:creator>samanvi</dc:creator>
      <dc:date>2023-09-15T01:29:42Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894413#M20838</link>
      <description>&lt;P&gt;Thanks for your response. There are other processes running in the code where we pull data from multiple files, do data cleansing, and load them to the Snowflake tables. Those are not causing any issue, but in this process we are creating a key column on the original data and appending the data to the Snowflake table.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 01:34:37 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894413#M20838</guid>
      <dc:creator>samanvi</dc:creator>
      <dc:date>2023-09-15T01:34:37Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894416#M20839</link>
      <description>&lt;P&gt;Please post the full SAS log so we can see notes and errors. I don't see any use of bulk loading.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 01:42:00 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894416#M20839</guid>
      <dc:creator>SASKiwi</dc:creator>
      <dc:date>2023-09-15T01:42:00Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894420#M20840</link>
      <description>&lt;P&gt;error:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="samanvi_0-1694744202464.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/88024iA11F1FF3C544634F/image-size/medium?v=v2&amp;amp;px=400" role="button" title="samanvi_0-1694744202464.png" alt="samanvi_0-1694744202464.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 02:16:50 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894420#M20840</guid>
      <dc:creator>samanvi</dc:creator>
      <dc:date>2023-09-15T02:16:50Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894423#M20841</link>
      <description>&lt;P&gt;We need to see both the source code and the notes and errors together so we can see which statements are causing them.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 02:32:47 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894423#M20841</guid>
      <dc:creator>SASKiwi</dc:creator>
      <dc:date>2023-09-15T02:32:47Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894424#M20842</link>
      <description>&lt;P&gt;Do NOT use monotonic()! This is an undocumented and unsupported function and it will not necessarily return the expected result especially when used with a database table as source.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 03:41:12 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894424#M20842</guid>
      <dc:creator>Patrick</dc:creator>
      <dc:date>2023-09-15T03:41:12Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894460#M20843</link>
      <description>&lt;P&gt;Log details are below. SAS is reading 80k records, but why is it treating them as nulls even though the data is present?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="samanvi_1-1694778246059.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/88029i39E1F9C860C08F41/image-size/medium?v=v2&amp;amp;px=400" role="button" title="samanvi_1-1694778246059.png" alt="samanvi_1-1694778246059.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="samanvi_2-1694778274979.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/88030i34908B7533AF94D9/image-size/medium?v=v2&amp;amp;px=400" role="button" title="samanvi_2-1694778274979.png" alt="samanvi_2-1694778274979.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="samanvi_3-1694778339998.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/88031iACA60A4B1D4E6840/image-size/medium?v=v2&amp;amp;px=400" role="button" title="samanvi_3-1694778339998.png" alt="samanvi_3-1694778339998.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 11:51:40 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894460#M20843</guid>
      <dc:creator>samanvi</dc:creator>
      <dc:date>2023-09-15T11:51:40Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894592#M20844</link>
      <description>&lt;P&gt;Does your program work with a small amount of data? Try just loading say 1000 rows. If that works try scaling up to 10K, 50K, 100K.&lt;/P&gt;
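&lt;P&gt;The obs= dataset option makes this easy; a sketch assuming the libref and dataset names posted earlier in the thread:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/* Smoke test: push only the first 1000 rows to Snowflake */
proc append base=def.want data=have (obs=1000) force;
run;&lt;/CODE&gt;&lt;/PRE&gt;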
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Also, please don't screenshot your SAS logs. Just do a normal copy and paste using the &amp;lt;/&amp;gt; menu option.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Sep 2023 23:53:08 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894592#M20844</guid>
      <dc:creator>SASKiwi</dc:creator>
      <dc:date>2023-09-15T23:53:08Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894768#M20845</link>
      <description>&lt;P&gt;For me, best practice is not to use FORCE.&lt;/P&gt;
&lt;P&gt;You should have control of your ETL process, and explicitly tell what data should go where. Don't allow any warnings in your log for production pipelines.&lt;/P&gt;
&lt;P&gt;Second, instead of monotonic(), consider using Snowflake's AUTOINCREMENT.&lt;/P&gt;
&lt;P&gt;&lt;A href="https://docs.snowflake.com/en/sql-reference/sql/create-table#syntax" target="_blank"&gt;https://docs.snowflake.com/en/sql-reference/sql/create-table#syntax&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;You still haven't shared the complete log with your program (including the libname and options).&lt;/P&gt;</description>
      <pubDate>Mon, 18 Sep 2023 12:09:38 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894768#M20845</guid>
      <dc:creator>LinusH</dc:creator>
      <dc:date>2023-09-18T12:09:38Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894788#M20846</link>
      <description>&lt;P&gt;Just curious, are you using SAS/ACCESS to Snowflake or perhaps Snowflake's ODBC driver?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Agree with earlier suggestion to try pushing just 5 records (or even 1 record) to snowflake. Are you able to reliably query data from snowflake into SAS?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;What happens if you try to create a new table in Snowflake, instead of append to an existing table, e.g.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc append base=abc.NewSnowflakeTable data=AFFIL_1 (obs=5);
run;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;The ? in the log don't mean the values are null, so I wouldn't worry about them.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Googling the error message for 400 errors from snowflake turns up plenty of hits, e.g.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://community.snowflake.com/s/article/Solution-400-Bad-Request-Login-Errors-using-SAML-SSO-Federation-with-Snowflake" target="_blank"&gt;https://community.snowflake.com/s/article/Solution-400-Bad-Request-Login-Errors-using-SAML-SSO-Federation-with-Snowflake&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://www.googlecloudcommunity.com/gc/Technical-Tips-Tricks/SnowflakeSQLException-JDBC-Driver-encountered-communication/ta-p/586828" target="_blank"&gt;https://www.googlecloudcommunity.com/gc/Technical-Tips-Tricks/SnowflakeSQLException-JDBC-Driver-encountered-communication/ta-p/586828&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 18 Sep 2023 14:22:03 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894788#M20846</guid>
      <dc:creator>Quentin</dc:creator>
      <dc:date>2023-09-18T14:22:03Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894885#M20847</link>
      <description>&lt;P&gt;Another source of information could be the log in Snowsight.&lt;/P&gt;
&lt;P&gt;What do you see there?&lt;/P&gt;</description>
      <pubDate>Tue, 19 Sep 2023 07:05:04 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/894885#M20847</guid>
      <dc:creator>LinusH</dc:creator>
      <dc:date>2023-09-19T07:05:04Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895073#M20849</link>
      <description>&lt;P&gt;Thank you for all your help. We analyzed almost 59 million records and found some special characters at the end of the data we were trying to insert into one of the columns; that was causing the error. We suppressed them and the data is now loading into the table.&lt;/P&gt;
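&lt;P&gt;The suppression was along these lines (column name is a placeholder):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data have_clean;
  set have;
  /* keep only printable characters in the offending column */
  text_col = compress(text_col, , 'kw');
run;&lt;/CODE&gt;&lt;/PRE&gt;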
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;But loading these 59 million records is taking a lot of time: about 20 minutes per 1 million rows, and we have only 55 columns.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Sep 2023 15:11:27 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895073#M20849</guid>
      <dc:creator>samanvi</dc:creator>
      <dc:date>2023-09-20T15:11:27Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895078#M20850</link>
      <description>&lt;P&gt;Sounds like you have some truncation going on. Perhaps you are transcoding the data from a single-byte encoding to a multi-byte encoding, so the length of the target variables is too short and some multi-byte characters are being truncated, leaving them invalid.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
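&lt;P&gt;If the driver's built-in bulk load can't be used, a rough sketch of the manual route is exporting a delimited file and loading it with Snowflake's COPY INTO (path, stage, and table names are hypothetical):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/* 1. Export the SAS table to CSV */
proc export data=have outfile='/tmp/have.csv' dbms=csv replace;
run;

/* 2. Then stage and load it inside Snowflake (SnowSQL or passthrough):
   PUT file:///tmp/have.csv @my_stage;
   COPY INTO want FROM @my_stage/have.csv.gz
     FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1
                    FIELD_OPTIONALLY_ENCLOSED_BY = '"');
*/&lt;/CODE&gt;&lt;/PRE&gt;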
&lt;P&gt;For improved transfer speed, use bulk loading. If your driver does not support it, then roll your own by exporting to a delimited file and then using the Snowflake command (COPY INTO) to bulk load the file.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Sep 2023 15:16:29 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895078#M20850</guid>
      <dc:creator>Tom</dc:creator>
      <dc:date>2023-09-20T15:16:29Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895081#M20851</link>
      <description>&lt;P&gt;Tried to use bulk load in the libname, but we hit a problem when the 1st column is numeric with nulls/no data and the 2nd is character with data: the second column shifts its position to the left. The character column then moves into the numeric slot and the load fails with a data type mismatch error.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Sep 2023 15:28:58 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895081#M20851</guid>
      <dc:creator>samanvi</dc:creator>
      <dc:date>2023-09-20T15:28:58Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895082#M20852</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;HR /&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/368795"&gt;@samanvi&lt;/a&gt;&amp;nbsp;wrote:&lt;BR /&gt;
&lt;P&gt;Tried to use bulk load in libname but we got problem if I have 2 columns&amp;nbsp; with 1st is numeric&amp;nbsp; which has&amp;nbsp; nulls/nodata&amp;nbsp; and 2nd with data(char). the second column is moving it's position to left. In this case the character column is moving to numeric side and getting failed with error data type mismatch.&lt;/P&gt;
&lt;HR /&gt;&lt;/BLOCKQUOTE&gt;
&lt;P&gt;Using bulkload options from SAS?&amp;nbsp; Raise a support ticket with SAS.&lt;/P&gt;
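&lt;P&gt;For context, a DSD-style export from SAS quotes values that contain the delimiter, and FIELD_OPTIONALLY_ENCLOSED_BY is the rough equivalent on the Snowflake COPY side. A sketch with hypothetical names:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;data _null_;
  set have;
  file '/tmp/have.csv' dsd dlm=',';
  put (_all_) (+0);
run;&lt;/CODE&gt;&lt;/PRE&gt;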
&lt;P&gt;Writing your own code?&amp;nbsp; Sounds like somewhere you forgot to tell SNOWFLAKE to use the equivalent of the DSD option of the FILE or INFILE statement in SAS.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Sep 2023 15:31:26 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895082#M20852</guid>
      <dc:creator>Tom</dc:creator>
      <dc:date>2023-09-20T15:31:26Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895098#M20853</link>
      <description>&lt;P&gt;We are just remediating the existing code. Our source is SQL Server; we are creating a physical key and loading the data to the Snowflake table. SAS is acting as a bridge to load the data.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Sep 2023 16:59:56 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895098#M20853</guid>
      <dc:creator>samanvi</dc:creator>
      <dc:date>2023-09-20T16:59:56Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895151#M20854</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;HR /&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/368795"&gt;@samanvi&lt;/a&gt;&amp;nbsp;wrote:&lt;BR /&gt;
&lt;P&gt;...&lt;/P&gt;
&lt;P&gt;But loading of this 59Mill is taking a lot of time. for every 1 Mill it's taking 20 min. We have 55 columns only.&lt;/P&gt;
&lt;HR /&gt;&lt;/BLOCKQUOTE&gt;
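&lt;P&gt;Buffering and commit options on the Snowflake libname usually drive insert throughput; a hedged example (values need tuning for your own data and memory):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;libname def sasiosnf server='' uid='' pwd=''
  readbuff=32000 insertbuff=32000 dbcommit=0; /* dbcommit=0: commit once at the end */&lt;/CODE&gt;&lt;/PRE&gt;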
&lt;P&gt;Look into the libname options readbuff, insertbuff, and dbcommit.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Sep 2023 00:15:09 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/895151#M20854</guid>
      <dc:creator>Patrick</dc:creator>
      <dc:date>2023-09-21T00:15:09Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data into snowflake table</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/943108#M20999</link>
      <description>&lt;P&gt;Hi there! It sounds like you're dealing with a significant volume of data. For large-scale data migrations like pulling data from SQL Server into Snowflake, using the appropriate approach can make a big difference. You might want to explore a step-by-step guide on how to load data from SQL Server to Snowflake, which also highlights best practices to avoid common errors. Here's a comprehensive guide: &lt;STRONG&gt;&lt;A href="https://hevodata.com/learn/how-to-load-data-from-sql-server-to-snowflake/" target="_self"&gt;SQL Server to Snowflake&lt;/A&gt;&lt;/STRONG&gt;. It may help you resolve the CLI and data type issues you're encountering.&lt;/P&gt;</description>
      <pubDate>Mon, 09 Sep 2024 08:53:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/Loading-huge-data-into-snowflake-table/m-p/943108#M20999</guid>
      <dc:creator>samrat1507</dc:creator>
      <dc:date>2024-09-09T08:53:05Z</dc:date>
    </item>
  </channel>
</rss>

