<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>How to use bulkload option for SAS/Interface to Snowflake in SAS Data Management</title>
    <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/617749#M18671</link>
    <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could someone give a sample code snippet for using the BULKLOAD= option with SAS/ACCESS Interface to Snowflake?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am able to connect to Snowflake successfully using a LIBNAME statement.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;libname bulk sasiosnf dsn='snowflake_dev' user=xxxxxx&lt;/P&gt;&lt;DIV class="sasSource"&gt;74 pwd=XXXXXXXXXXXX schema=xxxxx' role=xxxxx warehouse=xxxxx bulkload=yes&lt;/DIV&gt;&lt;DIV class="sasSource"&gt;74 ! bulkunload=yes bl_internal_stage="user/test1";&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;NOTE: Libref BULK was successfully assigned as follows:&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;Engine: SASIOSNF&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;Physical Name: snowflake_dev&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;proc sql;&lt;DIV class="sasSource"&gt;74 connect using bulk;&lt;/DIV&gt;&lt;DIV class="sasSource"&gt;75 PUT file:///data/user_data/hari_thatavarthy/SASTABLE2.csv @~/staged;&lt;/DIV&gt;&lt;DIV class="sasError"&gt;___&lt;/DIV&gt;&lt;DIV class="sasError"&gt;180&lt;/DIV&gt;&lt;DIV class="sasError"&gt;ERROR 180-322: Statement is not valid or it is used out of proper order.&lt;/DIV&gt;&lt;DIV class="sasSource"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;NOTE: PROC SQL set option NOEXEC and will continue to check the syntax of statements.&lt;/DIV&gt;&lt;DIV class="sasSource"&gt;76 COPY INTO mytable from @~/staged FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');&lt;/DIV&gt;&lt;DIV class="sasError"&gt;____&lt;/DIV&gt;&lt;DIV class="sasError"&gt;180&lt;/DIV&gt;&lt;DIV class="sasError"&gt;ERROR 180-322: Statement is not valid or it is used out of proper order.&lt;/DIV&gt;&lt;DIV class="sasError"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasError"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasError"&gt;I am getting the errors above.&lt;/DIV&gt;&lt;DIV class="sasError"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasError"&gt;I am not sure how to use the PUT and COPY commands. Could someone help me with this for bulk loading?&lt;/DIV&gt;&lt;/DIV&gt;</description>
    <pubDate>Thu, 16 Jan 2020 13:32:49 GMT</pubDate>
    <dc:creator>karthic2612</dc:creator>
    <dc:date>2020-01-16T13:32:49Z</dc:date>
    <item>
      <title>How to use bulkload option for SAS/Interface to Snowflake</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/617749#M18671</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could someone give a sample code snippet for using the BULKLOAD= option with SAS/ACCESS Interface to Snowflake?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am able to connect to Snowflake successfully using a LIBNAME statement.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;libname bulk sasiosnf dsn='snowflake_dev' user=xxxxxx&lt;/P&gt;&lt;DIV class="sasSource"&gt;74 pwd=XXXXXXXXXXXX schema=xxxxx' role=xxxxx warehouse=xxxxx bulkload=yes&lt;/DIV&gt;&lt;DIV class="sasSource"&gt;74 ! bulkunload=yes bl_internal_stage="user/test1";&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;NOTE: Libref BULK was successfully assigned as follows:&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;Engine: SASIOSNF&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;Physical Name: snowflake_dev&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;proc sql;&lt;DIV class="sasSource"&gt;74 connect using bulk;&lt;/DIV&gt;&lt;DIV class="sasSource"&gt;75 PUT file:///data/user_data/hari_thatavarthy/SASTABLE2.csv @~/staged;&lt;/DIV&gt;&lt;DIV class="sasError"&gt;___&lt;/DIV&gt;&lt;DIV class="sasError"&gt;180&lt;/DIV&gt;&lt;DIV class="sasError"&gt;ERROR 180-322: Statement is not valid or it is used out of proper order.&lt;/DIV&gt;&lt;DIV class="sasSource"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasNote"&gt;NOTE: PROC SQL set option NOEXEC and will continue to check the syntax of statements.&lt;/DIV&gt;&lt;DIV class="sasSource"&gt;76 COPY INTO mytable from @~/staged FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');&lt;/DIV&gt;&lt;DIV class="sasError"&gt;____&lt;/DIV&gt;&lt;DIV class="sasError"&gt;180&lt;/DIV&gt;&lt;DIV class="sasError"&gt;ERROR 180-322: Statement is not valid or it is used out of proper order.&lt;/DIV&gt;&lt;DIV class="sasError"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasError"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasError"&gt;I am getting the errors above.&lt;/DIV&gt;&lt;DIV class="sasError"&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class="sasError"&gt;I am not sure how to use the PUT and COPY commands. Could someone help me with this for bulk loading?&lt;/DIV&gt;&lt;/DIV&gt;</description>
      <pubDate>Thu, 16 Jan 2020 13:32:49 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/617749#M18671</guid>
      <dc:creator>karthic2612</dc:creator>
      <dc:date>2020-01-16T13:32:49Z</dc:date>
    </item>
    <item>
      <title>Re: How to use bulkload option for SAS/Interface to Snowflake</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/618180#M18672</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/223640"&gt;@karthic2612&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;It just so happens that I am writing an SGF paper on SAS/ACCESS Interface to Snowflake. I am using the following code to test some ideas. Keep in mind that you will need to create STAGES for some of this code to work; check out the Snowflake DDL documentation for details. The S3 example requires a valid bucket.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;/* Snowflake SGF Paper - Experiments - Jeff Bailey */
/* Test the various Snowflake load techniques with largish data */
/* 1 GB */

/* Generate non-trivial data for performance experiments */
/* 4.5M = 1 GB */

data work.large_table_1GB;
   format dt date.;

   do i = 1 to 4500000;
      dt = i + 1;
      mynum1 = i;
      text1  = "abcdefghijklmnopqrstuvwxyz0123456789";
      mynum2 = i * 10;
      text2  = "abcdefghijklmnopqrstuvwxyz0123456789";
      mynum3 = i * 10;
      text3  = "abcdefghijklmnopqrstuvwxyz0123456789";
      mynum4 = i * 10;
      text4  = "abcdefghijklmnopqrstuvwxyz0123456789";
      mynum5 = i * 10;
      text5  = "abcdefghijklmnopqrstuvwxyz0123456789";
      mynum6 = i * 10;
      output;
   end;
run;

/* If you want to do inserts, add INSERTBUFF=32000 and DBCOMMIT=0 to the LIBNAME statement */
libname snow SNOW server="something.snowflakecomputing.com" 
db=TESTDB
warehouse=TESTWH
schema=THISUSER 
user=THISUSER 
pw=MyPassword123! 
preserve_tab_names=no;
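
/* A sketch of the insert-based alternative mentioned above (not one of the timed tests):    */
/* INSERTBUFF= batches rows per insert and DBCOMMIT=0 defers the commit until the load ends. */
/* Server and credentials are the same placeholders used in the LIBNAME statement above.     */
libname snowins SNOW server="something.snowflakecomputing.com"
db=TESTDB
warehouse=TESTWH
schema=THISUSER
user=THISUSER
pw=MyPassword123!
preserve_tab_names=no
insertbuff=32000
dbcommit=0;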

/* Get a baseline for performance */
data snow.large_table_1GB;
   set large_table_1GB;
run;

options sastrace=',,,d' sastraceloc=saslog nostsuffix;

/* Load via a Snowflake internal named stage */
/* Must create an internal stage in Snowflake (via SQL) for this to work. */
data snow.large_table_1GB_named_stage (bulkload=yes bl_stage="MY_STAGE" BL_COMPRESS=yes);
   set large_table_1GB;
run;
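
/* If MY_STAGE does not exist yet, one way to create it from SAS is explicit pass-through.  */
/* This is a sketch: MY_STAGE is a placeholder, and you need CREATE STAGE privileges in the */
/* schema the libref points to.                                                              */
proc sql;
   connect using snow;
   execute (create stage if not exists MY_STAGE) by snow;
   disconnect from snow;
quit;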

/* Load via a Snowflake table stage */
data snow.large_table_1GB_table_stage (bulkload=yes bl_internal_stage="table/sometable" BL_COMPRESS=yes);
   set large_table_1GB;
run;

/* Load via a Snowflake user stage */
data snow.large_table_1GB_user_stage (bulkload=yes bl_internal_stage="user/someuser" BL_COMPRESS=yes);
   set large_table_1GB;
run;

/* Load via an S3 bucket */
/* Must create an S3 bucket for this to work */
/* Must ssh into the Linux machine where SAS is running and obtain new credentials - kinit, getaswkey */
data snow.large_table_1GB_s3_stage (bulkload=yes bl_bucket="snowbulk-mybucket" bl_compress=yes);
   set large_table_1GB;
run;&lt;/CODE&gt;&lt;/PRE&gt;
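&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;About the PUT and COPY INTO errors in your log: those are Snowflake commands, not SAS SQL, so PROC SQL rejects them unless you send them to Snowflake through explicit pass-through (EXECUTE ... BY). A minimal, untested sketch is below; it assumes the target table and the file format my_csv_format already exist, and whether PUT is accepted this way depends on the Snowflake driver in your deployment.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-sas"&gt;proc sql;
   connect using bulk;
   /* Pass the Snowflake commands through to Snowflake instead of letting PROC SQL parse them */
   execute (PUT file:///data/user_data/hari_thatavarthy/SASTABLE2.csv @~/staged) by bulk;
   execute (COPY INTO mytable from @~/staged FILE_FORMAT = (FORMAT_NAME = 'my_csv_format')) by bulk;
   disconnect from bulk;
quit;&lt;/CODE&gt;&lt;/PRE&gt;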
&lt;P&gt;Good luck,&lt;BR /&gt;Jeff&lt;/P&gt;</description>
      <pubDate>Fri, 17 Jan 2020 19:17:02 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/618180#M18672</guid>
      <dc:creator>JBailey</dc:creator>
      <dc:date>2020-01-17T19:17:02Z</dc:date>
    </item>
    <item>
      <title>Re: How to use bulkload option for SAS/Interface to Snowflake</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/618519#M18673</link>
      <description>&lt;P&gt;Looking forward to the presentation&amp;nbsp;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/51161"&gt;@JBailey&lt;/a&gt;&amp;nbsp;!&lt;/P&gt;
&lt;P&gt;But do I understand it right that SAS doesn't set up a Snowflake stage on the fly in the bulk-load process...?&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jan 2020 13:46:14 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/618519#M18673</guid>
      <dc:creator>LinusH</dc:creator>
      <dc:date>2020-01-20T13:46:14Z</dc:date>
    </item>
    <item>
      <title>Re: How to use bulkload option for SAS/Interface to Snowflake</title>
      <link>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/618549#M18674</link>
      <description>&lt;P&gt;Hi &lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/13674"&gt;@LinusH&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;It depends on the STAGE. Table and User Stages are always present in Snowflake. Internal Named Stages must be created by the DBA, or someone with proper privileges. Configuring S3 as a Stage requires AWS configuration. It is a very interesting topic; I am beginning to write this section of the paper today.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&amp;lt;things_say_every_time_I_write_an_SGF_paper&amp;gt;&lt;/STRONG&gt;I swear this is the last SGF paper I will ever write.&lt;STRONG&gt;&amp;lt;/things_say_every_time_I_write_an_SGF_paper&amp;gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Best wishes,&lt;/P&gt;
&lt;P&gt;Jeff&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jan 2020 14:24:38 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Data-Management/How-to-use-bulkload-option-for-SAS-Interface-to-Snowflake/m-p/618549#M18674</guid>
      <dc:creator>JBailey</dc:creator>
      <dc:date>2020-01-20T14:24:38Z</dc:date>
    </item>
  </channel>
</rss>

