<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Snowflake temp table to SAS dataset in SAS Programming</title>
    <link>https://communities.sas.com/t5/SAS-Programming/Snowflake-temp-table-to-SAS-dataset/m-p/980635#M378966</link>
    <description>&lt;P&gt;Not sure about Snowflake, but in other databases that allow temporary tables, they usually appear in a separate schema, usually one based on the username of the database user.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Once you figure out whether that is true for Snowflake, and which schema it uses, try adding the &lt;A href="https://documentation.sas.com/doc/en/pgmsascdc/9.4_3.5/acreldb/n1h1m1ethqxwryn1wu68m46hdp1u.htm" target="_self"&gt;SCHEMA= dataset option&lt;/A&gt; to the SAS code that uses the bulk load options.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;And if you do want to try two librefs, see if you can define them both with CONNECTION=GLOBAL so they share a single connection.&lt;/P&gt;
</description>
    <pubDate>Thu, 11 Dec 2025 14:14:15 GMT</pubDate>
    <dc:creator>Tom</dc:creator>
    <dc:date>2025-12-11T14:14:15Z</dc:date>
    <item>
      <title>Snowflake temp table to SAS dataset</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Snowflake-temp-table-to-SAS-dataset/m-p/980631#M378964</link>
      <description>&lt;P&gt;Hi all,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I’m working on fetching large tables from Snowflake into SAS WORK datasets. Here’s my situation:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We have a pre-assigned SAS library (snow) for Snowflake access, with the following settings:&lt;/P&gt;
&lt;PRE&gt;BULKLOAD=YES
BULKUNLOAD=YES
BL_INTERNAL_STAGE='@~'
BL_DELETE_DATAFILE=YES
BL_COMPRESS=YES
BL_USE_ESCAPE=YES
BL_NUM_READ_THREADS=8 /* 8 CPUs on a node */&lt;/PRE&gt;
&lt;P&gt;This library does not define a database or schema, and all developers use this shared library.&lt;/P&gt;
&lt;P&gt;I need to fetch data from a large Snowflake table (~200M rows, 7–8 columns).&lt;/P&gt;
&lt;P&gt;I want to use BULKUNLOAD (with BL_NUM_DATA_FILES) to speed up the fetch, but I’m running into some limitations.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Current Pass-Through Approach&lt;/STRONG&gt;&lt;/P&gt;
&lt;PRE&gt;proc sql;
    connect using snow as snow;
    /* EXECUTE requires a BY clause naming the connection */
    execute (use role SF_Role) by snow;
    execute (use warehouse M_VWH) by snow;
    create table work.test as
    select columnA, columnB
    from connection to snow
        (select * from database.schema.snowflake_table);
    disconnect from snow;
quit;
&lt;/PRE&gt;
&lt;P&gt;&lt;STRONG&gt;Observations:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The Snowflake pass-through query executes quickly (~8 seconds), but fetching the result into SAS WORK takes 13+ minutes.&lt;/P&gt;
&lt;P&gt;Because I’m using PROC SQL pass-through, the BULKUNLOAD and internal stage mechanisms are not used.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Using LIBNAME + DATA Step Works Fast&lt;/STRONG&gt;&lt;/P&gt;
&lt;PRE&gt;libname sf_bulk snow dsn=snowflake warehouse=M_VWH role=SF_Role
    database=db schema=Schema authdomain="Snowflake"
    BULKLOAD=YES BULKUNLOAD=YES
    BL_INTERNAL_STAGE='@~'
    BL_DELETE_DATAFILE=YES
    BL_COMPRESS=YES
    BL_USE_ESCAPE=YES
    BL_NUM_READ_THREADS=8;

data work.test;
    set sf_bulk.snowflake_table;
run;&lt;/PRE&gt;
&lt;P&gt;This approach uses BULKUNLOAD and the internal stage, and the fetch completes in ~4 minutes.&lt;/P&gt;
&lt;P&gt;I verified in Snowflake, using LIST @~, that the intermediate bulk files were created.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I cannot switch all of the code to DATA steps, because pass-through is needed for the transformation logic that runs inside Snowflake.&lt;/P&gt;
&lt;P&gt;So I thought of creating a temporary table in Snowflake with all the transformations applied, and then fetching it via SAS:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;libname sf_temp snow dsn=snowflake warehouse=M_VWH role=SF_Role
    database=db schema=Schema authdomain="Snowflake"
    BULKLOAD=YES BULKUNLOAD=YES
    BL_INTERNAL_STAGE='@~'
    BL_DELETE_DATAFILE=YES
    BL_COMPRESS=YES
    BL_USE_ESCAPE=YES
    BL_NUM_READ_THREADS=8;

proc sql;
    connect using sf_temp as sf_temp;
    /* build the transformed result as a session-scoped temporary table */
    execute (
        create temporary table temp_mytable as
        select columnA, columnB
        from database.schema.snowflake_table
    ) by sf_temp;

    create table work.test as
    select *
    from connection to sf_temp
        (select * from database.schema.temp_mytable);
    disconnect from sf_temp;
quit;&lt;/PRE&gt;
&lt;P&gt;&lt;STRONG&gt;Observations:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;This works, but BULKUNLOAD and the internal stage are not used, so the fetch is slow.&lt;/P&gt;
&lt;P&gt;I tried using a DATA step in between, but it failed because the temp table was no longer available. Presumably the DATA step read over a different connection (and therefore a different session) from the one that created the table, and Snowflake temporary tables are only visible within their own session.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Advice Needed&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;1. Is there any way to use BULKUNLOAD / internal stage with a temporary table via SAS/ACCESS?&lt;/P&gt;
&lt;P&gt;2. Would DBMSTEMP= help in this scenario, or is it unsupported with Snowflake?&lt;/P&gt;
&lt;P&gt;3. Are there any best practices for using temp tables and bulk fetch with pre-assigned libraries?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Please let me know if you have any solution for this. Thank you.&lt;/P&gt;
</description>
      <pubDate>Thu, 11 Dec 2025 12:22:39 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Snowflake-temp-table-to-SAS-dataset/m-p/980631#M378964</guid>
      <dc:creator>freshstarter</dc:creator>
      <dc:date>2025-12-11T12:22:39Z</dc:date>
    </item>
    <item>
      <title>Re: Snowflake temp table to SAS dataset</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Snowflake-temp-table-to-SAS-dataset/m-p/980635#M378966</link>
      <description>&lt;P&gt;Not sure about Snowflake, but in other databases that allow temporary tables, they usually appear in a separate schema, usually one based on the username of the database user.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Once you figure out whether that is true for Snowflake, and which schema it uses, try adding the &lt;A href="https://documentation.sas.com/doc/en/pgmsascdc/9.4_3.5/acreldb/n1h1m1ethqxwryn1wu68m46hdp1u.htm" target="_self"&gt;SCHEMA= dataset option&lt;/A&gt; to the SAS code that uses the bulk load options.&lt;/P&gt;
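&lt;P&gt;For example, something like this (a minimal sketch; the schema name is a hypothetical placeholder for wherever Snowflake actually parks the session's temporary tables):&lt;/P&gt;
&lt;PRE&gt;/* SCHEMA= as a dataset option overrides the libref's default schema
   for this one read; the libref's bulk unload options still apply */
data work.test;
    set sf_temp.temp_mytable (schema='TEMP_SCHEMA');  /* hypothetical schema name */
run;&lt;/PRE&gt;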
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;And if you do want to try two librefs, see if you can define them both with CONNECTION=GLOBAL so they share a single connection.&lt;/P&gt;
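&lt;P&gt;A minimal sketch of that idea, reusing the connection options from your post (whether the temp table actually survives between steps depends on the libref and the pass-through truly sharing the one global connection):&lt;/P&gt;
&lt;PRE&gt;/* CONNECTION=GLOBAL asks SAS to share a single connection across this
   libref and any pass-through opened with CONNECT USING */
libname sf_temp snow dsn=snowflake warehouse=M_VWH role=SF_Role
    database=db schema=Schema authdomain="Snowflake"
    CONNECTION=GLOBAL
    BULKLOAD=YES BULKUNLOAD=YES
    BL_INTERNAL_STAGE='@~'
    BL_DELETE_DATAFILE=YES
    BL_COMPRESS=YES
    BL_USE_ESCAPE=YES
    BL_NUM_READ_THREADS=8;

proc sql;
    /* CONNECT USING reuses the libref's connection rather than opening a new one */
    connect using sf_temp as sf;
    execute (
        create temporary table temp_mytable as
        select columnA, columnB
        from database.schema.snowflake_table
    ) by sf;
quit;

/* If the DATA step rides the same global connection, the session-scoped
   temp table should still exist and the bulk unload options should kick in */
data work.test;
    set sf_temp.temp_mytable;
run;&lt;/PRE&gt;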
</description>
      <pubDate>Thu, 11 Dec 2025 14:14:15 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Snowflake-temp-table-to-SAS-dataset/m-p/980635#M378966</guid>
      <dc:creator>Tom</dc:creator>
      <dc:date>2025-12-11T14:14:15Z</dc:date>
    </item>
  </channel>
</rss>

