<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>AndrewZ Tracker</title>
    <link>https://communities.sas.com/kntur85557/tracker</link>
    <description>AndrewZ Tracker</description>
    <pubDate>Thu, 30 Apr 2026 14:53:26 GMT</pubDate>
    <dc:date>2026-04-30T14:53:26Z</dc:date>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/986857#M379997</link>
      <description>SASKiki: thanks for the tip, but our license is for "SAS/ACCESS Interface to ODBC" which I think doesn't cover a Snowflake-specific connector. Instead, it's a general ODBC connector that we also use for Microsoft SQL, MySQL, Excel, Access, etc.</description>
      <pubDate>Mon, 27 Apr 2026 21:36:26 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/986857#M379997</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2026-04-27T21:36:26Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/986845#M379995</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/481360"&gt;@artisan88&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you for your post.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As a follow-up to my earlier post, nothing has changed: SAS with ODBC still does not work with Unicode. Your workaround to use pass-through makes sense, and I considered doing something like that.&amp;nbsp;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;For speed reasons, I already set up a similar system to bulk load from SAS to Snowflake, and it supports Unicode too. I make a local TSV or JSON file. Then I use a pass-through with&amp;nbsp;&lt;SPAN&gt;PUT to load it into Snowflake, then make a normal remote table. It sounds like what you do, but in reverse, though I do not use a pre-signed URL or PROC HTTP.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I worked through my own quirks, like mapping the data types, chunking large files, adding a count verification step, a bypass option for tiny tables, and options to either create or append a table.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;For my jobs most affected by SAS's Unicode issues with Snowflake, in the future I will either move them to Python or keep the processing within Snowflake.&lt;/P&gt;</description>
      <pubDate>Mon, 27 Apr 2026 18:55:46 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/986845#M379995</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2026-04-27T18:55:46Z</dc:date>
    </item>
    <item>
      <title>Re: Develop corpus of SAS coding data to train LLM</title>
      <link>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idc-p/957747#M499</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/13884"&gt;@ballardw&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Common examples of LLMs are ChatGPT (GPT=Generative pre-trained transformer), Anthropic Claude, Google Gemini, and Meta Llama. The name LLaMa itself is a sort of pun on LLM.&amp;nbsp; People often use LLMs to generate code, autocomplete when coding, comment code, and answer questions about code.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Why does it matter? Many coders have their favorite coding LLMs and favorite coding tools (e.g., &lt;A href="https://developer.sas.com/programming/vs_code_extension" target="_self"&gt;Visual Studio&lt;/A&gt;), and an important part of my suggestion is to let coders continue to use their favorite LLMs and favorite coding tools for SAS. They may not want to be locked into SAS Viya and SAS's LLM.&amp;nbsp;&amp;nbsp;Widespread support for SAS across LLMs would continue to help SAS thrive (e.g., increasing the productivity of seasoned SAS coders, making it easier to onboard new staff to SAS), so at a deeper level, my goal is for SAS to not decline as a language or as a company. It's a complex issue, but it's a layer of protection against users turning to alternatives. It's relevant as LLM-driven coding is at the beginning of a boom, while SAS is on the sidelines.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;An important TLA here is SAS: &lt;U&gt;S&lt;/U&gt;emicolon, &lt;U&gt;A&lt;/U&gt;lways &lt;U&gt;S&lt;/U&gt;emicolon.... j/k &lt;span class="lia-unicode-emoji" title=":rolling_on_the_floor_laughing:"&gt;🤣&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 30 Jan 2025 23:48:17 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idc-p/957747#M499</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2025-01-30T23:48:17Z</dc:date>
    </item>
    <item>
      <title>Re: Develop corpus of SAS coding data to train LLM</title>
      <link>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idc-p/957708#M496</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;P&gt;&lt;SPAN class=""&gt;Status changed to:&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://communities.sas.com/t5/SAS-Product-Suggestions/idb-p/product-suggestions/status-key/delivered" target="_blank" rel="noopener"&gt;Suggestion Implemented&lt;/A&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;DIV class=""&gt;&lt;DIV class=""&gt;&lt;DIV class=""&gt;It's not implemented until the training data is shared with non-SAS LLMs like those by Meta, OpenAI, and Anthropic. I suggest posting the data to Hugging Face.&lt;/DIV&gt;&lt;/DIV&gt;&lt;/DIV&gt;</description>
      <pubDate>Thu, 30 Jan 2025 19:12:35 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idc-p/957708#M496</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2025-01-30T19:12:35Z</dc:date>
    </item>
    <item>
      <title>Re: Develop corpus of SAS coding data to train LLM</title>
      <link>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idc-p/957707#M495</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/308484"&gt;@jleirer&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;A proprietary LLM is not a great solution. The existing LLMs are smarter in general, moving at a faster pace, and integrated into a variety of tools that developers already like.&amp;nbsp;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;We don't use SAS Viya anymore, and we're not getting any more SAS licenses. Our team has been using SAS since the 1990s, but next year, IT plans not to renew the SAS contract because of high costs. I'd like to keep the SAS licenses, but I'm not sure that's an option.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Also, our IT department makes it hard to get new AI tools approved, so our SAS license will have expired before the SAS copilot software is approved.&lt;/P&gt;</description>
      <pubDate>Thu, 30 Jan 2025 19:10:02 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idc-p/957707#M495</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2025-01-30T19:10:02Z</dc:date>
    </item>
    <item>
      <title>Re: Develop corpus of SAS coding data to train LLM</title>
      <link>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idc-p/957184#M487</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/223320"&gt;@quickbluefish&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Yes &lt;span class="lia-unicode-emoji" title=":hundred_points:"&gt;💯&lt;/span&gt;! The consequences affect this issue in several ways. First, there are fewer SAS users overall, reducing organic SAS-related content on the open internet, so there is less training data. Second, because of a lack of popularity, SAS does not register as a priority, even as a second-tier language like PHP, in LLM development. Third, even if LLM developers wanted to better support SAS, they couldn't run SAS to follow the data synthesis process outlined in the &lt;A href="https://arxiv.org/pdf/2407.21783" target="_self"&gt;Llama paper&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I've been using SAS daily for about 15 years.&lt;/P&gt;</description>
      <pubDate>Fri, 24 Jan 2025 21:14:32 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idc-p/957184#M487</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2025-01-24T21:14:32Z</dc:date>
    </item>
    <item>
      <title>Develop corpus of SAS coding data to train LLM</title>
      <link>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idi-p/957178</link>
      <description>&lt;P&gt;Increasingly, software developers and data scientists rely on LLMs to help with coding, but&amp;nbsp;LLMs are poor at SAS coding. Hallucinations are common, and the LLM-generated SAS code often does not run without major changes. This combination of circumstances may lead to poor outcomes, such as SAS coders turning to alternatives like Python, which has top-tier support in LLMs, whether used in the traditional dialog format or embedded in coding assistants such as GitHub Copilot and Windsurf.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The academic paper "The Llama 3 Herd of Models" in section 4.3.1&amp;nbsp;lists their top 10 top-tier languages (notably, not including SAS), and the paper details how Meta improved the ability of LLMs to generate better code. &lt;STRONG&gt;One potential solution to LLMs' struggle with SAS coding is for SAS to emulate Meta's approach by developing a corpus of SAS-specific training data that all LLMs can freely use&lt;/STRONG&gt;. Then, SAS could publish this data set on Hugging Face and promote it to Meta, OpenAI, Google, and Anthropic.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The Llama paper gives a template for this process. In the case of SAS, coding questions and solutions could be automatically collected&amp;nbsp;from resources such as the SAS documentation (example code), this SAS forum, StackOverflow, SAS support cases, and SAS blogs (to the extent permissible by copyrights, licenses, and ToS).&amp;nbsp; Some strategies in the Llama paper: removing PII, automatic evaluation by LLMs, automatically writing unit tests, and automatically testing solutions in sandbox environments.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;An intriguing strategy would be to develop a list of Python data science and business intelligence questions, and then translate the solutions to SAS. This assumes that coders in each language are facing similar questions, but the Python questions are more abundant on the open Internet.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 24 Jan 2025 20:11:07 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Product-Suggestions/Develop-corpus-of-SAS-coding-data-to-train-LLM/idi-p/957178</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2025-01-24T20:11:07Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942590#M369605</link>
      <description>&lt;P&gt;Our organization has been on Snowflake since 2020, and we use both 64-bit and 32-bit SAS, so we have tested many combinations of SAS versions, driver versions, and bitness, but Unicode never worked once with SAS and Snowflake.&lt;/P&gt;</description>
      <pubDate>Wed, 04 Sep 2024 19:45:05 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942590#M369605</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-09-04T19:45:05Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942544#M369588</link>
      <description>&lt;P&gt;Yes, I checked that this is SAS Unicode, then double-checked, triple-checked, and then checked a few more times.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here is a screenshot:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="AndrewZ_0-1725473939156.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/99957i460FA06AADA8BDDE/image-size/medium?v=v2&amp;amp;px=400" role="button" title="AndrewZ_0-1725473939156.png" alt="AndrewZ_0-1725473939156.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 04 Sep 2024 18:19:18 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942544#M369588</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-09-04T18:19:18Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942530#M369580</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/13868"&gt;@AhmedAl_Attar&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In Snowflake, I increased the length to varchar(1000), and then, still in Snowflake, I calculated len(text_sample). This screenshot shows the maximum length of any string was 49, but in SAS, the text was still corrupt.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="AndrewZ_0-1725469824143.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/99953i7D4B96BEEF47D90E/image-size/medium?v=v2&amp;amp;px=400" role="button" title="AndrewZ_0-1725469824143.png" alt="AndrewZ_0-1725469824143.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 04 Sep 2024 17:11:46 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942530#M369580</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-09-04T17:11:46Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942479#M369558</link>
      <description>&lt;P&gt;Here are screenshots from yesterday comparing SAS Unicode mode and Excel reading the same Snowflake table via the Snowflake ODBC driver. (Excel used a DSN, while in SAS Unicode mode I tested both DSN and connection string with no difference in results.) Characters such as CJK, Russian, and Thai show as boxes in SAS.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="AndrewZ_0-1725459705709.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/99949i435D12C42829C98F/image-size/medium?v=v2&amp;amp;px=400" role="button" title="AndrewZ_0-1725459705709.png" alt="AndrewZ_0-1725459705709.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="AndrewZ_1-1725459711905.png" style="width: 400px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/99950i26D51750B6989660/image-size/medium?v=v2&amp;amp;px=400" role="button" title="AndrewZ_1-1725459711905.png" alt="AndrewZ_1-1725459711905.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Yesterday I also tested Python with pyodbc to Snowflake via DSN, and the result was correct.&lt;/P&gt;&lt;P&gt;A year ago I tested the ODBC Utility in Microsoft's MDAC, but the Unicode was corrupt. A year ago, I also tested Excel, and the Unicode was corrupt. I am not sure what changed for Excel, but it worked better yesterday than a year ago.&lt;/P&gt;</description>
      <pubDate>Wed, 04 Sep 2024 14:24:42 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942479#M369558</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-09-04T14:24:42Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942385#M369528</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/19879"&gt;@Quentin&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;A year ago I tested with the ODBC Test utility found in MDAC, but the Unicode was corrupt, so I concluded SAS could not do any better. Today I tested with Excel and Python both connected to an ODBC DSN, and the Unicode looked great, so I will try again with SAS support.&lt;/P&gt;</description>
      <pubDate>Tue, 03 Sep 2024 20:44:12 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/942385#M369528</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-09-03T20:44:12Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/936218#M368029</link>
      <description>&lt;P&gt;Quentin, the iso-8859-1 workaround is not great for me either.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you are desperate, an ugly hack is to use a pass-through query to tell Snowflake to write to a text file: see &lt;A href="https://docs.snowflake.com/en/sql-reference/sql/copy-into-location" target="_self"&gt;COPY INTO location&lt;/A&gt;&amp;nbsp;and &lt;A href="https://docs.snowflake.com/en/sql-reference/sql/get" target="_self"&gt;GET&lt;/A&gt;. Then in SAS use a regular PROC IMPORT. I haven't tried this exactly, but I do the exact opposite to bulk load large data sets quickly from SAS to Snowflake.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you have access to Snowflake support (yourself or via IT), file a ticket with Snowflake.&lt;/P&gt;</description>
      <pubDate>Thu, 18 Jul 2024 16:05:25 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/936218#M368029</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-07-18T16:05:25Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/936094#M367982</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;P&gt;Did you look at the hexcodes in the dataset?&amp;nbsp; Were they the valid UTF-8 bytes you expected?&amp;nbsp;&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;You mean use a hex editor on the .sas7bdat file? No. Based on my other tests (like one in the next paragraph), Snowflake was not sending UTF-8.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;BLOCKQUOTE&gt;&lt;P&gt;&lt;SPAN&gt;That PROC DATASETS code will just change the metadata attribute that indicates the encoding used to create the file.&amp;nbsp; Changing the metadata about the encoding of the text in the dataset will not change what is in the dataset.&amp;nbsp; It just tells future users of the data what to expect to find when they look at the data.&lt;/SPAN&gt;&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;DIV class=""&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class=""&gt;Yes, I understand. The PROC DATASETS step fixed encoding for Spanish&amp;nbsp;characters (like á, é, í, ó, ú, ñ), German characters ( like ä, ö, ü), and "smart" quotation marks usually made by Microsoft Office, but not other texts like Korean, so that implies Snowflake ODBC was sending the text as iso-8859-1 instead of utf-8.&lt;/DIV&gt;&lt;DIV class=""&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV class=""&gt;If Snowflake ODBC were sending UTF-8, PROC DATASETS would not have this effect in SAS.&lt;/DIV&gt;</description>
      <pubDate>Wed, 17 Jul 2024 20:32:58 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/936094#M367982</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-07-17T20:32:58Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/935961#M367929</link>
      <description>&lt;P&gt;Quentin, I did extensive testing on this issue in SAS and other tools. In SAS, I used SAS Unicode (utf-8) and SAS English (wlatin1).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My workaround in SAS Unicode is to run PROC DATASETS like below every time I pull in data from Snowflake, but it only gives me iso-8859-1, which seems to be a limitation of the Snowflake ODBC driver.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;proc datasets library=&amp;amp;lib noprint;
modify &amp;amp;ds / correctencoding='iso-8859-1';
quit;&lt;/PRE&gt;</description>
      <pubDate>Tue, 16 Jul 2024 21:21:44 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/935961#M367929</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-07-16T21:21:44Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/935864#M367911</link>
      <description>&lt;P&gt;I've pursued this with official SAS support, Snowflake support, unofficial channels like this online forum, and our IT department. SAS and Snowflake pointed fingers at each other, and I haven't gotten anywhere.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Based on some tests like the Microsoft ODBC Test utility in the MDAC package, the problem seems to be that the Snowflake driver never returns UTF-8, despite what Snowflake documentation states. This low-level utility is not user friendly, but easier ways to demonstrate that the problem is with the Snowflake ODBC driver are to use other ODBC clients such as Microsoft Excel or Microsoft Access.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The test procedure is simple: just create a table with a little bit of data, and then use a client like Excel to query it.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;create or replace TABLE zzz_unicode ( language_code int, text varchar(100) );
insert into zzz_unicode (language_code, text) values
(1, 'Ich kann Glasssplitter essen, es tut mir nicht weh'),
(2, 'Je peux manger du verre, ça ne me fait pas mal'),
(3, 'Posso mangiare vetro, non mi fa male'),
(4, 'Eu posso comer vidro, não me faz mal'),
(5, 'Puedo comer vidrio, no me hace daño'),
(6, 'Я могу есть битое стекло, оно мне не вредит'),
(7, 'ฉันสามารถกินแก้วแตกได้ มันไม่ทำให้ฉันเจ็บปวด'),
(8, '私は割れたガラスを食べることができます、それは私を傷つけません'),
(9, 'እኔ የተሰነጠቀ ብረት መብላት እችላለሁ፣ አይጎዳኝም'),
(10, 'ငါ ብስጭት መብላት እችላለሁ, ጎጂ አይደለም');
insert into zzz_unicode (language_code, text) values
(11, 'Я могу есть битое стекло, оно мне не вредит'),
(12, '私は割れたガラスを食べることができます、それは私を傷つけません'),
(13, '我可以吃碎玻璃，它不会伤害我');

select *
from zzz_unicode
;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you make progress, let me know please.&lt;/P&gt;</description>
      <pubDate>Mon, 15 Jul 2024 20:56:56 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/935864#M367911</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2024-07-15T20:56:56Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/893051#M352770</link>
      <description>&lt;P&gt;OK, I opened a ticket with SAS support. Let's see how that goes.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The COPY INTO is a creative suggestion. I already do the opposite for bulk loading data into Snowflake, so if I get desperate, I will check into that.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I could also write a Python program to read the data from Snowflake and pass it to SAS. Python can write SAS format via sas7bdat, but I am guessing it doesn't support Unicode. Maybe XLSX or Access would be the easiest way. This doesn't seem fun, either.&lt;/P&gt;</description>
      <pubDate>Thu, 07 Sep 2023 00:19:10 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/893051#M352770</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2023-09-07T00:19:10Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/892392#M352442</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/159"&gt;@Tom&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;How would I change the ODBC definition for Snowflake? The URL you posted was about PostgreSQL, and in the last comment, the author (not the original poster) wrote that he solved it by adding "&lt;SPAN&gt;SET CLIENT_ENCODING TO 'UTF8'" to his connection string, but I see no such option for Snowflake. &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/13674"&gt;@LinusH&lt;/a&gt;&amp;nbsp;posted a link to Snowflake documentation that states:&amp;nbsp;"the Snowflake ODBC Driver will &lt;EM&gt;always&lt;/EM&gt; use UTF-8 encoding for any STRING/VARCHAR/VARIANT type column data."&amp;nbsp;This is what I need, and I can't change it.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;Also, I do not see any encoding-related options in Snowflake &lt;A href="https://docs.snowflake.com/en/sql-reference/parameters" target="_self"&gt;session parameters&lt;/A&gt; or &lt;A href="https://docs.snowflake.com/developer-guide/odbc/odbc-parameters" target="_self"&gt;ODBC parameters&lt;/A&gt;. This is consistent with the documentation that the "ODBC driver will always use UTF-8" because there is no way to change it.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The fact that Snowflake always uses UTF-8 leads me to conclude that the problem, whether a SAS configuration issue or a SAS bug, is with SAS's ODBC integration.&lt;/P&gt;</description>
      <pubDate>Fri, 01 Sep 2023 22:50:51 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/892392#M352442</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2023-09-01T22:50:51Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/892387#M352440</link>
      <description>&lt;P&gt;I created a Unicode table in Snowflake via DBeaver with this code&lt;/P&gt;&lt;PRE&gt;create or replace TABLE zzz_unicode ( language_code int, text varchar(100) );
insert into zzz_unicode (language_code, text) values
(1, 'Ich kann Glasssplitter essen, es tut mir nicht weh'),
(2, 'Je peux manger du verre, ça ne me fait pas mal'),
(3, 'Posso mangiare vetro, non mi fa male'),
(4, 'Eu posso comer vidro, não me faz mal'),
(5, 'Puedo comer vidrio, no me hace daño'),
(6, 'Я могу есть битое стекло, оно мне не вредит'),
(7, 'ฉันสามารถกินแก้วแตกได้ มันไม่ทำให้ฉันเจ็บปวด'),
(8, '私は割れたガラスを食べることができます、それは私を傷つけません'),
(9, 'እኔ የተሰነጠቀ ብረት መብላት እችላለሁ፣ አይጎዳኝም'),
(10, 'ငါ ብስጭት መብላት እችላለሁ, ጎጂ አይደለም');
insert into zzz_unicode (language_code, text) values
(11, 'Я могу есть битое стекло, оно мне не вредит'),
(12, '私は割れたガラスを食べることができます、それは私を傷つけません'),
(13, '我可以吃碎玻璃，它不会伤害我');

select *
from zzz_unicode
;&lt;/PRE&gt;&lt;P&gt;This shows how the table looks in DBeaver (looks good) vs SAS PROC PRINT (looks bad).&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="I can eat broken glass.png" style="width: 880px;"&gt;&lt;img src="https://communities.sas.com/t5/image/serverpage/image-id/87562iFD437EFC99912DF9/image-size/large?v=v2&amp;amp;px=999" role="button" title="I can eat broken glass.png" alt="I can eat broken glass.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here is the SAS log for the HEX test you suggested. The character 1A is SUB, and 20 is space, so we see the mojibake in HEX too. Also, it doesn't like encoding=any.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;348
349 data test;
350 set jet.zzz_unicode(encoding=any);
--------
76
WARNING 76-63: The option ENCODING is not implemented in the ODBC engine.

351 if language_code in (6,13);
352 put (text) (=$hex. /);
353 run;

TEXT=1A201A1A1A1A201A1A1A1A201A1A1A1A1A201A1A1A1A1A1A2C201A1A1A201A1A1A201A1A201A1A1A1A1A1A20202020202020202020202020202020202020202020202020202020202
0202020202020202020202020202020202020202020202020202020
TEXT=1A1A1A1A1A1A1A1A1A1A1A1A1A1A202020202020202020202020202020202020202020202020202020202020202020202020202020202020202020202020202020202020202020202
0202020202020202020202020202020202020202020202020202020&lt;/PRE&gt;</description>
      <pubDate>Fri, 01 Sep 2023 20:48:16 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/892387#M352440</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2023-09-01T20:48:16Z</dc:date>
    </item>
    <item>
      <title>Re: Reading Unicode from Snowflake into SAS via ODBC</title>
      <link>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/892341#M352438</link>
      <description>&lt;P&gt;&lt;a href="https://communities.sas.com/t5/user/viewprofilepage/user-id/13674"&gt;@LinusH&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;"proc options option=encoding; run;" tells me my encoding is UTF-8, and when I check the properties of the SAS data set under WORK (created by querying Snowflake), the encoding is UTF-8 too.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;What do you mean by a clean location? I am creating a v9 SAS data set in WORK.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 01 Sep 2023 16:21:45 GMT</pubDate>
      <guid>https://communities.sas.com/t5/SAS-Programming/Reading-Unicode-from-Snowflake-into-SAS-via-ODBC/m-p/892341#M352438</guid>
      <dc:creator>AndrewZ</dc:creator>
      <dc:date>2023-09-01T16:21:45Z</dc:date>
    </item>
  </channel>
</rss>

