I am trying to migrate from Oracle Exadata to Snowflake.
I use the SAS/ACCESS Interface to ODBC to connect from SAS to Snowflake.
I define a library using the Data Library Manager in SAS Management Console, and SAS uses that definition to reference the external database.
When I recreate an Oracle table in Snowflake, SAS reads the column type as VARCHAR, which is not the data type I expect.
As a result, a column that is NVARCHAR2 (2 bytes per character) in Oracle becomes VARCHAR (1 byte per character) in Snowflake.
Table definition
Oracle:
CREATE TABLE oracle_table (col1 NVARCHAR2(100));
Snowflake:
CREATE TABLE snowflake_table (col1 VARCHAR(100));
Column length seen in SAS:
Oracle: 200 bytes
Snowflake: 100 bytes
Even if I create the table in Snowflake with NVARCHAR2, the column metadata still shows VARCHAR, because Snowflake treats NVARCHAR2 as a synonym for VARCHAR; the column cannot actually be defined as NVARCHAR2.
Is it possible to change the SAS table properties so that the string columns of a Snowflake table are treated as 2 bytes per character?
Conditions
- I cannot change the Snowflake table definition to double the number of characters (e.g. VARCHAR(200)).
- The SAS/ACCESS Interface to Snowflake cannot be used.
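One approach I am exploring is the DBSASTYPE= dataset option, which SAS/ACCESS to ODBC supports to override the default column mapping on the SAS side. This is only a sketch of the idea; the libref snow, the connection details, and the doubled length 200 are placeholders for my real setup:

```sas
/* Assumed ODBC libname pointing at Snowflake (DSN and credentials are placeholders) */
libname snow odbc dsn=snowflake user=myuser password=mypass;

/* DBSASTYPE= overrides the SAS type/length for the listed columns,        */
/* so a Snowflake VARCHAR(100) is read into a 200-byte SAS character       */
/* variable (2 bytes per character) without changing the Snowflake table.  */
data work.snowflake_copy;
   set snow.snowflake_table(dbsastype=(col1='CHAR(200)'));
run;
```

If this is viable, I would still need a way to apply it through the metadata library definition rather than per data step.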
Thank you.
I do not understand your description.
Did you define the Snowflake table using Snowflake commands? Show an example of creating a table in Snowflake.
What do you mean by "the SAS data type is read as VARCHAR"? SAS data sets have only two types of variables: fixed-length character strings and floating-point numbers. All of the various character types you are talking about that exist in other databases have to be mapped onto one of those two types for the data to appear as if it were a SAS data set.
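To illustrate the point above, a minimal SAS sketch: every variable is either a fixed-length character string with an explicit byte length, or a floating-point number, regardless of what type the source database used:

```sas
/* Sketch: SAS variables come in exactly two types. */
data demo;
   length name $ 200;   /* fixed-length character variable, 200 bytes */
   name = 'abc';        /* stored blank-padded to the declared length */
   x = 1.5;             /* 8-byte floating-point numeric */
run;

/* PROC CONTENTS reports each variable as either Char (with a length) or Num */
proc contents data=demo;
run;
```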