☑ This topic is solved.
persephone
Fluorite | Level 6

We are moving from Netezza to Snowflake and have discovered that SAS handles BIGINT values of up to 15 digits but loses precision on anything longer. Is there a way to capture BIGINT values of more than 15 digits without casting them to character?

 


7 REPLIES
LinusH
Tourmaline | Level 20

According to the documentation - no:

Note: When performing calculations on numeric values and when storing numeric values, SAS maintains up to 15 digits of precision. When you read values that contain more than 15 decimal digits of precision from a database into SAS, the values that SAS reads are rounded to meet this condition. When you use a large numeric value in a WHERE clause, this rounding can cause unexpected results, such as not selecting desired rows. For noncomputational purposes, such as storing ID values or credit card numbers, you can read the data in as character data.

...

When you read numbers in from an external DBMS that supports precision beyond 15 digits, you can lose that precision. You cannot do anything about this for existing databases. However, when you design new databases, you can set constraints to limit precision to about 15 digits. Alternatively, you can select a numeric DBMS data type to match the numeric SAS data type.
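To make that concrete, here is a minimal sketch (the ID values are made up) of two 18-digit keys that become indistinguishable once SAS stores them as 8-byte floating-point numbers:

/* Two 18-digit values that differ only in the last digit.            */
/* An 8-byte double cannot hold 18 significant digits, so SAS rounds   */
/* both to the same stored value and the comparison below is true.     */
data _null_;
   id1 = 123456789012345678;
   id2 = 123456789012345677;
   put id1= best32. id2= best32.;
   if id1 = id2 then
      put 'NOTE: the two IDs are indistinguishable after rounding.';
run;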

 

Data never sleeps
persephone
Fluorite | Level 6
Thank you for your prompt response. We identified the issue through the WHERE clause behavior. Reading the data in as character is not desirable because the variables are keys. The only solution we have at this point is to remap the keys.
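In case it helps others, a rough sketch of the remapping idea (dataset and column names are hypothetical): bring the BIGINT key across as character so nothing is lost, then assign a surrogate numeric key that fits comfortably within 15 digits.

/* big_keys is a hypothetical dataset whose character column old_key_c */
/* holds the original BIGINT key read as text.                         */
data key_map;
   set big_keys;
   new_key = _n_;   /* sequential surrogate key, well within 15 digits */
run;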
SASKiwi
PROC Star

"Reading the data in as character is not a desirable method because the variables are keys."  - In my experience, most data keys are character, be it customer IDs, account IDs and so on. Their big advantage is they don't have precision problems like numeric columns. You could create a character copy of the Bigint Snowflake key in Snowflake and use that when reading into SAS.

bitcruncher
Obsidian | Level 7
Fair comment, but when we struck this issue it was mainframe SAS (17 digits of precision, I think) hitting a packed decimal value that ran to 22 digits. Nothing to do with keys, so it's not always a key problem. I had to redefine the field as character in SAS; if I recall correctly, SAS could read the packed decimal but couldn't store it as numeric. As our SAS process endgame is always a delimited flat file, writing it out was fine, and the other end can define it however it likes; they had no trouble loading it into Snowflake. Anyway, the point is that it isn't necessarily a key that hits this issue.
SASKiwi
PROC Star

Agreed. However, I've yet to see a use case where numerics required more than 15 digits of precision in calculations.

Kurt_Bremser
Super User

Keys should always be stored as character. Your DB designers need to correct this ASAP.

And consider switching to UUIDs, which are stored internally as 16-byte values but are represented as 36-character strings when exported or displayed.
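For illustration, SAS can generate such keys itself with the UUIDGEN function, which returns the 36-character string form:

/* Generate a few UUID keys; the actual values differ on every run. */
data keys;
   length uuid $36;
   do i = 1 to 3;
      uuid = uuidgen();
      output;
   end;
   drop i;
run;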

