10-26-2017 12:29 PM - edited 10-26-2017 12:33 PM
I have a weird problem. I am operating on numbers which add up to around 800 000 000. I have a dataset which I need to split into mini-datasets of 1 000 000 records each. The problem is that when I split my dataset, I lose decimal precision, for example:
Before splitting, the sum of my costs was 810268311.8536920
After splitting - 810268311.8401380
The split datasets together contain the same total number of records as the initial table.
What can I do about that?
Thank you in advance!
10-26-2017 12:36 PM
Your numbers never had that accuracy to begin with. The 8-byte floating-point format SAS uses for numerics can only hold up to fifteen significant digits accurately. Your totals have sixteen digits, so precision is lost from the sixteenth digit on.
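Since Python floats are the same IEEE 754 64-bit doubles SAS uses, the effect is easy to reproduce outside SAS. A minimal sketch (the values are randomly generated stand-ins, not the poster's data):

```python
import math
import random
from decimal import Decimal

# 810268311.8536920 has 16 significant digits; a 64-bit double cannot store
# it exactly. Decimal(x) exposes the value that is really stored.
x = 810268311.8536920
print(Decimal(x) == Decimal("810268311.8536920"))   # False

# Summing the same values in different groupings also changes the low-order
# digits, because every intermediate sum is rounded back to 53 bits.
random.seed(1)
values = [random.uniform(0.01, 2000.0) for _ in range(1_000_000)]

total_direct = sum(values)                  # one running sum over everything
chunk = 100_000                             # ten "mini-datasets"
total_split = sum(sum(values[i:i + chunk])
                  for i in range(0, len(values), chunk))

print(total_direct, total_split)            # agree to ~15 digits, rarely beyond
print(math.fsum(values))                    # correctly rounded reference sum
```

`math.fsum` tracks the intermediate rounding errors and returns the correctly rounded total, which is why it is a useful reference point here.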
10-26-2017 12:43 PM - edited 10-26-2017 12:51 PM
Hmm, so what can I do about that? Can I at least keep the number at 0.01 decimal precision (810268311.85), so that I get that exact number after splitting? I do not need anything more precise. I've tried adding round(cost, 0.01) to the data step which splits the data, but that didn't help.
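Rounding each record can't repair this, because most multiples of 0.01 are themselves not exactly representable as binary doubles, so the "rounded" values still carry tiny errors that accumulate. A quick illustration in Python (same IEEE doubles as SAS, not SAS code):

```python
from decimal import Decimal

# 2.675 cannot be stored exactly; the stored double is slightly below 2.675,
# so rounding to two decimals goes down, not up.
print(round(2.675, 2))            # 2.67, not 2.68

# Even the rounded result is only the nearest representable double to 2.67:
print(Decimal(round(2.675, 2)))   # the exact binary value actually stored

# Residual errors like these, over millions of records, move the total.
```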
10-26-2017 01:01 PM
There is an exhaustive treatise on this topic here:
The behavior of SAS is the same as that of any software relying on floating-point math as defined by the IEEE standard.
There is a newer DECIMALCONV= option that might help. From the doc:
specifies to convert and format decimal values by using the IEEE Standard for Floating-Point Arithmetic 754-2008. Using the STDIEEE argument improves the accuracy and readability of floating-point numbers. In some cases, more significant digits can be displayed in the same field width.
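If I read that doc correctly, STDIEEE selects the IEEE 754-2008 "shortest round-trip" style of decimal conversion. Python's repr() already works that way, so the behavior can be previewed there (illustration only; the option name comes from the SAS doc, the code below is not SAS):

```python
# repr() prints the shortest decimal string that converts back to exactly
# the same double, per IEEE 754-2008.
x = 0.1
print(repr(x))                # '0.1' — shortest string mapping to this double
assert float(repr(x)) == x    # round-trips exactly

# 17 significant decimal digits are always enough to round-trip any double.
y = 810268311.8536921
assert float(f"{y:.17g}") == y
```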
10-26-2017 01:33 PM
To the extent you can work with integers instead of decimal fractions, the results will improve. If it would be practical, try:
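A sketch of that idea in Python (the variable names and values are made up; in SAS the equivalent would be multiplying each cost by 100, totaling the integer cents, and dividing by 100 only at the end):

```python
# Convert each cost to integer cents before splitting; integer sums are
# exact, so chunked totals must match the direct total. Note doubles hold
# integers exactly up to 2**53, so ~81 billion cents is nowhere near a
# problem even in SAS's 8-byte numerics.
costs = [19.99, 0.01, 123456.78] * 100_000     # made-up detail rows

cents = [round(c * 100) for c in costs]        # exact integers
total_cents = sum(cents)                       # integer arithmetic: exact

# Split into chunks, total each chunk, re-add: identical result, guaranteed.
chunk = 50_000
split_cents = sum(sum(cents[i:i + chunk])
                  for i in range(0, len(cents), chunk))
assert split_cents == total_cents              # always holds for integers

print(total_cents / 100)                       # back to currency for display
```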
Those totals should be closer.