proc means help

10-05-2016 03:29 PM

I am excited to see if someone can really understand and crack my question.

proc means data=ghy1;
  var wew;
  output sum(wew)=wev;
run;

The above delivers the proper sum for up to 20 million records, but it fails for 50 million records.

I am seeing a slight deviation from the sum calculated in a data step (where the decimals were rounded using the ROUND function).

Is there any option to tell PROC MEANS how to round the decimals in the sum?

Really happy if someone cracks this...


Posted in reply to CoolBoy

10-05-2016 03:39 PM

PROC MEANS supports an option, MAXDEC=, to control the maximum number of decimals displayed.

You may want something like

proc means data=ghy1 maxdec=2;
  var wew;
  output sum(wew)=wev;
run;

BUT if your data step rounded each value before accumulating, then you introduced a small difference, likely one per record, and those differences add up.

We would have to see your data step code to provide a better reason for differences.
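The per-record effect described above can be illustrated outside SAS. This is a deliberately extreme synthetic sketch in Python (both SAS and Python store numbers as IEEE 754 doubles), not the poster's actual data: every record carries 0.004, which rounds to 0.00 at two decimals, so rounding per record before summing loses everything, while rounding the final sum once keeps it.

```python
# Synthetic data: a million records, each worth 0.004.
values = [0.004] * 1_000_000

# Round once, at the end -- what summing raw values and then formatting does.
sum_then_round = round(sum(values), 2)

# Round every record first, then sum -- what a data step with ROUND() per row does.
round_then_sum = round(sum(round(v, 2) for v in values), 2)

print(sum_then_round)  # 4000.0
print(round_then_sum)  # 0.0
```

Real data won't be this lopsided, but the point stands: rounding before accumulating discards up to half a unit of the last kept decimal on every record, and over 50 million records that bias can easily show up in the total.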


Posted in reply to ballardw

10-06-2016 02:00 AM

**Sum by Proc means**

proc means data=ghy1;
  var wew;
  output sum(wew)=wev;
run;

**Sum by data step**

data GHY1;
  infile xxx;
  input yyyy;
  wev = round(wev, .0001) + wew;
run;

The MAXDEC= option doesn't have any effect on the SUM function (or any statistical calculation); it only affects the number of decimals printed.


Posted in reply to CoolBoy

10-06-2016 02:18 AM

You're processing the data before you do the sum. The only way I can think of to match this is with a custom format that uses a function to round the data, or maybe the ROUND option in PROC FORMAT.

It's not a reasonable expectation for the results from PROC MEANS and the data step to match across 50 million observations in this situation.


Posted in reply to CoolBoy

10-05-2016 03:41 PM

CoolBoy wrote:

I am feeling excited to know, if someone can really understand and crack my question.

We can't understand your question if you don't provide enough details.

For example, what does 'failure' mean? Did it error out or were the results not what you expected?

Are you rounding before summing the variables in the data step?

I don't think there's going to be a way to mimic that in Proc Means.


Posted in reply to CoolBoy

10-05-2016 03:58 PM

This is probably related to precision of storing large numbers.

SAS can accurately store integers up to around 1e15 or 1e16. Decimal fractions can be problematic, even when the numbers are smaller.

Here, you are saying that 20M records produces the "right" result, but 50M records the "wrong" result. Since you are calculating a sum, this leads me to believe that the sum of 50M values increases the sum to beyond what SAS can accurately store. Internally, it is entirely possible that SAS uses different methods in PROC MEANS vs. a DATA step. The addition process would be the same, but the order in which the numbers are added might be different.

By the way, how do you know which one is correct (DATA step vs. PROC MEANS)? Or could they both be incorrect? You can check this by getting the sum of batches: first 20M records, next 20M records, final 10M records, and comparing their total to the 50M sum.
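The precision effect described in this reply can be sketched in Python, which uses the same IEEE 754 doubles as SAS; the record count here is made up for illustration. A naive running total (like a data step accumulator) drifts as the sum grows large relative to each addend, while `math.fsum` computes the correctly rounded sum and plays the role of the trusted reference:

```python
import math

# Ten million records of 0.1 (0.1 is not exactly representable in binary).
n = 10_000_000

# Naive running total, like a data step accumulator: each addition
# rounds to the nearest double, and those rounding errors accumulate.
naive = 0.0
for _ in range(n):
    naive += 0.1

# math.fsum tracks the lost low-order bits and returns the
# correctly rounded sum of the same ten million doubles.
accurate = math.fsum(0.1 for _ in range(n))

print(naive)                   # noticeably off from 1000000.0
print(accurate)                # 1000000.0
print(abs(naive - accurate))   # the accumulated drift
```

When no exactly rounded reference is available, the batch check suggested above is the same idea: sum the data in chunks, then compare the total of the chunk sums against the single-pass sum; a discrepancy reveals accumulated rounding error rather than telling you which answer is "right".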