02-08-2013 01:48 AM
I'm hoping someone can shed some light on this.
I don't understand why the confidence interval for the least squares means that I calculated manually as
(logconc LSMEAN) plus/minus (1.96 * standard error) differs from the one SAS produces.
My code is below. Your help is greatly appreciated!
proc glm data=out order=internal;
  model logconc = Sample / solution;
  lsmeans Sample / cl pdiff tdiff stderr e om;
run;
quit;
02-08-2013 08:06 AM
There's no way we can answer your question as posted:
we can't see your data, your calculations, or the SAS output.
02-08-2013 09:42 AM
Paige is right. One obvious thing: "1.96" is the critical t value only when the error (denominator) degrees of freedom are large (say, df > 30), where the t distribution approaches the standard normal. For small df the critical t value is larger than 1.96, so SAS's interval will be wider than your hand calculation. SAS uses the t quantile for the model's error df (TINV(0.975, df)), not 1.96.
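To see how much this matters, here is a rough stdlib-only Python sketch (the LSMEAN, standard error, and df values are made up for illustration; in SAS the same critical value comes from TINV(0.975, df)). It computes the Student-t critical value numerically and compares the 1.96-based interval to the t-based one:

```python
import math

def t_pdf(x, df):
    # Student-t density; lgamma avoids overflow for large df
    c = math.exp(math.lgamma((df + 1) / 2) - math.lgamma(df / 2))
    c /= math.sqrt(df * math.pi)
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

def t_cdf(x, df, n=2000):
    # P(T <= x) via Simpson's rule on [0, x], using symmetry: CDF(0) = 0.5
    if x < 0:
        return 1 - t_cdf(-x, df, n)
    h = x / n
    s = t_pdf(0, df) + t_pdf(x, df)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * t_pdf(i * h, df)
    return 0.5 + s * h / 3

def t_crit(p, df):
    # p-quantile of the t distribution by bisection on the CDF
    lo, hi = 0.0, 100.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if t_cdf(mid, df) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical LSMEAN, standard error, and error df
lsmean, se, df = 1.234, 0.10, 10
t = t_crit(0.975, df)           # about 2.228 for df = 10
naive = (lsmean - 1.96 * se, lsmean + 1.96 * se)
exact = (lsmean - t * se, lsmean + t * se)
print(t, naive, exact)          # the t-based interval is noticeably wider
```

With 10 error df the half-width is about 0.223 instead of 0.196, which is exactly the kind of discrepancy described above; as df grows, t_crit(0.975, df) approaches 1.96 and the two intervals converge.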