Hi everyone.
I'm wondering if anyone could shed some light on this.
I don't understand why the confidence interval for the least squares means that I calculated manually as
(logconc LSMEAN) plus/minus (1.96 * Standard Error) differs from the one produced by SAS.
My code is below. Your help is greatly appreciated!
proc glm data=out order=internal;
   by Intervention;
   class Sample;
   model logconc = Sample / solution;
   lsmeans Sample / cl pdiff tdiff stderr e om;
run;
quit;
There's no way we can answer your question.
We can't see your data, we can't see the calculations you did, and we can't see the SAS output.
Paige is right. One obvious thing: 1.96 is the critical t value only when the denominator degrees of freedom are large (say, df > 30). For small df the critical t value is larger than 1.96, so the interval SAS reports will be wider than your manual one.
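To make the difference concrete, here is a small Python sketch (using scipy purely as an illustration; the LSMEAN, standard error, and df values are made up, not taken from the original post) comparing an interval built with 1.96 against one built with the t critical value that SAS's LSMEANS / CL uses:

```python
from scipy import stats

lsmean = 2.35   # hypothetical logconc LSMEAN
se = 0.12       # hypothetical standard error of the LSMEAN
df = 10         # hypothetical error (denominator) degrees of freedom

# Manual interval using the large-sample value 1.96
z_lo, z_hi = lsmean - 1.96 * se, lsmean + 1.96 * se

# Interval using the t critical value with the model's error df
t_crit = stats.t.ppf(0.975, df)   # about 2.228 when df = 10
t_lo, t_hi = lsmean - t_crit * se, lsmean + t_crit * se

print(f"z-based interval: ({z_lo:.3f}, {z_hi:.3f})")
print(f"t-based interval: ({t_lo:.3f}, {t_hi:.3f})")
```

With only 10 degrees of freedom the t-based interval is noticeably wider; as df grows past 30 or so, the two intervals become nearly identical.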
Thanks very much, Paige and Ivm, for your replies.
Yes, I got it wrong by multiplying by 1.96 :smileysilly: