sampath
Calcite | Level 5

Hi,

I have been trying to use the following code:

proc glimmix data=pasi02(where=(efrslt ^= .));
   nloptions technique=nrridg;
   class pid leg_sort viswin;
   model eff(event='Y') = leg_sort viswin wt leg_sort*viswin leg_sort*wt
         / solution dist=binary link=logit covb;
   random viswin / type=un subject=pid residual;
   lsmeans leg_sort*viswin / diff slice=viswin ilink;

   /* treatment contrasts at each scheduled visit */
   estimate "CP-690,550 5 mg BID vs Pbo week 2"
            leg_sort 1 0 -1
            leg_sort*viswin 1 0 0 0 0 0 0 0 0 0 -1
            leg_sort*wt 1 0 -1;
   estimate "CP-690,550 10 mg BID vs Pbo week 2"
            leg_sort 0 1 -1
            leg_sort*viswin 0 0 0 0 0 1 0 0 0 0 -1
            leg_sort*wt 0 1 -1;
   estimate "CP-690,550 10 mg BID vs CP-690,550 5 mg BID week 2"
            leg_sort -1 1 0
            leg_sort*viswin -1 0 0 0 0 1
            leg_sort*wt -1 1 0;

   estimate "CP-690,550 5 mg BID vs Pbo week 4"
            leg_sort 1 0 -1
            leg_sort*viswin 0 1 0 0 0 0 0 0 0 0 0 -1
            leg_sort*wt 1 0 -1;
   estimate "CP-690,550 10 mg BID vs Pbo week 4"
            leg_sort 0 1 -1
            leg_sort*viswin 0 0 0 0 0 0 1 0 0 0 0 -1
            leg_sort*wt 0 1 -1;
   estimate "CP-690,550 10 mg BID vs CP-690,550 5 mg BID week 4"
            leg_sort -1 1 0
            leg_sort*viswin 0 -1 0 0 0 0 1
            leg_sort*wt -1 1 0;

   estimate "CP-690,550 5 mg BID vs Pbo week 8"
            leg_sort 1 0 -1
            leg_sort*viswin 0 0 1 0 0 0 0 0 0 0 0 0 -1
            leg_sort*wt 1 0 -1;
   estimate "CP-690,550 10 mg BID vs Pbo week 8"
            leg_sort 0 1 -1
            leg_sort*viswin 0 0 0 0 0 0 0 1 0 0 0 0 -1
            leg_sort*wt 0 1 -1;
   estimate "CP-690,550 10 mg BID vs CP-690,550 5 mg BID week 8"
            leg_sort -1 1 0
            leg_sort*viswin 0 0 -1 0 0 0 0 1
            leg_sort*wt -1 1 0;

   estimate "CP-690,550 5 mg BID vs Pbo week 12"
            leg_sort 1 0 -1
            leg_sort*viswin 0 0 0 1 0 0 0 0 0 0 0 0 0 -1
            leg_sort*wt 1 0 -1;
   estimate "CP-690,550 10 mg BID vs Pbo week 12"
            leg_sort 0 1 -1
            leg_sort*viswin 0 0 0 0 0 0 0 0 1 0 0 0 0 -1
            leg_sort*wt 0 1 -1;
   estimate "CP-690,550 10 mg BID vs CP-690,550 5 mg BID week 12"
            leg_sort -1 1 0
            leg_sort*viswin 0 0 0 -1 0 0 0 0 1
            leg_sort*wt -1 1 0;

   estimate "CP-690,550 5 mg BID vs Pbo week 16"
            leg_sort 1 0 -1
            leg_sort*viswin 0 0 0 0 1 0 0 0 0 0 0 0 0 0 -1
            leg_sort*wt 1 0 -1;
   estimate "CP-690,550 10 mg BID vs Pbo week 16"
            leg_sort 0 1 -1
            leg_sort*viswin 0 0 0 0 0 0 0 0 0 1 0 0 0 0 -1
            leg_sort*wt 0 1 -1;
   estimate "CP-690,550 10 mg BID vs CP-690,550 5 mg BID week 16"
            leg_sort -1 1 0
            leg_sort*viswin 0 0 0 0 -1 0 0 0 0 1
            leg_sort*wt -1 1 0;

   ods output Estimates=estimates;
   ods output LSMeans=LSmeans;
   ods output Tests3=tests;
run;

But I got the following error. Please note that the above code worked fine initially; I have been getting this error only after unblinding of the data. Could you please suggest a solution?

Error:

NOTE: The GLIMMIX procedure is modeling the probability that eff='Y'.
WARNING: Pseudo-likelihood update fails in outer iteration 17.
NOTE: Did not converge.
WARNING: Output 'Tests3' was not created. Make sure that the output object name, label, or path is spelled correctly. Also, verify that the appropriate procedure options are used to produce the requested output object. For example, verify that the NOPRINT option is not used.
WARNING: Output 'LSMeans' was not created. Make sure that the output object name, label, or path is spelled correctly. Also, verify that the appropriate procedure options are used to produce the requested output object. For example, verify that the NOPRINT option is not used.
WARNING: Output 'estimates' was not created. Make sure that the output object name, label, or path is spelled correctly. Also, verify that the appropriate procedure options are used to produce the requested output object. For example, verify that the NOPRINT option is not used.

lvm
Rhodochrosite | Level 12

With the default estimation method in GLIMMIX you are using pseudo-likelihood estimation, which is doubly iterative, and there are many reasons why the outer iterations can fail; for example, the pseudo-data generated from the most recent inner iteration may not be allowable. I see that you have binary data and random effects. Pseudo-likelihood is risky in that situation because the parameter estimates can be quite biased. I would normally recommend method=laplace in the PROC GLIMMIX statement. This singly iterative method gives you an actual likelihood for your data (at least as an approximation) and often works well to reduce bias. However, it cannot accommodate so-called R-side covariances, and you currently have one (the RESIDUAL option in the RANDOM statement). Do you really want that? With an R-side covariance for binary data you are getting a marginal rather than a conditional analysis, and the analysis is really based on quasi-likelihood.
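A minimal sketch of that kind of G-side Laplace fit, reusing the variable names from the posted code (the exact options are an assumption and may need adjusting for your data):

proc glimmix data=pasi02(where=(efrslt ^= .)) method=laplace;
   class pid leg_sort viswin;
   model eff(event='Y') = leg_sort viswin wt leg_sort*viswin leg_sort*wt
         / solution dist=binary link=logit;
   /* no RESIDUAL option: the visit effect is modeled on the G side */
   random viswin / type=un subject=pid;
run;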

If you really need the R-side analysis, then you should explore other optimization techniques in NLOPTIONS (technique=). Try technique=quanew. The user's guide lists several other choices.
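For example, switching optimizers is a one-line change to the NLOPTIONS statement (the MAXITER= value here is only an illustration):

   nloptions technique=quanew maxiter=500;   /* replaces technique=nrridg */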

There are many other things that can cause your problem. Try changing type=un to type=chol in the random statement. This will force the covariance matrix to be at least positive semi-definite (although still unstructured).  For more hints, see the following:

https://support.sas.com/resources/papers/proceedings12/332-2012.pdf
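Applied to the posted RANDOM statement, the TYPE=CHOL suggestion is also a one-option change (shown with the original R-side setup purely as an illustration):

   random viswin / type=chol subject=pid residual;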

vvk
Fluorite | Level 6

Thank you, lvm, for your response. Sampath (who posted the question) and I are working together on this issue. The reason we found for this error is that there are no subjects in the PLACEBO group at week 2. Since we are comparing TREATMENT vs PLACEBO, the code failed at outer iteration 17. Do you think this is a valid reason?
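One quick way to confirm that is a cross-tabulation of arm by visit on the analyzed records; a sketch, assuming viswin holds the visit and leg_sort the treatment arm:

proc freq data=pasi02;
   where efrslt ^= .;
   tables leg_sort*viswin / norow nocol nopercent;   /* zero-count cells flag empty arm-by-visit combinations */
run;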

I also tried the methods/options you suggested, but they didn't work for our data.

Thanks again.

Vinay

SteveDenham
Jade | Level 19

Vinay,

You have wandered into a quasi-separation situation because of the no-subject problem. That will produce exactly the failure you see: the inner-iteration values do not yield valid pseudo-data for the outer iteration. The solution is two-fold. First, do everything lvm listed, most importantly move to a G-side analysis by dropping the RESIDUAL option from the RANDOM statement and switching to method=laplace in the PROC GLIMMIX statement. Second, collapse the data: weekly cuts are apparently not going to give good values, so try biweekly or monthly cuts (a sketch follows).
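A minimal sketch of the second step, assuming viswin is a numeric week number (2, 4, 8, 12, 16); the exact windows should follow your analysis plan:

data pasi02_m;
   set pasi02;
   viswin4 = ceil(viswin / 4);   /* 4-week windows: weeks 2 and 4 -> 1, week 8 -> 2, week 12 -> 3, week 16 -> 4 */
run;

The new viswin4 would then replace viswin in the CLASS, MODEL, RANDOM, LSMEANS, and ESTIMATE statements, and the ESTIMATE coefficients would need to be rewritten for the smaller number of visit levels.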

Steve Denham


vvk
Fluorite | Level 6

Hi Steve,

I tried solution 1: applied method=laplace and dropped the RESIDUAL option in the GLIMMIX code. It is still not working.

ERROR: NRRIDG Optimization cannot be completed

I also tried technique=quanew but got the same error.

ERROR: QUANEW Optimization cannot be completed


I really appreciate your help.

Thanks a lot!

Vinay

SteveDenham
Jade | Level 19

Have you tried collapsing weeks?  Maybe just combining weeks 2 and 4?
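Because CLASS levels in GLIMMIX are built from formatted values, weeks 2 and 4 could also be combined without rebuilding the data set; a sketch, assuming viswin is numeric and coded in weeks (any ESTIMATE statements would need new coefficients for the collapsed levels):

proc format;
   value viswk 2, 4 = 'Weeks 2-4'
               8    = 'Week 8'
               12   = 'Week 12'
               16   = 'Week 16';
run;

proc glimmix data=pasi02(where=(efrslt ^= .)) method=laplace;
   format viswin viswk.;   /* CLASS levels come from the formatted values, so weeks 2 and 4 share one level */
   class pid leg_sort viswin;
   model eff(event='Y') = leg_sort viswin wt leg_sort*viswin leg_sort*wt
         / solution dist=binary link=logit;
   random viswin / type=un subject=pid;
run;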

Steve Denham

vvk
Fluorite | Level 6

Yes, it worked when I dropped the week 2 data from the input data set. It should work for combined weeks too.
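For reference, that subset is just a one-condition data step (assuming viswin = 2 identifies the week 2 visit):

data pasi02_nowk2;
   set pasi02;
   if viswin ne 2;   /* drop the week 2 records */
run;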

For this we need to update the table mock-up, which I need to confirm with our statisticians.

Thanks a lot,

Vinay
