You need to add an NLOPTIONS statement to your code. The log likelihood is converging, but it is running into the default maximum number of iterations (=19).
Try adding:
nloptions maxiter=500;
to your GLIMMIX code. Alternatively, you might wish to set ABSGCONV=1e-6 (or something like that), as it looks like the maximum gradient in the iteration history is cycling around at values less than this.
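For what it's worth, here is a minimal sketch of where the statement would go; the data set and variable names (mydata, y, trt, block) and the other statements are placeholders, not taken from your code:
proc glimmix data=mydata method=rspl;
   class block trt;
   model y = trt / dist=gamma link=log;
   random block;
   nloptions maxiter=500 absgconv=1e-6; /* raise the iteration cap and set the gradient criterion */
run;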
Also, with a non-normal distribution, you may want to change from method=rspl to method=quad (or method=laplace if there are issues with the number of points for adaptive quadrature). However, if you do change to method=quad, you will need to change the RANDOM statement to:
random intercept / subject=block;
as the quadrature method requires that the data be processed by subject. For the rest, I think you are right about the gamma.
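Put together, a hedged sketch of what the quadrature version might look like (again, the names are placeholders for your own):
proc glimmix data=mydata method=quad;
   class block trt;
   model y = trt / dist=gamma link=log;
   random intercept / subject=block;  /* quadrature processes the data by subject */
   nloptions maxiter=500;
run;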
Regarding plot size, does that put an upper bound on the dependent variable? If the areas being analyzed are substantially less than that upper bound, there shouldn't be an issue. But if you fit a gamma and a substantial portion of the dependent values are greater than one-half of the maximum plot size, you could end up with oddities such as an lsmean that is larger than any of the plots.
In that case, consider fitting a four-parameter logistic model, with a random effect, using PROC NLMIXED. There are examples out on the interwebs for that approach. Here is something really simple:
proc nlmixed data=mydata;
   parms a=0 b=1 c=1 d=0 s2=1 s2u=1; /* replace b=1 with b=<max plot size> */
   /* four-parameter logistic mean plus a random block intercept u */
   pred = a + (b - a)/(1 + exp(c*(x - d))) + u;
   model response ~ normal(pred, s2);
   random u ~ normal(0, s2u) subject=block;
run;
Just something to consider. If treatments are applied, as in your case, the code gets a whole boatload more complicated, but there are examples out there on how to incorporate those.
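If it helps, one way to fold a treatment factor into the NLMIXED model is to let one of the logistic parameters vary by treatment, say the upper asymptote; the dummy variables trt1 and trt2 below are illustrative and would have to be created in your data first:
proc nlmixed data=mydata;
   parms a=0 b0=1 b1=0 b2=0 c=1 d=0 s2=1 s2u=1;
   /* upper asymptote differs by treatment via dummy-coded trt1 and trt2 */
   b = b0 + b1*trt1 + b2*trt2;
   pred = a + (b - a)/(1 + exp(c*(x - d))) + u;
   model response ~ normal(pred, s2);
   random u ~ normal(0, s2u) subject=block;
run;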
SteveDenham
PS - the arcsine-transformed data is an approximation of the logistic model, so that may be why it looks appropriate.