I'm currently working with mixed models and I need to apply a variance-stabilizing transformation to my data. I tried a simple log transformation, but the result wasn't convincing. I was told that the log transform belongs to a family called "Box-Cox transformations" (or simply power transformations). Applying such a transformation requires specifying a value for a parameter called "lambda". This parameter can be chosen so as to maximize the likelihood that the transformed data are as close to normally distributed as possible (with the parameters of that normal distribution derived from the chosen model).
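To make the question concrete, here is a minimal sketch (in Python, since I can illustrate it there; the data are made up) of the lambda-maximization idea using `scipy.stats.boxcox`, which profiles the Box-Cox log-likelihood over lambda and returns the maximizer. Note that this assumes i.i.d. normal data after transformation; for a mixed model I imagine one would instead profile the (RE)ML likelihood of the mixed model over lambda, which is exactly the part I don't know how to do in SAS.

```python
import numpy as np
from scipy import stats

# Made-up right-skewed response (any strictly positive data works here)
rng = np.random.default_rng(42)
y = rng.lognormal(mean=1.0, sigma=0.6, size=200)

# With lmbda=None, scipy maximizes the Box-Cox log-likelihood over lambda
# and returns both the transformed data and the optimal lambda.
y_transformed, lam = stats.boxcox(y)
print(f"optimal lambda ~ {lam:.3f}")  # expected near 0 for lognormal data, i.e. a log transform
```

This finds the "marginal" optimal lambda; what I'm after is the analogous search where the likelihood being maximized comes from the fitted mixed model.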
Does SAS offer a quick and efficient way to find the optimal value of this "lambda" parameter when the model in question is a mixed model?