Your favorite internet search engine should find a number of articles on Reduced Rank Regression.
With regard to this:
Hoffman compared RRR vs PLS and concluded that RRR is more robust compared to PLS
If your only criterion for success is the fit of the response variables, then I agree. Sometimes there are other criteria, and if your criterion for success is model stability and robustness to multicollinearity among the X-variables, or modeling and understanding the variation in the X-variables, then I disagree.
@PaigeMiller wrote:
With regard to this:
Hoffman compared RRR vs PLS and concluded that RRR is more robust compared to PLS
If your only criterion for success is the fit of the response variables, then I agree. Sometimes there are other criteria, and if your criterion for success is model stability and robustness to multicollinearity among the X-variables, or modeling and understanding the variation in the X-variables, then I disagree.
Yesterday, I said the above. However, I'd like to modify what I said: Hoffman never uses the word "robust" and makes no claims about RRR being more "robust". What he did show is that, for the data he was using, RRR predicted more of the response variation than the other methods did. But again I add: sometimes there are criteria other than predicting the highest amount of response variation, and if your criterion for success is model stability and robustness to multicollinearity among the X-variables, or modeling and understanding the variation in the X-variables, then methods other than RRR will be useful.
You also said:
it was explained that RRR in SAS uses OLS and is the most stable compared to the other 3 methods
Hoffman never uses the word "stable" either. I would claim other methods are more "stable" than RRR, but I don't have research to prove that.
I was able to locate a detailed explanation for PLS, but I did not succeed in finding the documentation for RRR, although it was explained that RRR in SAS uses OLS and is the most stable of the four methods.
On page 7607, under Cross Validation: "None of the regression methods implemented in the PLS procedure fit the observed data any better than ordinary least squares (OLS) regression." In the description on page 7604: "In reduced rank regression, the Y-weights qᵢ are the eigenvectors of the covariance matrix Ŷ′Ŷ of the responses predicted by ordinary least squares regression; the X-scores are the projections of the Y-scores Ŷqᵢ onto the X space."
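The quoted description can be sketched in a few lines of NumPy. This is a hedged illustration of the textbook reduced rank regression construction (eigenvectors of the covariance of the OLS-predicted responses), not SAS's actual implementation; it assumes X and Y are centered and X has full column rank:

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Sketch of RRR per the PROC PLS description: the Y-weights q_i are
    eigenvectors of the covariance Yhat'Yhat of the OLS-predicted responses."""
    # Step 1: ordinary least squares fit, Yhat = X @ B_ols
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    Yhat = X @ B_ols
    # Step 2: eigenvectors of Yhat' Yhat (symmetric), sorted by eigenvalue
    evals, evecs = np.linalg.eigh(Yhat.T @ Yhat)
    order = np.argsort(evals)[::-1]
    Q = evecs[:, order[:rank]]          # Y-weights q_i, one column per factor
    # Step 3: project the OLS coefficients onto the span of the top Y-weights;
    # the Y-scores are Yhat @ Q, and predictions X @ B are their reconstruction
    return B_ols @ Q @ Q.T
```

With `rank` equal to the number of responses, Q is a full orthogonal basis and the result reduces exactly to the OLS coefficients, which matches the documentation's point that none of these methods fit the observed data better than OLS.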
If you have a link to detailed documentation for RRR, kindly share it.
@LindaP wrote:
On page 7607, under Cross Validation: "None of the regression methods implemented in the PLS procedure fit the observed data any better than ordinary least squares (OLS) regression."
Yes, I would expect OLS to fit better. However, dimension reduction techniques provide value, even if the fit is not as good. One value is that PLS is robust against multicollinearity in the X variables, while OLS can be severely affected by multicollinearity in the X variables. There are other benefits to dimension reduction techniques as well.
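To make the multicollinearity point concrete, here is a small toy example of my own (not from Hoffman or the SAS documentation): with two nearly collinear predictors, the OLS coefficients blow up to offset each other, while a single-component PLS fit (using the NIPALS weight vector on centered data) stays small and stable:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
z = rng.normal(size=n)
# two nearly collinear predictors: x2 differs from x1 only by tiny noise
X = np.column_stack([z, z + 1e-4 * rng.normal(size=n)])
y = z + 0.1 * rng.normal(size=n)
# center, as both OLS-without-intercept and PLS assume here
X = X - X.mean(axis=0)
y = y - y.mean()

# OLS: X'X is nearly singular, so the coefficients become huge and unstable
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# one-component PLS: weight vector proportional to X'y, regress y on the score
w = X.T @ y
w = w / np.linalg.norm(w)
t = X @ w                      # X-score for the single latent factor
b_pls = w * (t @ y) / (t @ t)  # coefficients stay near (0.5, 0.5)
```

Both models predict y about equally well, but the OLS coefficient vector is orders of magnitude larger, which is exactly the instability under multicollinearity being discussed.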
If you have a link to detailed documentation for RRR, kindly share it.
An internet search finds many such documents.
Thank you Paige. Could you kindly share a specific reference that would be helpful? I posted this question here because I have exhausted the internet sources within my reach and could not find a definitive answer.
I don't have a reference. I'm sure there are plenty of documents that explain RRR out there. If you can't find a definitive reference, please be specific about what the documents on the internet are not providing.