<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Ordinal Logistic Regression in Statistical Procedures</title>
    <link>https://communities.sas.com/t5/Statistical-Procedures/Ordinal-Logistic-Regression/m-p/671155#M32080</link>
    <description>&lt;P&gt;Welcome to the world of correlated predictors. Even though you said the two predictors were not highly correlated, I'm sure the correlation wasn't zero either, and adding the interaction term later brings a different (and usually greater) degree of multicollinearity to the party, because a product term is typically highly correlated with the variables it is built from. What you expected is reasonable: the two predictors are significant, so when you add the interaction you expect them to remain significant; but that's not what happens. The multicollinearity has an impact here, and sometimes it is counter-intuitive. This is a deficiency of most regression methods: adding terms to the model can cause previously significant terms to become insignificant, or vice versa. Isn't that awesome? &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Well, what's the solution? Stepwise selection is not it; it suffers from the same deficiencies. I like to use partial least squares regression (PROC PLS), which is much less susceptible to these problems, but PROC PLS only works for a continuous Y. There is a &lt;A href="https://cedric.cnam.fr/fichiers/RC906.pdf" target="_self"&gt;logistic version&lt;/A&gt; of the algorithm out there on the internet, and an R package, but if you are using SAS, there is no such implementation that I know of (except for the one I wrote, and I doubt my employer would want me to share it).&lt;/P&gt;</description>
    <pubDate>Tue, 21 Jul 2020 18:07:57 GMT</pubDate>
    <dc:creator>PaigeMiller</dc:creator>
    <dc:date>2020-07-21T18:07:57Z</dc:date>
    <item>
      <title>Ordinal Logistic Regression</title>
      <link>https://communities.sas.com/t5/Statistical-Procedures/Ordinal-Logistic-Regression/m-p/671134#M32079</link>
      <description>&lt;P&gt;Hi: I performed an ordinal logistic regression (the dependent variable has values of 0, 1, and 2) with two predictors I was interested in (one a weight variable, the other an age variable). Before running the regression I checked that the two predictors were not highly correlated (they weren't). When I ran the model, I found that it was statistically significant, as were both predictors. I was thrilled.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Then I thought maybe I needed to check for an interaction effect. When I added the interaction, neither the main effects nor the interaction was significant. Finally, I checked whether the interaction alone (without the individual predictors in the model) was significant -- and it was. I am just wondering how someone with better statistical knowledge than mine would make sense of these results. Thank you! Cheers, DJGS&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2020 17:27:07 GMT</pubDate>
      <guid>https://communities.sas.com/t5/Statistical-Procedures/Ordinal-Logistic-Regression/m-p/671134#M32079</guid>
      <dc:creator>DJGS</dc:creator>
      <dc:date>2020-07-21T17:27:07Z</dc:date>
    </item>
    <item>
      <title>Re: Ordinal Logistic Regression</title>
      <link>https://communities.sas.com/t5/Statistical-Procedures/Ordinal-Logistic-Regression/m-p/671155#M32080</link>
      <description>&lt;P&gt;Welcome to the world of correlated predictors. Even though you said the two predictors were not highly correlated, I'm sure the correlation wasn't zero either, and adding the interaction term later brings a different (and usually greater) degree of multicollinearity to the party, because a product term is typically highly correlated with the variables it is built from. What you expected is reasonable: the two predictors are significant, so when you add the interaction you expect them to remain significant; but that's not what happens. The multicollinearity has an impact here, and sometimes it is counter-intuitive. This is a deficiency of most regression methods: adding terms to the model can cause previously significant terms to become insignificant, or vice versa. Isn't that awesome? &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;
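The collinearity between main effects and their product term is easy to see numerically. A minimal sketch (in Python with NumPy rather than SAS, and with simulated weight and age variables whose numbers are made up purely for illustration): centering each predictor before forming the product typically removes most of the product's correlation with the main effects, which is why centering is a common remedy when an interaction term is added.

```python
# Why adding an interaction can wipe out "significant" main effects:
# a raw product term x1*x2 is usually highly correlated with x1 and x2
# themselves.  Centering the predictors first largely removes that overlap.
# (Simulated weight/age data; variable names and numbers are illustrative.)
import numpy as np

rng = np.random.default_rng(0)
weight = rng.normal(70, 10, 500)   # e.g. kilograms
age = rng.normal(40, 12, 500)      # e.g. years

def corr(a, b):
    # Pearson correlation between two vectors
    return np.corrcoef(a, b)[0, 1]

# Raw interaction term: strongly correlated with the main effect
corr_raw = corr(age, weight * age)

# Centered interaction term: overlap with the main effect mostly gone
wc = weight - weight.mean()
ac = age - age.mean()
corr_centered = corr(ac, wc * ac)

print(round(abs(corr_raw), 2), round(abs(corr_centered), 2))
```

With uncentered predictors, the product weight*age is nearly a rescaled copy of each main effect, so the main-effect tests and the interaction test are fighting over the same variance; after centering, most of that overlap disappears.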
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Well, what's the solution? Stepwise selection is not it; it suffers from the same deficiencies. I like to use partial least squares regression (PROC PLS), which is much less susceptible to these problems, but PROC PLS only works for a continuous Y. There is a &lt;A href="https://cedric.cnam.fr/fichiers/RC906.pdf" target="_self"&gt;logistic version&lt;/A&gt; of the algorithm out there on the internet, and an R package, but if you are using SAS, there is no such implementation that I know of (except for the one I wrote, and I doubt my employer would want me to share it).&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2020 18:07:57 GMT</pubDate>
      <guid>https://communities.sas.com/t5/Statistical-Procedures/Ordinal-Logistic-Regression/m-p/671155#M32080</guid>
      <dc:creator>PaigeMiller</dc:creator>
      <dc:date>2020-07-21T18:07:57Z</dc:date>
    </item>
    <item>
      <title>Re: Ordinal Logistic Regression</title>
      <link>https://communities.sas.com/t5/Statistical-Procedures/Ordinal-Logistic-Regression/m-p/671165#M32081</link>
      <description>&lt;P&gt;Thanks-- that is very helpful. Cheers, DJGS&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2020 18:18:13 GMT</pubDate>
      <guid>https://communities.sas.com/t5/Statistical-Procedures/Ordinal-Logistic-Regression/m-p/671165#M32081</guid>
      <dc:creator>DJGS</dc:creator>
      <dc:date>2020-07-21T18:18:13Z</dc:date>
    </item>
  </channel>
</rss>