Unlike ordinary least-squares regression, which is a direct method, many regression procedures have to solve nonlinear optimization problems in order to find the parameters in the model that best fit the data. The procedure starts with an initial estimate of the parameters and then iteratively refines that estimate until "convergence," which means that the parameters are optimal and further iteration will not improve the parameter estimates.
A failure to converge means the parameters the algorithm returned are not necessarily the optimal ones; the algorithm simply could not make further progress. It can also indicate that the model has too many terms, and removing some of them may allow the algorithm to converge on a solution.
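To make the iterate-until-convergence idea concrete, here is a minimal sketch (plain Python, not SAS code) of a nonlinear fit by gradient descent for a hypothetical model y = a·exp(b·x). The convergence test and the iteration cap illustrate the two outcomes discussed above: either the parameter updates become negligibly small (convergence) or the cap is hit first (failure to converge).

```python
import math

def fit_exponential(xs, ys, lr=0.01, tol=1e-8, max_iter=100_000):
    """Fit y = a * exp(b * x) by gradient descent on squared error.

    Returns (a, b, converged); converged is True only if the parameter
    updates fell below tol before max_iter was reached.
    """
    a, b = 1.0, 0.0  # initial estimate of the parameters
    for _ in range(max_iter):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            r = a * math.exp(b * x) - y            # residual
            ga += 2 * r * math.exp(b * x)          # d(error)/da
            gb += 2 * r * a * x * math.exp(b * x)  # d(error)/db
        new_a, new_b = a - lr * ga, b - lr * gb
        if abs(new_a - a) < tol and abs(new_b - b) < tol:
            return new_a, new_b, True   # converged: updates are negligible
        a, b = new_a, new_b
    return a, b, False  # iteration cap hit: a "failure to converge"
```

If the returned flag is False, the estimates may still be usable but are not guaranteed to be optimal, which is exactly the situation the convergence warning describes.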