Unlike ordinary least-squares regression, which is a direct method, many regression procedures have to solve nonlinear optimization problems to find the parameters that best fit the data. The procedure starts with an initial estimate of the parameters and then iteratively refines that estimate until "convergence," meaning the estimates are (at least locally) optimal and further iteration will not improve them.
It means the solution found is possibly not the optimal solution, but the algorithm could not make any further progress. It could also mean you have too many terms in your model, and removing some may allow the algorithm to find the solution.
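To make the loop structure concrete, here is a minimal sketch (Python/NumPy, not any particular procedure's actual implementation) of iterative refinement for a nonlinear least-squares fit. The model y = a*exp(b*x), the Gauss-Newton update, and the tolerance and iteration limit are all illustrative assumptions; the point is the stopping logic: the loop either reports convergence when the step becomes negligible, or runs out of iterations and returns estimates that may not be optimal.

```python
# Illustrative sketch only: fit y = a * exp(b * x) by Gauss-Newton iteration.
# Start from an initial guess, refine until the estimates stop changing
# ("convergence"), or give up after max_iter, in which case the reported
# estimates may not be the optimal ones.
import numpy as np

def gauss_newton_exp_fit(x, y, a0=1.0, b0=0.5, tol=1e-10, max_iter=50):
    a, b = a0, b0                          # initial parameter estimates
    for _ in range(max_iter):
        pred = a * np.exp(b * x)
        resid = y - pred
        # Jacobian of the model with respect to (a, b)
        J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])
        # Solve the linearized least-squares problem for the update step
        step, *_ = np.linalg.lstsq(J, resid, rcond=None)
        a, b = a + step[0], b + step[1]    # refine the estimate
        if np.max(np.abs(step)) < tol:     # estimates have stopped changing
            return (a, b), True
    return (a, b), False                   # max_iter reached: may not be optimal

# Example: noisy data generated from a = 2, b = 0.8
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.8 * x) + rng.normal(scale=0.05, size=x.size)
(a_hat, b_hat), converged = gauss_newton_exp_fit(x, y)
print(a_hat, b_hat, converged)   # estimates near (2, 0.8) when converged is True
```

Poor starting values or an over-parameterized model are typical reasons the "ran out of iterations" branch is reached, which is why dropping redundant terms, as suggested above, often helps.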
-- Paige Miller