Unlike ordinary least-squares regression, which has a direct (closed-form) solution, many regression procedures must solve a nonlinear optimization problem to find the model parameters that best fit the data. Such a procedure starts from an initial estimate of the parameters and iteratively refines it until "convergence," the point at which further iterations no longer appreciably improve the parameter estimates and the fit is (at least locally) optimal.
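To make the iteration concrete, the sketch below fits a hypothetical nonlinear model (an exponential decay, chosen purely for illustration) with SciPy's curve_fit. The optimizer starts from the initial guess p0 and iteratively refines it until its convergence criteria are satisfied; the model form, synthetic data, and parameter values are assumptions for the example, not taken from the text.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical nonlinear model: y = a * exp(-b * x) + c
def model(x, a, b, c):
    return a * np.exp(-b * x) + c

# Synthetic data for illustration only
rng = np.random.default_rng(0)
x = np.linspace(0, 4, 50)
y = model(x, 2.5, 1.3, 0.5) + 0.05 * rng.normal(size=x.size)

# Initial estimate of the parameters; the solver refines it iteratively
# until the change in the fit falls below its convergence tolerances.
p0 = [1.0, 1.0, 0.0]
popt, pcov = curve_fit(model, x, y, p0=p0)
print("fitted parameters:", popt)
```

A poor initial estimate can slow convergence or lead the optimizer to a different local optimum, which is why such procedures report convergence diagnostics rather than guaranteeing a globally best fit.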