## Convergence criterion

Super Contributor
Posts: 499

# Convergence criterion

May I request someone to explain the meaning of 'Convergence criterion (GCONV=1E-8) satisfied' in logistic regression in simple terms?

Super Contributor
Posts: 490

## Re: Convergence criterion

GCONV= is one of the convergence criteria that you can specify in your MODEL statement, and it is the one used by default, with the value GCONV=1E-8, if you do not specify a criterion explicitly. This value acts as a threshold for terminating the iterative optimization that fits the model: starting from an arbitrary initial vector of parameter estimates, the algorithm repeatedly adjusts the estimates in small steps to improve the fit, and it stops once the (relative) gradient falls below the threshold.
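To make the idea concrete, here is a minimal sketch in Python (not SAS internals): it fits an intercept-only logistic model by simple gradient ascent and stops when the gradient is tiny. The data, step size, and plain absolute-gradient test are all made up for illustration; SAS actually uses a Newton-type algorithm and a *relative* gradient criterion, so treat this only as an analogy.

```python
import math

# Toy outcomes: 3 events in 10 trials (made-up data)
y = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
n, k = len(y), sum(y)

b = 0.0                               # arbitrary starting estimate
for _ in range(10000):
    p = 1.0 / (1.0 + math.exp(-b))    # predicted event probability
    grad = k - n * p                  # d(log-likelihood)/db
    if abs(grad) < 1e-8:              # "convergence criterion satisfied"
        break
    b += 0.5 * grad                   # small step uphill

print(b)                              # fitted intercept
print(math.log(k / (n - k)))          # closed-form MLE log(k/(n-k)), for comparison
```

The printed values agree to many decimal places, which is the point of the criterion: once the gradient is below 1E-8, further iterations would change the estimates by a negligible amount.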

The "Model Convergence Status" section of your output reports whether this optimization converged and with what precision. It is also important for spotting problems such as complete separation or quasi-complete separation.

Super Contributor
Posts: 499

## Re: Convergence criterion

Since I have trouble understanding the technical details, may I request you to explain in layman's terms?

SAS Super FREQ
Posts: 3,839

## Re: Convergence criterion

For one-dimensional functions, the derivative is zero when the function reaches a maximum value. The "gradient" is a generalization of the derivative for multivariable functions. The GCONV= criterion says that the optimization will stop when the "derivative" is very close to zero (smaller than 1E-8). When that occurs, the log likelihood function should be very close to its maximum value.
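A tiny numeric check of that claim, using Python: at a maximum, the slope is essentially zero, so a GCONV-style test would stop there. The function here is just a made-up upside-down parabola standing in for a log-likelihood curve, with its known peak at x = 2.

```python
def f(x):
    # Upside-down parabola with its maximum at x = 2
    return -(x - 2.0) ** 2

def slope(g, x, h=1e-6):
    # Central-difference estimate of the derivative of g at x
    return (g(x + h) - g(x - h)) / (2 * h)

print(slope(f, 0.0))   # far from the peak: clearly nonzero (about 4)
print(slope(f, 2.0))   # at the peak: essentially zero, so the search stops
```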

Super Contributor
Posts: 499

## Re: Convergence criterion

May I request you to explain the terms 'derivative' and 'optimization' from your comment?

SAS Super FREQ
Posts: 3,839

## Re: Convergence criterion

"Optimization" means that you are trying to find the largest or smallest value of a function. In statistics, you try to find the "best" function that fits your data. Mathematically, this corresponds to an optimization. For example, a "line of best fit" is the line that minimizes the size of the residuals.

A "derivative" is the slope of a function. In calculus you learn how to compute the slope and you learn that the largest and smallest values often occur when the slope is zero. Think about the top of a hill or the bottom of a valley.

Posts: 1,258

## Re: Convergence criterion

Hi Dr. Rick,

Just to confirm, is it the second derivative that should be used to determine whether you have a minimum or a maximum value?

Regards,

SAS Super FREQ
Posts: 3,839

## Re: Convergence criterion

No, I never mentioned the second derivative. The hills and valleys occur where the FIRST derivative is zero.

However, you are correct that you look at the second derivative in order to determine whether you have a hill or a valley.

For functions of more than one variable, the first derivative is replaced by the gradient, which is the vector of first partial derivatives. The second derivative is replaced by the Hessian matrix, which is the matrix of second partial derivatives. In addition to hills and valleys, multivariate functions have "saddle points" where the function is a hill in some directions, but a valley in others.
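The saddle-point picture can be illustrated with the textbook example f(x, y) = x² − y², sketched here in Python. At the origin the gradient is zero, but the Hessian has one positive and one negative diagonal entry, so the surface is a valley along x and a hill along y.

```python
def gradient(x, y):
    # Vector of first partial derivatives of f(x, y) = x**2 - y**2
    return (2 * x, -2 * y)

# Matrix of second partial derivatives (constant for this quadratic)
hessian = [[2,  0],
           [0, -2]]

gx, gy = gradient(0.0, 0.0)
print(gx, gy)                                # 0.0 -0.0: a critical point
print(hessian[0][0] > 0, hessian[1][1] < 0)  # mixed signs, so it is a saddle
```

This is why optimizers check more than "gradient is zero": the second-derivative information (the Hessian) is what distinguishes a true maximum from a saddle.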