heinsler
Fluorite | Level 6

Hi,

 

Could someone explain the 'Optimality Error' in plain English?

 

... and perhaps recommend a good  article about how to interpret the information written to the log by the NLP solver?

 

Thanks!

4 REPLIES
sbxkoenk
SAS Super FREQ

Do you need more than the info below?

 

SAS/OR® 15.1 User’s Guide
Mathematical Programming
The Nonlinear Programming Solver
https://support.sas.com/documentation/onlinedoc/or/151/nlpsolver.pdf

 

Optimality Error:
indicates the relative optimality error (see the section “Solver Termination Criterion” on page 566).

 

Thanks,

Koen

heinsler
Fluorite | Level 6
Thanks Koen.

I also found on page 569:

OPTIMALITY_ERROR:
indicates the norm of the optimality conditions at the solution.
frario
SAS Employee

The optimality conditions give a characterization of locally optimal solutions of a nonlinear optimization problem. The goal of an optimization process is to find a point that satisfies those conditions. During the optimization process (the information written to the log by the solver), the optimality error gives an estimate of how close the NLP solver is to a locally optimal solution. The NLP solver terminates, declaring convergence to a local solution, only when the optimality error (which is the norm of the optimality conditions) is less than the user-defined or default tolerance (usually around 1e-6).
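For example, here is a minimal PROC OPTMODEL sketch; the Rosenbrock test objective, the starting point, and the explicit OPTTOL= setting are only for illustration (OPTTOL= is the NLP solver's optimality tolerance, and 1e-6 is its default):

proc optmodel;
   /* toy unconstrained problem: the Rosenbrock function */
   var x1 init -1.2;
   var x2 init 1;
   min f = 100*(x2 - x1^2)^2 + (1 - x1)^2;

   /* the solver declares convergence when the relative optimality
      error drops below the optimality tolerance */
   solve with nlp / opttol=1e-6;
   print x1 x2;
quit;

/* after the solve, the _OROPTMODEL_ macro variable summarizes the run,
   including the OPTIMALITY_ERROR= value mentioned earlier in the thread */
%put &_OROPTMODEL_;

If the solve ends with a solution status such as OPTIMAL, the reported optimality error should be at or below that tolerance.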

 

The optimization process is iterative: it starts from an initial estimate of the solution (the starting point) and eventually stops at a locally optimal solution. During this process, the NLP solver generates a sequence of iterates. At every iteration, the NLP solver prints the value of the objective function at the current iterate, the infeasibility error, and the optimality error. This information gives an idea of the progress the NLP solver is making toward a locally optimal solution. The infeasibility is the maximum violation over all constraints (for the original problem, before slack variables are introduced); it is always zero for unconstrained problems.
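In other words, if the constraints are written as c_i^L <= c_i(x) <= c_i^U (with a similar expression for variable bounds), then at an iterate x_k the infeasibility is roughly

   infeasibility(x_k) = max over i of max( c_i^L - c_i(x_k),  c_i(x_k) - c_i^U,  0 )

so it is zero exactly when x_k satisfies every constraint. The exact scaling the solver applies is described in the documentation linked above.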

 

When monitoring the information written to the log, the ideal case is that the infeasibility (given by the third column) and the optimality error (given by the fourth column) decrease at every iteration. However, given the nonlinearity and nonconvexity of the problem, there is no guarantee that they decrease monotonically: they can oscillate up and down during the optimization process and still converge to zero asymptotically (under some assumptions).

 

For unconstrained problems, the optimality conditions are defined based on the gradient of the objective function and/or the positive (semi-)definiteness of the Hessian. For general constrained optimization problems, the optimality conditions are more complicated and involve the primal and dual variables and the Lagrangian function. I would recommend reading Chapter 12 of the book “Numerical Optimization” by Jorge Nocedal and Stephen Wright. It gives a detailed description of the basic theory associated with nonlinear optimization. This is a great book to learn about numerical optimization.
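As a rough sketch: for an unconstrained problem min f(x), the first-order condition is simply ∇f(x*) = 0, so the optimality error is essentially a norm of the gradient at the current iterate. For a constrained problem of the illustrative form

   minimize f(x) subject to c(x) >= 0,

the first-order (KKT) conditions ask for multipliers lambda such that

   ∇f(x*) - sum_i lambda_i ∇c_i(x*) = 0    (stationarity of the Lagrangian)
   c(x*) >= 0,  lambda >= 0                (feasibility)
   lambda_i * c_i(x*) = 0 for every i      (complementarity)

and the optimality error the solver reports is a norm of the residuals of these conditions at the current iterate. (The constraint form c(x) >= 0 here is just for illustration; the documentation states the conditions for the general form with bounds, equalities, and inequalities.)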

 

In the SAS NLP solver’s documentation, we give more details about nonlinear optimization and the algorithms that are implemented. Here is a link: https://support.sas.com/documentation/onlinedoc/or/151/nlpsolver.pdf

heinsler
Fluorite | Level 6

Thanks frario. That's a very clear and detailed explanation. I am starting to have a much better understanding of the concept.

 

The book looks good too. I will definitely get it to help me improve my optimization skills.

 

 
