
🔒 This topic is **solved** and **locked**.


Posted 08-05-2021 10:28 PM
(593 views)

Hi,

Could someone explain the 'Optimality Error' in plain English?

... and perhaps recommend a good article about how to interpret the information written to the log by the NLP solver?

Thanks!

1 ACCEPTED SOLUTION


Do you need more than the info below?

SAS/OR® 15.1 User’s Guide

Mathematical Programming

The Nonlinear Programming Solver

https://support.sas.com/documentation/onlinedoc/or/151/nlpsolver.pdf

Optimality Error: indicates the relative optimality error (see the section “Solver Termination Criterion” on page 566).

Thanks,

Koen

4 REPLIES



The optimality conditions characterize locally optimal solutions of a nonlinear optimization problem. The goal of the optimization process is to find a point that satisfies those conditions. As the solver runs (and writes its progress to the log), the optimality error gives an estimate of how close the NLP solver is to a locally optimal solution. The NLP solver terminates, declaring convergence to a local solution, only when the optimality error (the norm of the optimality conditions) falls below the user-defined or default tolerance (usually around 1e-6).
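For the unconstrained case this is easy to illustrate: the optimality error reduces to the norm of the gradient, and the solver stops once it drops below the tolerance. A minimal Python/NumPy sketch (not SAS code, and not the NLP solver's actual algorithm — just plain gradient descent with the same stopping rule):

```python
import numpy as np

def minimize_unconstrained(grad, x0, tol=1e-6, step=0.1, max_iter=1000):
    """Plain gradient descent; stops when the optimality error
    (here, the gradient norm) drops below the tolerance."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        opt_error = np.linalg.norm(g)  # optimality error for the unconstrained case
        if opt_error < tol:            # convergence: optimality error below tolerance
            return x, opt_error, k
        x = x - step * g
    return x, opt_error, max_iter

# Example: f(x) = (x0 - 3)^2 + (x1 + 1)^2, whose gradient is 2*(x - [3, -1])
grad = lambda x: 2.0 * (x - np.array([3.0, -1.0]))
x, err, iters = minimize_unconstrained(grad, [0.0, 0.0])
# err is now below 1e-6 and x is close to the minimizer (3, -1)
```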

The optimization process is iterative: it starts from an initial estimate of the solution (the starting point) and eventually stops at a locally optimal solution. During this process, the NLP solver generates a sequence of iterates. At every iteration, the NLP solver prints the value of the objective function at the current iterate, the infeasibility error, and the optimality error. This information gives an idea of the progress the NLP solver is making toward a locally optimal solution. The infeasibility is the maximum violation over all constraints (for the original problem, before slack variables are introduced). It is always zero for unconstrained problems.
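The "maximum constraint violation" can be made concrete with a small sketch. Assuming constraints written in the standard form h(x) = 0 and g(x) <= 0 (the NLP solver's internal bookkeeping differs, but the idea is the same):

```python
import numpy as np

def infeasibility(x, eq_cons=(), ineq_cons=()):
    """Maximum constraint violation at x:
    |h(x)| for each equality h(x) = 0,
    max(0, g(x)) for each inequality g(x) <= 0."""
    viol = [abs(h(x)) for h in eq_cons]
    viol += [max(0.0, g(x)) for g in ineq_cons]
    return max(viol, default=0.0)  # zero when there are no constraints

x = np.array([1.0, 2.0])
h = lambda x: x[0] + x[1] - 2.0  # equality x0 + x1 = 2, violated by 1 at x
g = lambda x: x[0] - 5.0         # inequality x0 <= 5, satisfied at x
infeasibility(x, eq_cons=[h], ineq_cons=[g])  # -> 1.0
```

Note that a satisfied inequality contributes nothing, which is why only actual violations show up in the infeasibility column.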

When monitoring the log, the ideal case is that the infeasibility (the third column) and the optimality error (the fourth column) decrease at every iteration. However, given the nonlinearity and nonconvexity of the problem, it is difficult to guarantee that these quantities decrease at every iteration. They can oscillate up and down during the optimization process, but they converge to zero asymptotically (under some assumptions).

For unconstrained problems, the optimality conditions are defined in terms of the gradient of the objective function and/or the positive (semi-)definiteness of the Hessian. For general constrained optimization problems, the optimality conditions are more complicated and involve the primal and dual variables and the Lagrangian function. I would recommend reading Chapter 12 of the book “Numerical Optimization” by Jorge Nocedal and Stephen Wright; it gives a detailed description of the basic theory of nonlinear optimization and is a great book for learning about numerical optimization.
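For the equality-constrained case, the quantity whose norm plays the role of the optimality error is the residual of the KKT conditions: stationarity of the Lagrangian, grad f(x) + J(x)^T lambda, together with feasibility c(x). A hedged sketch (the NLP solver scales and combines these terms differently; this only shows the structure):

```python
import numpy as np

def kkt_residual(grad_f, jac_c, c, x, lam):
    """Norm of the KKT conditions for: minimize f(x) subject to c(x) = 0.
    stat = grad f(x) + J(x)^T * lambda  (stationarity of the Lagrangian)
    feas = c(x)                         (feasibility)"""
    stat = grad_f(x) + jac_c(x).T @ lam
    feas = c(x)
    return np.linalg.norm(np.concatenate([stat, feas]))

# Example: minimize x0^2 + x1^2 subject to x0 + x1 - 1 = 0.
# The solution is x = (0.5, 0.5) with multiplier lambda = -1.
grad_f = lambda x: 2.0 * x
c = lambda x: np.array([x[0] + x[1] - 1.0])
jac_c = lambda x: np.array([[1.0, 1.0]])
kkt_residual(grad_f, jac_c, c, np.array([0.5, 0.5]), np.array([-1.0]))  # -> 0.0
```

At the optimal primal-dual pair the residual vanishes, which is exactly the condition the solver's tolerance test approximates.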

The SAS NLP solver’s documentation gives more details about nonlinear optimization and the algorithms that are implemented: https://support.sas.com/documentation/onlinedoc/or/151/nlpsolver.pdf


Thanks frario. That's a very clear and detailed explanation. I am starting to have a much better understanding of the concept.

The book looks good too. I will definitely get it to help me improve my optimization skills.
