opti_miser
Calcite | Level 5

I am performing an optimization with a large set of data. I can provide analytic gradient and Hessian modules, but they are computationally intensive, like the function module. Of the optimization subroutines in IML (the CALL NLP__ routines), which one is the fastest?

1 REPLY
Rick_SAS
SAS Super FREQ

Gosh, speed depends on so many things that you should really time it for your problem using, say, 10% of the data.

Given that you have gradient and Hessian information, I'd probably use one of the Newton techniques. The Newton-Raphson (NLPNRA) algorithm has quadratic convergence near the solution, so I use it when I can get a good starting guess. I'd use the quasi-Newton algorithm (NLPQN) when the Hessian matrix is much more expensive to compute than the function and gradient, since quasi-Newton methods build up a Hessian approximation from successive gradients instead of evaluating the Hessian itself.
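As a sketch of how the two calls differ, here is a minimal PROC IML program. The quadratic objective and the module names F, G, and H are placeholders for illustration, not the poster's actual problem; NLPNRA accepts both a gradient and a Hessian module, while NLPQN takes only a gradient module.

```sas
proc iml;
/* Objective module: a cheap stand-in for the expensive function */
start F(x);
   f = (x[1] - 2)##2 + (x[2] + 3)##2;
   return(f);
finish;

/* Analytic gradient module (returns a row vector) */
start G(x);
   g = j(1, 2, 0);
   g[1] = 2*(x[1] - 2);
   g[2] = 2*(x[2] + 3);
   return(g);
finish;

/* Analytic Hessian module (constant for this quadratic) */
start H(x);
   h = 2*I(2);
   return(h);
finish;

x0  = {0 0};     /* starting guess */
opt = {0 1};     /* opt[1]=0 ==> minimize; opt[2]=1 ==> brief printing */

/* Newton-Raphson: calls the Hessian module at each iteration */
call nlpnra(rc, xres, "F", x0, opt) grd="G" hes="H";

/* Quasi-Newton: uses only gradients, so the expensive Hessian
   module is never evaluated */
call nlpqn(rc2, xres2, "F", x0, opt) grd="G";
quit;
```

A positive return code (rc) indicates the algorithm terminated successfully, and xres holds the estimate of the optimum. Timing both calls on a subset of the data, as suggested above, is the practical way to pick between them.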

