opti_miser
Calcite | Level 5

I am performing an optimization with a large set of data. I can provide analytic gradient and Hessian modules, but they are computationally intensive, like the objective function module. Of the optimization subroutines in IML (call NLP__), which one is the fastest?

1 REPLY 1
Rick_SAS
SAS Super FREQ

Gosh, speed depends on so many things; you should really time it for your particular problem with, say, 10% of the data.

Given that you have gradient and Hessian information, I'd probably use one of the Newton techniques. The Newton-Raphson (NLPNRA) algorithm has quadratic convergence near the solution, so I use it when I can get a good starting guess. I'd use the quasi-Newton algorithm (NLPQN) when the Hessian matrix is much more expensive to compute than the function and its derivatives.
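As a sketch of the Newton-Raphson route, here is the classic Rosenbrock example from the SAS/IML documentation, with analytic gradient and Hessian modules passed to NLPNRA via the GRD= and HES= options. The module names, starting point, and optn vector are illustrative; check the NLPNRA documentation for the exact option settings for your problem.

```sas
proc iml;
/* Objective: Rosenbrock function, f(x) = 0.5*(100*(x2-x1^2)^2 + (1-x1)^2) */
start F_ROSEN(x);
   y1 = 10 * (x[2] - x[1]##2);
   y2 = 1 - x[1];
   return( 0.5 * (y1##2 + y2##2) );
finish;

/* Analytic gradient (1 x 2 row vector) */
start G_ROSEN(x);
   g = j(1, 2, 0);
   g[1] = -200*x[1]*(x[2] - x[1]##2) - (1 - x[1]);
   g[2] =  100*(x[2] - x[1]##2);
   return( g );
finish;

/* Analytic Hessian (symmetric 2 x 2 matrix) */
start H_ROSEN(x);
   h = j(2, 2, 0);
   h[1,1] = 600*x[1]##2 - 200*x[2] + 1;
   h[1,2] = -200*x[1];
   h[2,1] = h[1,2];
   h[2,2] = 100;
   return( h );
finish;

x0   = {-1.2  1};   /* starting guess */
optn = {0  2};      /* minimize (0); moderate amount of printed output (2) */

/* Newton-Raphson with user-supplied derivative modules */
call nlpnra(rc, xres, "F_ROSEN", x0, optn) grd="G_ROSEN" hes="H_ROSEN";
print rc xres;      /* rc > 0 indicates successful termination */
quit;
```

Swapping `nlpnra` for `nlpqn` and dropping the HES= option gives the quasi-Newton variant, which builds its own Hessian approximation from successive gradients and so avoids your expensive Hessian module entirely.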


