SAS/IML Software and Matrix Computations

Statistical programming, matrix languages, and more
opti_miser
Calcite | Level 5

I am performing an optimization with a large set of data. I can provide analytic gradient and Hessian modules, but they are computationally intensive, like the function module. Of the optimization subroutines in IML (call NLP__), which one is the fastest?

1 REPLY
Rick_SAS
SAS Super FREQ

Gosh, speed depends on so many things that you should really time it yourself on your problem with, say, 10% of the data.

Given that you have gradient and Hessian information, I'd probably use one of the Newton techniques. The Newton-Raphson (NLPNRA) algorithm has quadratic convergence near the solution, so I use it when I can get a good starting guess. I'd use the quasi-Newton algorithm (NLPQN) when the Hessian matrix is much more expensive to compute than the function and gradient.
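For reference, here is a minimal sketch of how you would pass analytic gradient and Hessian modules to NLPNRA, and a gradient-only module to NLPQN. The Rosenbrock function stands in for your (expensive) objective, and the module names and option settings are placeholders, not your actual problem:

proc iml;
/* Illustrative objective: Rosenbrock function (a stand-in for your expensive objective) */
start F_Obj(x);
   return( 100*(x[2] - x[1]##2)##2 + (1 - x[1])##2 );
finish;

start G_Obj(x);                 /* analytic gradient: 1 x n row vector */
   g = j(1, 2, 0);
   g[1] = -400*x[1]*(x[2] - x[1]##2) - 2*(1 - x[1]);
   g[2] =  200*(x[2] - x[1]##2);
   return( g );
finish;

start H_Obj(x);                 /* analytic Hessian: n x n symmetric matrix */
   h = j(2, 2, 0);
   h[1,1] = 1200*x[1]##2 - 400*x[2] + 2;
   h[1,2] = -400*x[1];
   h[2,1] = h[1,2];
   h[2,2] = 200;
   return( h );
finish;

x0  = {-1.2 1};                 /* starting guess */
opt = {0 2};                    /* opt[1]=0: minimize; opt[2]=2: print iteration history */

/* Newton-Raphson: uses both the gradient and Hessian modules */
call nlpnra(rc1, xNRA, "F_Obj", x0, opt) grd="G_Obj" hes="H_Obj";

/* Quasi-Newton: uses only the gradient; the Hessian is approximated internally */
call nlpqn(rc2, xQN, "F_Obj", x0, opt) grd="G_Obj";

print xNRA, xQN;
quit;

To compare speed on a subset of your data, you can record t0 = time(); before each CALL statement and print time() - t0 afterward, then look at how the iteration histories differ in the number of function, gradient, and Hessian evaluations.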


Discussion stats
  • 1 reply
  • 1139 views
  • 0 likes
  • 2 in conversation