Statistical programming, matrix languages, and more

Contributor
Posts: 22

Looking for fastest optimization subroutines

I am performing an optimization with a large set of data. I can provide analytic gradient and Hessian modules, but they are computationally intensive, as is the function module itself. Of the optimization subroutines in IML (call NLP__), which one is the fastest?

SAS Super FREQ
Posts: 3,232

Re: Looking for fastest optimization subroutines

Gosh, speed depends on so many things that you should really time it for the problem you have with, say, 10% of the data.

Given that you have gradient and Hessian information, I'd probably use one of the Newton techniques. The Newton-Raphson (NLPNRA) algorithm has quadratic convergence near the solution, so I use it when I can get a good starting guess. I'd use the quasi-Newton algorithm (NLPQN) when the Hessian matrix is much more expensive to compute than the function and gradient.
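To illustrate the trade-off outside of IML, here is a minimal Python/SciPy sketch (SciPy's `Newton-CG` and `BFGS` playing the roles of NLPNRA and NLPQN; this is an analogy, not the IML syntax). The Newton-type method consumes the analytic Hessian at each iteration, while BFGS builds a Hessian approximation from gradients alone, which is exactly why it wins when the Hessian module is the expensive part:

```python
# Compare a Newton-type method (uses the analytic Hessian) with a
# quasi-Newton method (BFGS, gradients only) on the Rosenbrock test
# function. SciPy stands in for IML's NLPNRA / NLPQN here.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

x0 = np.array([-1.2, 1.0])  # standard Rosenbrock starting point

# Newton-type: quadratic convergence near the solution, but every
# iteration evaluates the (potentially expensive) Hessian.
newton = minimize(rosen, x0, jac=rosen_der, hess=rosen_hess,
                  method="Newton-CG")

# Quasi-Newton (BFGS): approximates the Hessian from successive
# gradients, so it never calls the Hessian module at all.
qn = minimize(rosen, x0, jac=rosen_der, method="BFGS")

print("Newton-CG:", newton.x, "function evals:", newton.nfev)
print("BFGS:     ", qn.x, "function evals:", qn.nfev)
```

Wrapping each `minimize` call in a timer on a 10% subsample of your data, as suggested above, gives the same kind of comparison you would run with the IML subroutines.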
