I am performing an optimization with a large set of data. I can provide analytic gradient and Hessian modules, but they are computationally intensive, like the objective function module itself. Of the optimization subroutines in IML (the CALL NLP* routines), which one is the fastest?
Gosh, speed depends on so many things that you should really time it on your own problem, using, say, 10% of the data.
Given that you have analytic gradient and Hessian information, I'd probably use one of the Newton techniques. The Newton-Raphson (NLPNRA) algorithm has quadratic convergence near the solution, so I use it when I can get a good starting guess. I'd use the quasi-Newton algorithm (NLPQN) when the Hessian matrix is much more expensive to compute than the function and gradient.
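To make that concrete, here is a minimal sketch of passing analytic derivative modules to NLPNRA so IML doesn't fall back on finite differences. The module names and the toy quadratic objective are my own stand-ins; swap in your expensive modules.

```
proc iml;
/* Hypothetical objective, gradient, and Hessian modules for
   f(x) = (x1-2)**2 + (x2+1)**2 -- stand-ins for your own. */
start Func(x);
   return( (x[1]-2)##2 + (x[2]+1)##2 );
finish;
start Grad(x);                   /* gradient as a row vector */
   return( 2*(x[1]-2) || 2*(x[2]+1) );
finish;
start Hess(x);                   /* 2 x 2 Hessian matrix */
   return( {2 0, 0 2} );
finish;

x0  = {0 0};                     /* initial guess */
opt = {0 2};                     /* opt[1]=0: minimize; opt[2]: print level */
/* Newton-Raphson with analytic gradient and Hessian modules */
call nlpnra(rc, xres, "Func", x0, opt) grd="Grad" hes="Hess";
print rc xres;                   /* solution should be near {2 -1} */
quit;
```

For the quasi-Newton alternative, the call is the same shape but without the Hessian module: `call nlpqn(rc, xres, "Func", x0, opt) grd="Grad";`.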