06-01-2013 06:53 PM
I am performing an optimization on a large data set. I can provide modules for the analytic gradient and Hessian, but they are computationally intensive, like the objective function module. Of the optimization subroutines in IML (CALL NLP__), which one is the fastest?
06-02-2013 06:12 AM
Gosh, speed depends on so many things; you should really time it on your actual problem with, say, 10% of the data.
Given that you have analytic gradient and Hessian information, I'd probably use one of the Newton techniques. The Newton-Raphson (NLPNRA) algorithm has quadratic convergence near the solution, so I use it when I can get a good starting guess. I'd use the quasi-Newton algorithm (NLPQN) when the Hessian matrix is much more expensive to compute than the function and gradient.
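To make the timing comparison concrete, here is a minimal sketch of calling NLPNRA with analytic-derivative modules supplied through the GRD= and HES= options. The objective here is the Rosenbrock function as an illustrative stand-in; the module names, starting point, and OPT vector are placeholders to adapt to your own problem.

```sas
proc iml;
/* stand-in objective: Rosenbrock f(x) = 100*(x2 - x1**2)**2 + (1 - x1)**2 */
start F_OBJ(x);
   return( 100*(x[2] - x[1]##2)##2 + (1 - x[1])##2 );
finish;

/* analytic gradient module */
start G_OBJ(x);
   g = j(1, 2, 0);
   g[1] = -400*x[1]*(x[2] - x[1]##2) - 2*(1 - x[1]);
   g[2] =  200*(x[2] - x[1]##2);
   return( g );
finish;

/* analytic Hessian module */
start H_OBJ(x);
   h = j(2, 2, 0);
   h[1,1] = -400*x[2] + 1200*x[1]##2 + 2;
   h[1,2] = -400*x[1];
   h[2,1] = h[1,2];
   h[2,2] = 200;
   return( h );
finish;

x0   = {-1.2 1};     /* starting guess */
optn = {0 1};        /* minimize; brief iteration printing */
t0 = time();
call nlpnra(rc, xres, "F_OBJ", x0) opt=optn grd="G_OBJ" hes="H_OBJ";
print (time() - t0)[label="seconds"] xres;
/* to compare against quasi-Newton, swap in:
   call nlpqn(rc, xres, "F_OBJ", x0) opt=optn grd="G_OBJ";  */
quit;
```

Running both calls on a subsample and comparing the elapsed time (and the iteration counts in the printed history) is a quick way to see which method wins for your cost structure.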