Posted 06-01-2013 06:53 PM
I am performing an optimization on a large data set. I can provide analytic gradient and Hessian modules, but they are computationally intensive, like the function module itself. Of the optimization subroutines in IML (call NLP__), which one is the fastest?
1 REPLY
Gosh, speed depends on so many things; you should really time it for your problem with, say, 10% of the data. A rough sketch of that idea follows.
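Here is a minimal sketch of timing a computation on a 10% subset. The data here are simulated as a stand-in for your own data set, and the computation being timed is a placeholder for whichever NLP call you are testing:

```
proc iml;
/* Sketch: time a candidate computation on a 10% random subset.
   The simulated matrix X stands in for your real data. */
call randseed(1);
X = j(100000, 5);                  /* allocate 100,000 x 5 matrix */
call randgen(X, "Normal");         /* fill with N(0,1) values     */

n = nrow(X);
k = ceil(0.1*n);
idx = sample(1:n, k, "NoReplace"); /* 10% sample of row indices   */
Xsub = X[idx, ];

t0 = time();                       /* TIME() = seconds since midnight */
/* ... run the candidate NLP call on Xsub here ... */
s = sum(Xsub##2);                  /* stand-in computation to time    */
elapsed = time() - t0;
print elapsed[label="Elapsed (s)"];
quit;
```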
Given that you have gradient and Hessian information, I'd probably use one of the Newton techniques. The Newton-Raphson (NLPNRA) algorithm has quadratic convergence near the solution, so I use it when I can get a good starting guess. I'd use the quasi-Newton algorithm (NLPQN) when the Hessian matrix is much more expensive to compute than the function and gradient, since it builds up a Hessian approximation from successive gradients. The sketch below shows the calling syntax for both.
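As an illustration, here is a minimal sketch using the classic Rosenbrock function as a stand-in for your objective. Replace the F, Grad, and Hess modules with your own; the GRD= and HES= keywords tell the NLP call which modules supply the derivatives:

```
proc iml;
/* Rosenbrock function as a stand-in objective */
start F(x);
   return( 100*(x[2] - x[1]##2)##2 + (1 - x[1])##2 );
finish;

start Grad(x);                    /* analytic gradient */
   g = j(1, 2, .);
   g[1] = -400*x[1]*(x[2] - x[1]##2) - 2*(1 - x[1]);
   g[2] =  200*(x[2] - x[1]##2);
   return( g );
finish;

start Hess(x);                    /* analytic Hessian */
   h = j(2, 2, .);
   h[1,1] = 1200*x[1]##2 - 400*x[2] + 2;
   h[1,2] = -400*x[1];
   h[2,1] = h[1,2];
   h[2,2] = 200;
   return( h );
finish;

x0   = {-1.2 1};                  /* starting guess               */
optn = {0 1};                     /* minimize; moderate printing  */

/* Newton-Raphson: uses both the gradient and Hessian modules */
call nlpnra(rc, xres, "F", x0, optn) grd="Grad" hes="Hess";
print rc xres;

/* Quasi-Newton: uses only the gradient; approximates the Hessian */
call nlpqn(rc2, xres2, "F", x0, optn) grd="Grad";
print rc2 xres2;
quit;
```

If the Hessian module dominates the cost, comparing the elapsed times of these two calls on your 10% subset should tell you quickly which is the better choice.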