I have a question regarding the use of the objective function and the gradient when calling an optimization routine.
I am interested in estimating an economic model that resembles a demand model for differentiated products (along the lines of BLP). This is a fairly standard problem in several literatures, but I haven't been able to find an answer to this question in the SAS/IML documentation.
I have coded up the objective function, which consists of three steps:
1. Feed in a coefficient vector;
2. Compute an outcome vector using a contraction mapping, conditional on the coefficient vector, and save the resulting contraction mapping vector (CM vector);
3. Compute the objective function value.
I have also coded up the Jacobian, which takes in the (saved) CM vector and the coefficient vector.
As far as I understand, it is not possible in SAS/IML to write a single module that returns both the objective function value and the gradient at the same time, given how the optimizer syntax works (the Jacobian needs to be a separate function module).
See a stylized example below (the key parts are marked with comments):
===============================================================================
proc iml;
use work.DS;
read all var {Y} into Y;
read all var {X} into X;
read all var {Delta0} into Delta0;
close work.DS;
/* SET UP CONTRACTION MAPPING FUNCTION                        */
/* (f, g, and h below stand for problem-specific functions)   */
start ContractionMapping(theta, DeltaInit) global(X, Y, Delta0);
   Tol = 1e-5;
   norm = 1;
   Delta = DeltaInit;
   do while (norm > Tol);
      DeltaOld = Delta;
      Delta = f(DeltaOld, X, Y, theta);   /* one contraction step      */
      norm = max(abs(Delta - DeltaOld));
   end;
   Delta0 = Delta;                        /* HERE THE VALUE GETS SAVED */
   return(Delta);
finish ContractionMapping;

/* SET UP OBJECTIVE FUNCTION */
start Obj(theta) global(X, Y, Delta0);
   DeltaNew = ContractionMapping(theta, Delta0);
   Obj = g(DeltaNew, X, Y);
   return(Obj);
finish Obj;

/* SET UP JACOBIAN */
start Jacobian(theta) global(X, Y, Delta0);
   /* HERE THE JACOBIAN RELIES ON THE SAVED VALUE OF DELTA0 */
   J = h(X, Y, Delta0, theta);
   return(J);
finish Jacobian;
quit;
===============================================================================
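For concreteness, the optimizer call I have in mind looks roughly like the following (theta0, optn, thetaHat, and rc are placeholders, and I assume the grd= keyword form of the NLPQN call is the right way to pass the Jacobian module, since Obj returns a scalar):
===============================================================================
theta0 = {0 0 0};                  /* placeholder starting values     */
optn   = {1 2};                    /* optn[1]=1 requests maximization */
call nlpqn(rc, thetaHat, "Obj", theta0, optn) grd="Jacobian";
===============================================================================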
Here comes the question: when I call the optimization routine (e.g. a quasi-Newton routine) with the objective function module, the optimizer iteratively passes in coefficient vectors in order to maximize the objective function, but it is not clear to me whether the Jacobian module is evaluated at the current value of the contraction mapping vector (Delta0).
There seem to be two possibilities:
1. The Obj module is called before the Jacobian module is called. Then the contraction mapping is run before the Jacobian is called and the Jacobian values are correct.
2. The Obj module is not called before the Jacobian module is called. In that case I either cannot use the Jacobian, or I have to run the contraction mapping again inside the Jacobian module (which is computationally burdensome and really inefficient).
From the documentation, it is not clear how this works. Comparing this to other matrix languages is difficult, since they typically allow the objective function and the gradient to be coded in a single function.
Is there anyone who can help me with this?
This is documented in the section "Objective Function and Derivatives."
The doc states:
In many applications, calculations used in the computation of f can help compute derivatives at the same point efficiently. You can save and reuse such calculations with the GLOBAL clause. As with many other optimization packages, the subroutines call the "grd," "hes," or "jac" modules only after a call of the "fun" module.
So the answer is option 1: the Obj module is called before the Jacobian module, the contraction mapping therefore runs before the Jacobian is evaluated, and the Jacobian values are correct.
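A minimal sketch of this pattern, with hypothetical names (Obj, Grad, gSaved) and a toy sum-of-squares objective standing in for the real problem: the objective module saves a by-product of the function evaluation in a GLOBAL variable, and the gradient module reuses it, which is safe precisely because the optimizer calls the objective module first at each point.
===============================================================================
proc iml;
gSaved = .;                            /* scratch value shared via GLOBAL */

start Obj(x) global(gSaved);
   gSaved = 2*x;                       /* by-product of evaluating f      */
   return( ssq(x) );                   /* f(x) = sum of squares           */
finish Obj;

start Grad(x) global(gSaved);
   return( gSaved );                   /* reuse the value saved by Obj    */
finish Grad;

x0   = {1 2 3};
optn = {0 2};                          /* minimize; print level 2         */
call nlpqn(rc, xres, "Obj", x0, optn) grd="Grad";
print rc xres;
quit;
===============================================================================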
Thanks!