Yeah, that's what I figured, but it shouldn't be an issue, since xlag is equal to zero. I understand why it would be a problem with a model like this:

proc varmax data=sim_data noprint;
   model spend1 spend2 prod1 prod2 = time_ / p=2 xlag=1;
run;

since there would be no way to differentiate the impact of time_(t) from that of time_(t-1). But that's not the model I'm running. It feels like the SAS internals are fitting some model with multiple lagged exogenous variables in the forecasting step, instead of the model I specified. If that's where things go wrong, I'd like some way of stopping the VARMAX internals from attempting to forecast anything, since forecasting isn't of interest in this case.

Of course, via the Frisch-Waugh theorem, I could just detrend the endogenous variables and then drop time_ from the VAR (roughly as sketched below), but I'd prefer not to... especially as I've run into similar issues with VARMAX and dummy variables that are not collinear in their own right, but whose larger set of lagged values is.
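For reference, a minimal sketch of that detrending workaround, assuming the sim_data dataset and variable names from the example above (the detrended dataset name and the *_dt residual names are just placeholders):

/* Frisch-Waugh style workaround: regress each endogenous series on the
   time trend, keep the residuals, then fit the VAR on the detrended
   series with no exogenous regressors (so no xlag issue can arise). */
proc reg data=sim_data noprint;
   model spend1 spend2 prod1 prod2 = time_;
   /* one residual name per dependent variable, in the same order */
   output out=detrended r=spend1_dt spend2_dt prod1_dt prod2_dt;
run;
quit;

proc varmax data=detrended noprint;
   /* plain VAR(2) on the detrended series, no exogenous variables */
   model spend1_dt spend2_dt prod1_dt prod2_dt / p=2;
run;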