That question is too vague. The complexity of the model (linear vs. nonlinear; fixed effects vs. random effects) plays a role, so which SAS procedure, method, and options are you trying to use?
Nevertheless, here are a few thoughts.
The size of an optimization problem usually refers to the number of parameters that you are optimizing. For most regression-type problems that optimize a likelihood function, this means the number of effects in the model. Each classification effect requires k-1 parameters, where k is the number of levels of the categorical variable.
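For example, here is a minimal sketch (using the SASHELP.CLASS sample data) in which the CLASS variable Sex has k=2 levels and therefore contributes k-1 = 1 estimable parameter; with the intercept and the continuous Height effect, the model estimates three parameters in total:

   /* sketch: count parameters for a model with a classification effect */
   proc genmod data=sashelp.class;
      class sex;                     /* 2 levels => 1 estimable parameter */
      model weight = sex height;     /* intercept + sex + height          */
   run;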
The number of observations is important because the observations are used to form the X`X matrix that is used in regression. Many SAS procedures can use multithreaded code to form this matrix, so the number of threads can be important. Maximum likelihood estimation also has to pass through the observations on each iteration. For generalized linear models, SAS will zip through hundreds of thousands of observations easily.
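To see why the number of observations matters less than the number of parameters, here is a small SAS/IML sketch (the data are simulated and the variable names are illustrative): the crossproducts matrix X`X has dimensions equal to the number of columns of X, no matter how many rows you process.

   proc iml;
   n = 100000;                          /* many observations                */
   x = j(n,1,1) || rannor(j(n,3,0));    /* intercept plus three covariates  */
   xpx = x` * x;                        /* 4 x 4 regardless of n            */
   print (nrow(xpx)) (ncol(xpx));
   quit;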
If I am optimizing a nonlinear function by using a SAS/IML NLP routine, I consider the problem to be small if it has fewer than a dozen parameters. Medium problems might have a few dozen parameters, and large problems have more than that. I don't usually worry about the number of observations.
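As a sketch of what I mean by a "small" problem, here is a two-parameter minimization with the Newton-Raphson routine NLPNRA. The objective function (the classic Rosenbrock function) and the starting values are illustrative, not from a real analysis:

   proc iml;
   start Rosenbrock(x);
      return( 100*(x[2] - x[1]##2)##2 + (1 - x[1])##2 );
   finish;
   x0  = {-1.2 1};                   /* starting point                 */
   opt = {0 0};                      /* minimize; no iteration output  */
   call nlpnra(rc, xres, "Rosenbrock", x0, opt);
   print rc xres;                    /* return code and optimal point  */
   quit;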
I think this is a good time for me to post a disclaimer: I work for SAS, but I do not speak for SAS. If you provide more information about the procedure and syntax, we can give better answers that are tuned to your problem.