03-02-2016 03:37 AM
I'm working with SAS Enterprise Miner and trying to develop a predictive model. Can someone explain the difference between the average squared error (ASE) and the mean squared error (MSE)? And why is ASE used as the default when the target variable is interval?
03-02-2016 11:18 AM
Hi Adrian, I can help with part of your question. MSE (mean squared error) is arguably the most important criterion used to evaluate the performance of a predictor or an estimator. (The subtle distinction between predictors and estimators is that random variables are predicted and constants are estimated.) It measures the average of the squares of the errors.
Were you potentially talking about Mean Absolute Error?
Because the Mean Absolute Error measures the average of the absolute errors, not the squared ones. You can use it to measure how close forecasts or predictions are to the eventual outcomes.
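To make the difference concrete, here is a minimal sketch in plain Python (outside SAS, with made-up toy numbers) that computes MSE and MAE side by side on the same actual vs. predicted values:

```python
# Toy example: actual target values vs. model predictions (made-up data)
actual = [3.0, 5.0, 2.5, 7.0]
predicted = [2.5, 5.0, 4.0, 8.0]

# Raw errors: actual minus predicted
errors = [a - p for a, p in zip(actual, predicted)]

# MSE: average of the SQUARED errors (penalizes large errors more)
mse = sum(e ** 2 for e in errors) / len(errors)

# MAE: average of the ABSOLUTE errors (all errors weighted linearly)
mae = sum(abs(e) for e in errors) / len(errors)

print(mse)  # 0.875
print(mae)  # 0.75
```

Note how the single large error (2.5 predicted as 4.0) pulls MSE up more than MAE, since squaring magnifies big residuals. That's the practical difference between the two metrics.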