I can't reconcile the variable importance rankings from decision trees and regression in SAS OnDemand for Academics (Enterprise Miner 6.2). In particular, I'm using the PVA97NK data set; in other words, the two rankings are not the same.
Any help is appreciated.
Thanks.
Regression and tree-based models work in very different ways. A regression builds a model that attempts to describe the relationship between the target and the inputs: it estimates parameters that are then multiplied by the input values to predict the outcome. Decision trees do not build a parametric model and must be evaluated differently.
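To make that concrete, for a binary target such as TARGET_B in PVA97NK, a logistic regression boils down to a single scoring equation of the form (the inputs shown are only illustrative):

   log( p / (1 - p) ) = b0 + b1*GiftAvgLast + b2*DemAge + ...

whereas a tree's prediction is simply the event rate in whichever leaf a record falls into, so there are no coefficients of this kind to rank in the first place.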
There are also many differences in how regression and decision tree models perform on different data sets.
* Regression models do a better job of describing smoothly changing relationships than a decision tree, which would need to create many cut points to approximate the same curve.
* Decision trees handle missing values directly, whereas regression models require those values to be imputed, meaning the two methods are not even working on the same training data when missing values are present in the original training data.
* Regression models require you to specify the exact functional form of the relationship, whereas decision trees do not assume any specific relationship.
* Decision trees are (typically) far more flexible than regression models, allowing them to automatically model complex interactions and non-linear relationships that would need to be described explicitly in a regression model.
* You cannot pick one of these methods as superior overall, since which one performs better depends on the data you are fitting.
For all these reasons, you should not expect that the variable importance reported by a decision tree is going to parallel what you would see from the corresponding regression model fit to the same data.
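If it helps to see the contrast side by side outside of the Enterprise Miner flow, here is a minimal sketch you could run in a SAS code node or SAS Studio. The AAEM libref and the input list are assumptions based on the standard course copy of PVA97NK, so adjust both to match your own project:

   /* Sketch only: fit a tree and a logistic regression to the same inputs,
      then compare the Variable Importance table from HPSPLIT with the
      parameter estimates and odds ratios from LOGISTIC. */

   proc hpsplit data=aaem.pva97nk;                /* decision tree */
      class target_b demgender statuscat96nk;
      model target_b = giftavglast giftavg36 gifttimelast
                       demage demgender statuscat96nk;
   run;                                           /* prints a Variable Importance table */

   proc logistic data=aaem.pva97nk;               /* regression */
      class demgender statuscat96nk / param=ref;
      model target_b(event='1') = giftavglast giftavg36 gifttimelast
                                  demage demgender statuscat96nk;
   run;                                           /* prints parameter estimates and odds ratios */

The two procedures summarize the inputs in fundamentally different ways, which is exactly why the rankings you see in the two Enterprise Miner nodes need not agree.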
I hope this helps!
Doug