05-25-2017 08:58 AM
I am using decision trees and would like to know how to use a decision stump, a conditional decision tree, and M5.
Are these part of the normal Decision Tree node or the HP Tree node, or do I just need to set certain parameters for them?
05-30-2017 03:24 PM
A 'stump' is simply a tree with one split. To get one, set MAXDEPTH=1 in PROC ARBOR or PROC HPFOREST, or use the equivalent depth option in PROC HPSPLIT.
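For example, outside of Enterprise Miner you could fit a stump directly with PROC HPSPLIT. This is a minimal sketch; the dataset work.train, the binary target bad, and the inputs x1-x3 are hypothetical names:

```sas
/* A decision stump: a tree limited to a single split (depth 1). */
proc hpsplit data=work.train maxdepth=1;
   class bad;                /* binary target (hypothetical name) */
   model bad = x1 x2 x3;     /* interval inputs (hypothetical names) */
run;
```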
Assuming "conditional decision trees" refers to the conditional inference trees of Hothorn, Hornik, and Zeileis (2006), use PRESELECT=HOTHORN or PRESELECT=LOH in PROC HPFOREST. We ran several simulations and concluded that PRESELECT=LOH is preferable.
06-04-2017 10:55 AM
I was able to apply a decision stump, but I could not find the PRESELECT option in any of the node properties to get conditional decision trees.
Would you please help me with this?
06-05-2017 09:12 AM
I also see no property for it in the graphical user interface. The conditional inference method happens automatically because it is the default in the underlying PROC HPFOREST.
Before EM 14.2, PRESELECT=HOTHORN was the only choice.
Beginning in 14.2, if there are no nominal inputs, or if all the inputs have about the same number of categories, then the conditional inference method is not motivated and is not applied. Otherwise, conditional inference is done using the method of Wei-Yin Loh of the University of Wisconsin. (Technical detail: 'about the same number' means within 5 categories, with non-nominal inputs treated as having one category.)
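If you run PROC HPFOREST directly rather than through the node, the option can be set on the PROC statement itself. A minimal sketch, again with hypothetical dataset and variable names:

```sas
/* Request Loh's variable-preselection (conditional inference) method. */
proc hpforest data=work.train maxtrees=100 preselect=loh;
   target bad / level=binary;       /* hypothetical binary target */
   input x1 x2 x3 / level=interval; /* hypothetical interval inputs */
   input region / level=nominal;    /* a high-cardinality nominal input */
run;
```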
06-05-2017 09:50 AM
You can set this option by running the following in the Enterprise Miner Project Start Code before running the HP Forest node:
%let EM_HPFOREST_PROCSTMNT = %str(PRESELECT=LOH);
06-05-2017 06:11 PM