What is "weight" in Gradient Boosting? I'd appreciate it if someone could point me to a simple example of Gradient Boosting.
Thanks!
Thanks for the document, but I still don't understand the term 'weight'.
"weight" could refer to weighting the observations or to weighting the trees in the model.
A boosting model typically consists of a sum of decision trees trained sequentially. Some algorithms describe the sum as weighted.
In Adaboost, the original boosting algorithm, observations are given weights before training a tree. The weights are different for each tree.
In gradient boosting algorithms do not use weights like this. Instead, the algorithm modifies the target values input to a tree.
The EM Boosting node uses gradient boosting. In some rare occassions, people assign weighted values to the observations at the start in order to match proportions of groups in the training data with those in a future population to which the model will be applied.
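To make "modifies the target values" concrete, here is a minimal sketch of gradient boosting for squared-error loss, using scikit-learn's DecisionTreeRegressor as the base learner. The data, tree depth, and learning rate are made up for the example; this is an illustration of the idea, not the exact algorithm inside the EM Boosting node.

```python
# Minimal gradient boosting sketch (squared-error loss).
# Note there are no observation weights: each new tree is trained
# on modified targets, namely the residuals of the current model.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
n_trees = 50
prediction = np.full_like(y, y.mean())  # start from the mean
trees = []

for _ in range(n_trees):
    residuals = y - prediction           # the modified target values
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residuals)               # each tree fits what is left over
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)                   # kept so new data can be scored later

print("training MSE:", np.mean((y - prediction) ** 2))
```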
Thanks for the reply, but I still don't understand the meaning of 'weighting the observations' or 'weighting the trees'. I'd appreciate it if you could explain in layman's terms.
A weight is a positive number.
"Weighting observations" means a positive number is associated with each observation, and the algorithm utilizes that number somehow. Intuitively, observations with larger weights influence the algorithm more than observations with smaller weights. When Adaboost has created 10 trees in its boosting model, it will assign small weights to observations it is predicting well, so that Adaboost will create the 11th tree focusing on observations it hitherto predicted poorly.
"Weighting trees" means the predictions from the trees are multiplied by a weight: P(X) = W1 T1(X) + W2 T2(X),
where Ti(X) is the prediction of tree i for inputs X, and Wi are the weights. Sometimes the gradient boosting algorith is explained: first train the next tree (T2), and then find a single number (W2) that works best.
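As a sketch of that last step: for squared-error loss, the single best W2 for a freshly trained tree has a closed form (an ordinary least-squares fit of the residuals on the tree's output). The data and tree settings below are made up for illustration.

```python
# "Weighting trees": P(X) = W1*T1(X) + W2*T2(X), with W2 chosen
# to minimize the squared error of the combined prediction.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(scale=0.2, size=200)

t1 = DecisionTreeRegressor(max_depth=2).fit(X, y)
p1 = t1.predict(X)                      # first tree, W1 = 1 here

residual = y - p1
t2 = DecisionTreeRegressor(max_depth=2).fit(X, residual)
t2_out = t2.predict(X)

# W2 minimizing sum((residual - W2 * t2_out) ** 2):
w2 = (residual @ t2_out) / (t2_out @ t2_out)
prediction = p1 + w2 * t2_out
print("W2 =", w2, " training MSE:", np.mean((y - prediction) ** 2))
```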