This topic is solved and locked.
Babloo
Rhodochrosite | Level 12

What is a weight in Gradient Boosting? I'd appreciate it if someone could point me to a simple example of Gradient Boosting.

 

Thanks!


5 REPLIES
Babloo
Rhodochrosite | Level 12

Thanks for the document, but I still don't understand the term 'weight'.

PadraicGNeville
SAS Employee

"weight" could refer to weighting the observations or to weighting the trees in the model. 

 

A boosting model typically consists of a sum of decision trees trained sequentially.  Some algorithms describe the sum as weighted.

In Adaboost, the original boosting algorithm, observations are given weights before training a tree. The weights are different for each tree.

 

Gradient boosting algorithms do not use weights like this. Instead, the algorithm modifies the target values that are input to each tree.
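
As an illustration (not the EM implementation), here is a minimal Python sketch of gradient boosting with squared-error loss, where the modified targets are simply the residuals of the current model; the function names and settings are only for illustration:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Each new tree is trained on modified targets (the residuals of the
# current model), rather than on reweighted observations.
def gradient_boost(X, y, n_trees=10, learning_rate=0.1):
    prediction = np.full(len(y), y.mean())     # start from the mean of the target
    trees = []
    for _ in range(n_trees):
        residuals = y - prediction             # modified target values for the next tree
        tree = DecisionTreeRegressor(max_depth=2)
        tree.fit(X, residuals)
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return y.mean(), trees

def boosted_predict(intercept, trees, X, learning_rate=0.1):
    # The model is the starting value plus the scaled sum of the trees.
    out = np.full(len(X), intercept)
    for tree in trees:
        out += learning_rate * tree.predict(X)
    return out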

 

The EM Boosting node uses gradient boosting. On rare occasions, people assign weights to the observations at the start in order to match the proportions of groups in the training data with those in a future population to which the model will be applied.
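
As a made-up illustration of such start-up weights: suppose the training data is 50% events and 50% non-events, but the future population is expected to be 10% / 90%. Each observation can then be weighted by its group's population proportion divided by its training proportion:

# Hypothetical proportions, only to illustrate the weight calculation.
train_proportions  = {"event": 0.50, "non_event": 0.50}
future_proportions = {"event": 0.10, "non_event": 0.90}

weights = {group: future_proportions[group] / train_proportions[group]
           for group in train_proportions}
# {'event': 0.2, 'non_event': 1.8} -> events are down-weighted, non-events up-weighted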

Babloo
Rhodochrosite | Level 12

Thanks for the reply; however, I still don't understand the meaning of 'weighting the observations' or 'weighting the trees'. I'd appreciate it if you could explain it in layman's terms.

PadraicGNeville
SAS Employee

A weight is a positive number.

 

"Weighting observations" means a positive number is associated with each observation, and the algorithm utilizes that number somehow.  Intuitively, observations with larger weights influence the algorithm more than observations with smaller weights.  When  Adaboost has created 10 trees in its boosting model, it will assign small weights to observations it is predicting well, so that Adaboost will create the 11th tree focusing on observations it hitherto predicted poorly.

 

"Weighting trees" means the predictions from the trees are multiplied by a weight:  P(X) = W1 T1(X) + W2 T2(X),

where Ti(X) is the prediction of tree i for inputs X, and Wi are the weights.  Sometimes the gradient boosting algorith is explained: first train the next tree (T2), and then find a single number (W2) that works best.
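
Here is a tiny Python sketch of that last step, assuming squared-error loss so the best single weight has a closed form (an ordinary least-squares coefficient); the function and argument names are illustrative:

import numpy as np

# Choose W2 so that P(X) = P_old(X) + W2 * T2(X) best fits the target.
def best_tree_weight(y, current_prediction, new_tree_prediction):
    residual = y - current_prediction
    t = new_tree_prediction
    return np.dot(t, residual) / np.dot(t, t)  # argmin over W2 of sum((residual - W2*t)**2)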

