Gradient Boosting
Additive strategy?
- This class of algorithms is described as stagewise additive models
- because one new weak learner is added at a time, while the earlier models are frozen.
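The stagewise additive scheme can be written compactly; here h_m is the weak learner added at round m and ν is a learning rate (these symbols are illustrative notation, not from the map):

```latex
F_0(x) = \arg\min_{\gamma} \sum_{i=1}^{n} L(y_i, \gamma), \qquad
F_m(x) = F_{m-1}(x) + \nu\, h_m(x), \quad m = 1, \dots, M
```

Only h_m is fit at round m; the terms F_0 through F_{m-1} are kept frozen.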
About
- The idea of boosting and AdaBoost was recast as the gradient boosting machine (GBM) by Friedman
- GBM is a generic framework for both classification and regression, unlike AdaBoost, which was designed specifically for classification
What? / Idea
- The idea is to improve on weak learners by sequentially adding a model that fits the residuals of the earlier model
Intuition
- Behaves like gradient descent: minimise the loss of the model by moving in the direction of the negative gradient of the loss function.
- In regression with squared-error loss, the negative gradient is exactly the residual, so fitting further models on the residuals is what drives the loss down.
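A one-line check of the residual claim, assuming the conventional squared-error loss with the ½ factor:

```latex
L(y, F) = \tfrac{1}{2}\,(y - F)^2
\quad\Longrightarrow\quad
-\frac{\partial L}{\partial F} = y - F
```

so the negative gradient of the loss with respect to the current prediction is exactly the residual.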
High Level Steps
- Compute the residuals and fit a new model on them
- Add the new model to the existing ensemble and repeat
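The two steps above can be sketched end to end; this is a minimal NumPy illustration using squared-error loss and a hand-rolled regression stump as the weak learner (function names and default values are illustrative, not from the map):

```python
import numpy as np

def fit_stump(x, r):
    """Fit a one-split regression stump minimising squared error on residuals r."""
    best = (np.inf, None, r.mean(), r.mean())  # (sse, threshold, left value, right value)
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda z: np.where(z <= t, lv, rv)

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    base = y.mean()                              # initial constant model
    pred = np.full_like(y, base, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred                      # negative gradient of squared loss
        h = fit_stump(x, residual)               # fit the new weak learner on residuals
        pred += lr * h(x)                        # add it; earlier learners stay frozen
        stumps.append(h)
    return lambda z: base + lr * sum(h(z) for h in stumps)
```

Each round only fits the current residuals, so the ensemble improves additively without refitting earlier learners.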
Note
- GBM is a generic framework that can be applied to both classification and regression
- The loss function is not fixed; it can be chosen to suit the problem, as long as it is differentiable
3 Elements
- A Loss function to be minimised
- A weak learner to make predictions
- An additive model to add weak learners to minimise the loss function