Ensemble Learning
Introduction
- No single model will always perform best across all problems
- By combining the strengths of multiple base-learners, accuracy can be improved
Diverse Learners
Generating Diverse Learners
- Train different base-learners using different learning algorithms
- Combine parametric & non-parametric classifiers
- Train base-learners using the same learning algorithm with different hyperparameters
- Train the base-learners on different input representations of the same event
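A minimal sketch of one of the options above, combining a parametric classifier (nearest class mean) with a non-parametric one (1-nearest neighbour) trained on the same data; the toy 1-D dataset is made up for illustration:

```python
# Two base-learners trained with different learning algorithms on the
# same (toy, made-up) 1-D data: a parametric nearest-mean classifier
# and a non-parametric 1-nearest-neighbour classifier.
X = [1.0, 1.5, 2.0, 8.0, 8.5, 9.0]
y = [0, 0, 0, 1, 1, 1]

def nearest_mean_fit(X, y):
    # Parametric: summarise each class by its mean.
    means = {}
    for c in set(y):
        pts = [x for x, yc in zip(X, y) if yc == c]
        means[c] = sum(pts) / len(pts)
    return means

def nearest_mean_predict(means, x):
    # Predict the class whose mean is closest to x.
    return min(means, key=lambda c: abs(x - means[c]))

def one_nn_predict(X, y, x):
    # Non-parametric: predict the label of the single nearest point.
    i = min(range(len(X)), key=lambda j: abs(x - X[j]))
    return y[i]

means = nearest_mean_fit(X, y)
print(nearest_mean_predict(means, 3.0))  # parametric learner's vote
print(one_nn_predict(X, y, 3.0))         # non-parametric learner's vote
```

Because the two learners make different modelling assumptions, their errors tend to differ, which is exactly the diversity an ensemble needs.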
Model Combination Schemes
Multiexpert Combination
Voting
- Each learner is given a weight \(w_j\), where j = classifier
- Simple voting: all classifiers have equal weight, \(w_j = 1/L\)
- Majority Voting
  \( y_k = \sum_{j=1}^{L} w_j d_{jk} \), where j = classifier, k = class, and \(d_{jk}\) is the vote of classifier j for class k
  - Final output prediction is the class that receives more than half of the votes
  - If no class does, the ensemble method could not make a stable prediction for this instance
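The voting rule above can be sketched in pure Python; the vote matrix below is a made-up example where each row is one classifier's one-hot vote \(d_{jk}\):

```python
def weighted_vote(d, w):
    """Combine base-learner outputs d[j][k] (support of classifier j
    for class k) using weights w[j]: y_k = sum_j w_j * d_jk."""
    n_classes = len(d[0])
    return [sum(w[j] * d[j][k] for j in range(len(d)))
            for k in range(n_classes)]

def majority_vote(d):
    """Simple (majority) voting: equal weights w_j = 1/L.
    Returns the winning class only if it receives more than half of
    the total vote; otherwise None (no stable prediction)."""
    L = len(d)
    y = weighted_vote(d, [1.0 / L] * L)
    best = max(range(len(y)), key=y.__getitem__)
    return best if y[best] > 0.5 else None

# Three classifiers, two classes; two vote for class 0, one for class 1.
votes = [[1, 0], [1, 0], [0, 1]]
print(majority_vote(votes))  # class 0 wins with 2/3 > 1/2
```

With an even split such as `[[1, 0], [0, 1]]`, no class exceeds half the vote and `majority_vote` returns `None`, matching the "no stable prediction" case above.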