K-Nearest Neighbors
Background
- Why Lazy Learning?
  - No abstraction or generalization happens during training
  - Also known as instance-based learning
Definition
- Parameter k
  - The number of nearest dataset items considered for classification
Distance Function
- Manhattan distance
- Euclidean distance
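The two distance functions above can be sketched directly from their definitions; this is a minimal illustration, not a library implementation (the function names `manhattan` and `euclidean` are my own):

```python
import math

def manhattan(a, b):
    # L1 norm: sum of absolute coordinate differences
    return sum(abs(x - y) for x, y in zip(a, b))

def euclidean(a, b):
    # L2 norm: square root of the summed squared differences
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

print(manhattan((1, 2), (4, 6)))  # |1-4| + |2-6| = 7
print(euclidean((1, 2), (4, 6)))  # sqrt(9 + 16) = 5.0
```

Manhattan distance is often preferred for high-dimensional or grid-like data, while Euclidean distance is the usual default for continuous features.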
How to Choose k?
- When k is small
  - Bias is small
  - Variance is large
  - Undersmoothing
  - High complexity
- When k increases
  - Averages over more instances
  - Variance decreases
  - Bias increases
  - Oversmoothing
  - Low complexity
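The trade-off above can be seen in a minimal k-NN classifier sketch (the helper `knn_classify` and the toy dataset are my own, assuming Euclidean distance and majority vote):

```python
import math
from collections import Counter

def knn_classify(train, query, k):
    # train: list of (feature_vector, label) pairs
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Take the k training items nearest to the query point
    neighbors = sorted(train, key=lambda item: dist(item[0], query))[:k]
    # Majority vote over the labels of those k neighbors
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

train = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]

print(knn_classify(train, (1, 1), 1))  # small k: decided by the single nearest point -> "A"
print(knn_classify(train, (3, 3), 5))  # larger k: the vote averages over more instances -> "B"
```

With k=1 the prediction follows the single closest neighbor (low bias, high variance); with larger k the vote smooths over more of the training set, which is exactly the oversmoothing effect noted above.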
Disadvantages
- No training stage: all the work is done at test time, which makes prediction slow on large datasets