MACHINE LEARNING ML3
-
Classification
Probabilistic
Bayes
-
-
Naive-Bayes
Assumes the components of x_new are independent, so the likelihood can be computed as a product of univariate densities
Fitting D univariate distributions is easier than fitting one D-dimensional distribution, so naive Bayes is used for high values of D
Likelihood
-
$$ p(\mathbf{x}_{\text{new}} \mid T_k) = \prod_{d=1}^{D} \mathcal{N}(x_{\text{new},d};\ \mu_{kd}, \sigma_{kd}^2) $$
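The per-class likelihood above can be sketched directly: a product of D univariate Gaussian densities, one per dimension. A minimal sketch, assuming per-class means and variances have already been fitted (the function name and toy numbers here are illustrative, not from the source):

```python
import numpy as np

def gaussian_naive_bayes_likelihood(x_new, means, variances):
    """Likelihood of x_new under one class, assuming independent
    Gaussian components with per-dimension means[d], variances[d]."""
    densities = (1.0 / np.sqrt(2 * np.pi * variances)) * \
        np.exp(-(x_new - means) ** 2 / (2 * variances))
    return float(np.prod(densities))  # product over the D dimensions

# Toy 2-D example: likelihood of a point under one class's fitted stats
x = np.array([1.0, 2.0])
mu = np.array([0.0, 2.0])
var = np.array([1.0, 0.5])
print(gaussian_naive_bayes_likelihood(x, mu, var))  # ~0.1365
```

In practice the log of this product is used instead, since a product of many small densities underflows quickly.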
Logistic Regression
Take the regular output f(x_new; w) and pass it through a squashing function h() that maps it into (0, 1), so it can be interpreted as a probability
-
-
Output a confidence level in the classification, not just a label
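The standard choice of squashing function h() here is the sigmoid, h(z) = 1 / (1 + e^(-z)). A minimal sketch of how it turns a raw score into a confidence:

```python
import math

def sigmoid(z):
    """Squash a real-valued score into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# A large positive score maps near 1 (confident class 1),
# zero maps to exactly 0.5 (undecided), a large negative score maps near 0.
print(sigmoid(4.0), sigmoid(0.0), sigmoid(-4.0))
```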
Non-probabilistic
K Nearest Neighbours
-
-
Process
Choose K
-
Larger K - smoother class boundaries, but risks extinguishing classes whose number of training examples is very small
Use cross-validation and choose the K with the fewest misclassifications
For a test object, find the K nearest points in the TRAINING set to it
-
-
-
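The process above (pick K, find the K nearest training points, vote) can be sketched as follows; the function name and toy data are illustrative, and Euclidean distance is assumed:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_test, K):
    """Label of x_test = majority vote among its K nearest TRAINING points."""
    dists = np.linalg.norm(X_train - x_test, axis=1)  # Euclidean distances
    nearest = np.argsort(dists)[:K]                   # indices of K closest
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Two well-separated toy clusters
X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, np.array([4.5, 5.0]), K=3))  # -> 1
```

Note K is usually chosen odd (for two classes) to avoid tied votes.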
Performance
-
-
-
Confusion Matrix
Table with columns the actual class and rows the predicted class; in the binary case the cells hold the counts TP, TN, FP, FN
-
-
-
-
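The four confusion-matrix cells can be counted directly from the actual and predicted labels. A minimal sketch for the binary case (function name is illustrative; class 1 is taken as positive):

```python
import numpy as np

def confusion_counts(actual, predicted):
    """Binary confusion-matrix cells (TP, TN, FP, FN), positive class = 1."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    tp = int(np.sum((predicted == 1) & (actual == 1)))  # correctly predicted 1
    tn = int(np.sum((predicted == 0) & (actual == 0)))  # correctly predicted 0
    fp = int(np.sum((predicted == 1) & (actual == 0)))  # predicted 1, was 0
    fn = int(np.sum((predicted == 0) & (actual == 1)))  # predicted 0, was 1
    return tp, tn, fp, fn

print(confusion_counts([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))  # -> (2, 1, 1, 1)
```

Metrics such as accuracy, precision, and recall are all ratios of these four counts.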
Modelling Errors
-
RANDOM VARIABLES
-
-
Joint Probabilities
-
-
If X and Y are independent, then P(X, Y) = P(X) * P(Y)
If X and Y are dependent, then we must use CONDITIONAL PROBABILITIES: P(X, Y) = P(Y | X) * P(X)
-
-
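The two cases can be checked numerically side by side; the probability values below are hypothetical, chosen only to show that the two formulas give different joint probabilities when X and Y are dependent:

```python
# Hypothetical probabilities (not from any dataset)
p_x = 0.3          # P(X)
p_y = 0.5          # P(Y)
p_y_given_x = 0.8  # P(Y | X) != P(Y), so X and Y are dependent

joint_if_independent = p_x * p_y        # P(X, Y) = P(X) * P(Y)
joint_general = p_x * p_y_given_x       # P(X, Y) = P(X) * P(Y | X)
print(joint_if_independent, joint_general)  # 0.15 vs 0.24
```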
Clustering
K-means
-
-
-
No closed-form solution -> need an iterative, nondeterministic algorithm
In each iteration, update the centroids/means and then re-assign points to clusters, repeating until you detect convergence
-
-
-
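The iterate-until-convergence loop above can be sketched as Lloyd's algorithm. For reproducibility this sketch initializes the centroids from the first K points (real implementations pick random starts and restart several times, which is where the nondeterminism comes in); the toy data is illustrative and assumes no cluster goes empty:

```python
import numpy as np

def kmeans(X, K, n_iters=100):
    """Lloyd's algorithm: alternate point assignment and centroid update."""
    centroids = X[:K].copy()  # deterministic init for this sketch
    for _ in range(n_iters):
        # Assign each point to its nearest centroid
        dists = np.linalg.norm(X[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned points
        new_centroids = np.array([X[labels == k].mean(axis=0)
                                  for k in range(K)])
        if np.allclose(new_centroids, centroids):  # convergence detected
            break
        centroids = new_centroids
    return labels, centroids

X = np.array([[0., 0.], [0., 1.], [1., 0.],
              [10., 10.], [10., 11.], [11., 10.]])
labels, centroids = kmeans(X, K=2)
print(labels)  # -> [0 0 0 1 1 1]
```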