ML PYTHON
Nonlinear Models
•Know how k-nearest-neighbors works (you should be able to perform it on paper if given the distances between instances), and know that it is a nonlinear method.
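A minimal sketch of the k-nearest-neighbors vote, assuming you are given precomputed distances from a query point to the labeled training instances (the same setup as the on-paper exercise); the distances and labels below are made up for illustration:

import numpy as np

# Hypothetical distances from one query point to five labeled training instances.
distances = np.array([2.1, 0.5, 3.7, 0.9, 1.4])
labels = np.array([0, 1, 0, 1, 0])

k = 3
# Indices of the k closest training instances.
nearest = np.argsort(distances)[:k]
# Majority vote among the k nearest labels.
votes = np.bincount(labels[nearest])
prediction = np.argmax(votes)
print(prediction)  # 1 (two of the three nearest neighbors have label 1)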
•Understand how to reduce overfitting in decision trees (limiting the maximum depth and limiting the minimum number of instances at a node).
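A sketch of those two pruning knobs, assuming scikit-learn as the library (max_depth and min_samples_split are scikit-learn's parameter names for the depth limit and the minimum number of instances a node needs before it can be split):

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# An unconstrained tree can keep splitting until every leaf is pure (prone to overfitting).
full_tree = DecisionTreeClassifier(random_state=0).fit(X, y)

# Limiting depth and requiring a minimum number of instances at a node
# both restrict how finely the tree can partition the training data.
pruned_tree = DecisionTreeClassifier(
    max_depth=3,            # limit the maximum depth
    min_samples_split=10,   # limit the minimum number of instances at a node to split
    random_state=0,
).fit(X, y)

print(full_tree.get_depth(), pruned_tree.get_depth())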
Training
•You should understand how gradient descent works. You will be provided with the update equation for gradient descent.
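A minimal sketch of the gradient-descent update w ← w − η·∇L(w); the loss and learning rate here are illustrative assumptions, not the ones from the course:

import numpy as np

def loss_gradient(w):
    # Gradient of the illustrative loss L(w) = ||w||^2, i.e. dL/dw = 2w.
    return 2 * w

w = np.array([3.0, -2.0])   # arbitrary starting weights
learning_rate = 0.1

for step in range(100):
    # Update equation: move opposite to the gradient, scaled by the learning rate.
    w = w - learning_rate * loss_gradient(w)

print(w)  # approaches the minimizer [0, 0]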
•Understand the concept of regularization, and understand L2 vs L1 (don’t need to remember the exact formulas, but understand what they represent).
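A rough sketch of what the two penalties represent (exact formulas not required, per the note above): L2 adds the sum of squared weights to the loss and shrinks all weights smoothly, while L1 adds the sum of absolute values and tends to push some weights to exactly zero. The weights and regularization strength below are hypothetical:

import numpy as np

w = np.array([0.5, -2.0, 0.0, 3.0])
lam = 0.1  # regularization strength (hypothetical value)

l2_penalty = lam * np.sum(w ** 2)      # L2: penalizes large weights quadratically
l1_penalty = lam * np.sum(np.abs(w))   # L1: encourages sparse (exactly-zero) weights

# Either penalty is added to the training loss:
# total_loss = data_loss + l2_penalty   (or + l1_penalty)
print(l2_penalty, l1_penalty)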
•Bias/variance tradeoff: know that high variance can lead to overfitting and high bias can lead to underfitting, and increasing one will generally decrease the other.
•Understand “one-vs-rest” (aka “one-vs-all”) and “all-pairs” (aka “one-vs-one”) for using binary classifiers for multiclass classification.
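A sketch of both reductions, assuming scikit-learn (OneVsRestClassifier and OneVsOneClassifier are its wrappers for the two strategies):

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier

X, y = load_iris(return_X_y=True)  # 3 classes
base = LogisticRegression(max_iter=1000)

# One-vs-rest: one binary classifier per class.
ovr = OneVsRestClassifier(base).fit(X, y)

# All-pairs (one-vs-one): one binary classifier per pair of classes,
# i.e. 3 * (3 - 1) / 2 = 3 classifiers for 3 classes.
ovo = OneVsOneClassifier(base).fit(X, y)

print(len(ovr.estimators_), len(ovo.estimators_))  # 3 and 3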
•You should understand what a derivative of a function means, and know the definition of a gradient (the vector of derivatives with respect to each variable), though you will not need to calculate a derivative.
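A small sketch of the gradient as the vector of partial derivatives, checked here with a finite-difference approximation; the function f is an arbitrary example chosen for illustration:

import numpy as np

def f(w):
    # Arbitrary example function: f(w) = w0^2 + 3*w1
    return w[0] ** 2 + 3 * w[1]

def numerical_gradient(f, w, eps=1e-6):
    # Approximate each partial derivative with a central finite difference.
    grad = np.zeros_like(w)
    for i in range(len(w)):
        step = np.zeros_like(w)
        step[i] = eps
        grad[i] = (f(w + step) - f(w - step)) / (2 * eps)
    return grad

w = np.array([2.0, 5.0])
print(numerical_gradient(f, w))  # close to [4.0, 3.0], i.e. (df/dw0, df/dw1)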
Linear Models
•You should understand the “score” wᵀx + b, and understand the definition of a dot product.
Score
z = wᵀx + b
The continuous output (the “score”) is used as input to the activation function to predict the discrete classes (1, 0).
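A quick numerical sketch of the score z = wᵀx + b as a dot product plus a bias; the weight vector, input, and bias below are hypothetical:

import numpy as np

w = np.array([2.0, -1.0, 0.5])   # weights (hypothetical)
x = np.array([1.0, 3.0, 4.0])    # one input instance (hypothetical)
b = 0.5                          # bias

# Dot product: sum of elementwise products, 2*1 + (-1)*3 + 0.5*4 = 1.0
z = np.dot(w, x) + b
print(z)  # 1.5, a continuous score fed to the activation to get a class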
•You should understand the prediction rule for perceptron, in that the positive class is predicted if the score is ≥0 and the negative class is predicted otherwise.
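The perceptron prediction rule in a minimal sketch, predicting the positive class when the score is ≥ 0 and the negative class otherwise; the weights, bias, and inputs are hypothetical:

import numpy as np

def perceptron_predict(w, b, x):
    # Predict +1 if the score w·x + b is >= 0, otherwise -1.
    z = np.dot(w, x) + b
    return 1 if z >= 0 else -1

w = np.array([2.0, -1.0])
b = -0.5
print(perceptron_predict(w, b, np.array([1.0, 1.0])))   # score 0.5  -> +1
print(perceptron_predict(w, b, np.array([0.0, 1.0])))   # score -1.5 -> -1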
•You should understand that support vector machines try to increase the size of the margin. Understand what a margin refers to visually, and understand why it is related to regularization.
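A sketch tying the margin to regularization, assuming scikit-learn's LinearSVC: the geometric margin width is 2 / ||w||, so the regularization pressure to keep ||w|| small is the same thing as pushing for a wider margin. The parameter C controls that trade-off, with smaller C meaning stronger regularization and a wider margin:

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import LinearSVC

# Two well-separated blobs so the margin is easy to reason about.
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

for C in (0.01, 100.0):
    svm = LinearSVC(C=C, max_iter=10000).fit(X, y)
    w = svm.coef_[0]
    # Smaller C -> stronger regularization -> smaller ||w|| -> wider margin.
    print(C, 2 / np.linalg.norm(w))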