ANN - Coggle Diagram
ANN
Activation Function
Sigmoid (Logistic Function)
Maps input values from (-inf, +inf) to an output in the [0, 1] interval. An activation function is needed to introduce non-linearity into the model.
Binary classification
Multilabel Classification
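The sigmoid node above can be sketched in a few lines; this is my own NumPy illustration (function name and NumPy usage are assumptions, not from the diagram):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input from (-inf, +inf) into (0, 1),
    # which is why it is used for binary and multi-label outputs
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))  # 0.5, the midpoint of the output range
```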
Softmax
Multiclass Classification
Normalizes the output into a probability distribution: every element lies in the interval [0, 1] and they sum to 1.
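A minimal NumPy sketch of softmax, illustrating the normalization described above (the max-subtraction trick is a standard stability detail I have added, not from the diagram):

```python
import numpy as np

def softmax(z):
    # Subtracting the max is a numerical-stability trick;
    # it does not change the resulting distribution
    e = np.exp(z - np.max(z))
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
# probs is a valid probability distribution: entries in [0, 1], total 1
```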
tanh
Output is centered on 0; learning is faster compared to sigmoid.
ReLU: the derivative is 0 for x < 0 and 1 for x > 0.
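The tanh and ReLU notes above can be sketched as follows (my own NumPy illustration; the convention of using 0 for the ReLU derivative at x = 0 is an assumption):

```python
import numpy as np

def relu(x):
    # max(0, x): passes positives through, zeroes out negatives
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative is 0 for x < 0 and 1 for x > 0
    # (undefined at x = 0; 0 is a common convention there)
    return np.where(np.asarray(x) > 0, 1.0, 0.0)

# tanh output is zero-centered, unlike sigmoid's (0, 1) range
print(np.tanh(0.0))  # 0.0
```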
Optimizers
Adam
SGD
Adagrad
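The SGD and Adagrad update rules listed above can be sketched in plain NumPy (illustrative code under my own naming; Adam's bias-corrected moment updates are omitted for brevity):

```python
import numpy as np

def sgd_step(w, grad, lr=0.1):
    # Vanilla SGD: step against the gradient at a fixed learning rate
    return w - lr * grad

def adagrad_step(w, grad, cache, lr=0.1, eps=1e-8):
    # Adagrad: per-parameter step sizes that shrink as squared
    # gradients accumulate in `cache`
    cache = cache + grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache
```

For example, minimizing f(w) = w^2 (gradient 2w) by repeated `sgd_step` calls drives w toward 0.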
Loss Function
Probabilistic Losses
Binary Crossentropy
Categorical Crossentropy
Regression Losses
Mean Squared Error
Mean Absolute Error
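The losses listed above can be written out directly; a NumPy sketch (function names and the clipping epsilon are my own choices, not from the diagram):

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Clip predictions away from 0 and 1 so log() stays finite
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred)
                    + (1.0 - y_true) * np.log(1.0 - y_pred))

def mean_squared_error(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def mean_absolute_error(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))
```

Crossentropy penalizes confident wrong predictions heavily, which is why it pairs with sigmoid/softmax outputs, while MSE/MAE fit regression targets.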
Overfitting and Regularization
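One common regularization recipe against overfitting is an L2 weight penalty added to the training loss; a minimal NumPy sketch (the function names and the lambda value are illustrative assumptions):

```python
import numpy as np

def l2_penalty(weights, lam=0.01):
    # Penalizes large weights; adding this to the data loss makes
    # the optimizer prefer simpler, smaller-weight models
    return lam * np.sum(weights ** 2)

def regularized_loss(data_loss, weights, lam=0.01):
    return data_loss + l2_penalty(weights, lam)
```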
TensorFlow Playground:
https://bit.ly/3dfBS8F