Deep Learning, Data Preprocessing
Deep Learning
Activation Function
ReLU
a = max(0, z); its gradient stays at 1 for z > 0, so it does not saturate like sigmoid or tanh
Sigmoid
a = 1 / (1 + e^(-z)); outputs in (0, 1), used for binary 0/1 predictions
tanh
tanh often works better than sigmoid in hidden layers because its output range (-1, 1) is zero-centered (see the sketch below)
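A minimal NumPy sketch of these three activations (the function names are illustrative, not from the original map):

```python
import numpy as np

def relu(z):
    # ReLU: a = max(0, z); gradient is 1 for z > 0, so it does not saturate there
    return np.maximum(0, z)

def sigmoid(z):
    # Sigmoid: squashes z into (0, 1), suited to binary 0/1 predictions
    return 1 / (1 + np.exp(-z))

def tanh(z):
    # tanh: squashes z into (-1, 1); zero-centered, unlike sigmoid
    return np.tanh(z)
```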
Logistic Regression
Sigmoid helper function
Forward and Backward Propagation
Backward Propagation:
Output Layer -> [Gradient Calculation] -> Hidden Layer -> [Gradient Calculation] -> Input Layer
Forward Propagation:
Input Layer -> [Weights and Biases] -> Hidden Layer -> [Weights and Biases] -> Output Layer
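A minimal sketch of both passes for logistic regression, combining the sigmoid helper with the gradient calculation (the `propagate` name and the shapes — `w` of shape (n, 1), `X` of shape (n, m), `Y` of shape (1, m) — are assumptions):

```python
import numpy as np

def propagate(w, b, X, Y):
    # Forward propagation: input -> weights and biases -> sigmoid output
    m = X.shape[1]                        # number of examples
    A = 1 / (1 + np.exp(-(w.T @ X + b)))  # sigmoid activation, shape (1, m)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))  # cross-entropy

    # Backward propagation: gradients computed from the output back to the input
    dZ = A - Y
    dw = (X @ dZ.T) / m   # gradient of the cost w.r.t. w
    db = np.sum(dZ) / m   # gradient of the cost w.r.t. b
    return dw, db, cost
```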
Optimize
Predict
Model
Accuracy (see the sketch below)
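A sketch of how these steps fit together, reusing the `propagate` sketch above (the function names, iteration count, and learning rate are all assumptions):

```python
import numpy as np

def optimize(w, b, X, Y, num_iterations=1000, learning_rate=0.01):
    # Gradient descent: repeat propagation and step against the gradient
    for _ in range(num_iterations):
        dw, db, _ = propagate(w, b, X, Y)  # propagate() from the sketch above
        w = w - learning_rate * dw
        b = b - learning_rate * db
    return w, b

def predict(w, b, X):
    # Model prediction: threshold the sigmoid output at 0.5 for 0/1 labels
    A = 1 / (1 + np.exp(-(w.T @ X + b)))
    return (A > 0.5).astype(int)

# Accuracy: the fraction of predictions that match the true labels
# accuracy = np.mean(predict(w, b, X) == Y)
```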
Data Preprocessing
Figure Out Dimensions
Reshape
Normalize Data (see the sketch below)
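A typical preprocessing sketch covering all three steps (the shapes and the 255 scale factor are assumptions for a batch of 8-bit RGB images):

```python
import numpy as np

# Placeholder batch of m images, shape (m, height, width, channels)
X = np.random.randint(0, 256, size=(209, 64, 64, 3))

# Figure out dimensions
m = X.shape[0]
print("data shape:", X.shape, "examples:", m)

# Reshape: flatten each image into one column -> (height*width*channels, m)
X_flat = X.reshape(m, -1).T

# Normalize: scale 8-bit pixel values into [0, 1]
X_norm = X_flat / 255.0
```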