PERFORM EARLY STOPPING
•Stop training when the validation loss starts to increase while the training loss keeps decreasing; this divergence is the classic sign of overfitting.
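As a concrete illustration, here is a minimal early-stopping sketch using the Keras EarlyStopping callback; the random dataset, layer sizes, and patience value are placeholder assumptions, not prescriptions.

```python
# A minimal early-stopping sketch in Keras; the dataset here is
# synthetic random data purely for illustration.
import numpy as np
import tensorflow as tf

X = np.random.rand(1000, 20).astype("float32")   # hypothetical features
y = np.random.randint(0, 2, size=(1000,))        # hypothetical binary labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Stop when validation loss has not improved for 5 consecutive epochs,
# and roll back to the best weights seen so far.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True
)
model.fit(X, y, validation_split=0.2, epochs=100, callbacks=[early_stop])
```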
APPLY REGULARIZATION
•Regularization (e.g., L1 or L2 weight penalties) adds a cost for model complexity to the loss, which discourages extreme weights and improves generalization.
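A minimal sketch of L2 (weight-decay) regularization in Keras follows; the penalty strength of 0.01 and the layer sizes are illustrative assumptions.

```python
# A minimal L2-regularization sketch in Keras.
import tensorflow as tf

model = tf.keras.Sequential([
    # The L2 penalty adds 0.01 * sum(w**2) for this layer's weights
    # to the training loss, discouraging large weights.
    tf.keras.layers.Dense(
        64, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l2(0.01),
    ),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
```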
ADD MORE DATA
•Increasing the size of the training set exposes the model to more variation, which usually helps it generalize; when collecting new data is expensive, data augmentation is a common substitute.
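One practical way to enlarge a dataset without collecting new samples is augmentation, a related technique to gathering more data. The sketch below assumes an image task and uses Keras preprocessing layers; the specific transforms and factors are illustrative.

```python
# A data-augmentation sketch: cheap synthetic variations of existing
# samples effectively enlarge the training set.
import tensorflow as tf

augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),   # mirror images left/right
    tf.keras.layers.RandomRotation(0.1),        # rotate up to +/-10% of a full turn
    tf.keras.layers.RandomZoom(0.1),            # zoom in/out by up to 10%
])

# Applied on the fly during training, e.g. as the first layers of a model,
# or via dataset.map(lambda x, y: (augment(x, training=True), y)).
```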
PERFORM FEATURE SELECTION
•Dropping irrelevant or redundant features reduces model complexity and can improve generalization.
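Here is a sketch of model-based feature selection with scikit-learn; the synthetic dataset and the median-importance threshold are illustrative assumptions.

```python
# A feature-selection sketch with scikit-learn on synthetic data;
# make_classification is used only for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel

X, y = make_classification(n_samples=500, n_features=30, n_informative=5,
                           random_state=0)

# Keep only features whose importance exceeds the median importance.
selector = SelectFromModel(RandomForestClassifier(random_state=0),
                           threshold="median")
X_reduced = selector.fit_transform(X, y)
print(X_reduced.shape)  # fewer columns than the original 30
```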
USE BOOSTING AND BAGGING (ENSEMBLE LEARNING)
•Combining the votes of many different models via bagging or boosting reduces variance and typically improves generalization.
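The sketch below contrasts a bagging ensemble with a boosting ensemble in scikit-learn; the synthetic dataset and estimator counts are illustrative.

```python
# A bagging-vs-boosting sketch with scikit-learn; the synthetic
# dataset is for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

# Bagging: many trees trained on bootstrap samples, combined by voting.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                            random_state=0)
# Boosting: trees trained sequentially, each correcting its predecessors.
boosting = GradientBoostingClassifier(n_estimators=100, random_state=0)

for name, clf in [("bagging", bagging), ("boosting", boosting)]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```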
ADD MORE NOISE
•Adding noise to the inputs during training acts as a regularizer and can help the model generalize.
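A minimal sketch of input-noise injection with the Keras GaussianNoise layer, which perturbs inputs during training only; the noise level of 0.1 is an illustrative assumption.

```python
# An input-noise sketch in Keras: GaussianNoise adds zero-mean Gaussian
# noise (stddev 0.1) to inputs at training time and is inactive at inference.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.GaussianNoise(0.1),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
```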
USE THE DROPOUT TECHNIQUE
•In artificial neural network training, randomly dropping some neurons with the dropout technique improves the network's ability to generalize.
•Neurons develop co-dependencies on one another during training; dropout counters this by randomly deactivating neurons, which makes it an effective regularization technique for reducing overfitting in neural networks, as sketched below.
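A minimal dropout sketch in Keras; the layer widths and dropout rates are illustrative assumptions. Keras disables dropout automatically at inference time.

```python
# A dropout sketch in Keras: each Dropout layer randomly zeroes a
# fraction of the previous layer's activations during training,
# preventing neurons from co-adapting.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),   # drop 50% of activations each step
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.3),   # a lighter rate deeper in the network
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
```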
REMOVE ANN LAYERS
•Another technique is to reduce the complexity of the ANN by removing layers rather than adding them. This also helps prevent an overly complex model from being created.
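A sketch of this idea, assuming a simple fully connected Keras network: the shallower model below would replace the deeper one when the deeper one overfits.

```python
# A capacity-reduction sketch: remove hidden layers when a deeper
# network overfits the training data.
import tensorflow as tf

# Hypothetical original: three hidden layers.
deep_model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Simplified alternative: one hidden layer, far fewer parameters.
shallow_model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
```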