Dropout: A simple way to prevent neural networks from overfitting.
Published in JMLR, 2014
Recommended citation: Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov. "Dropout: A Simple Way to Prevent Neural Networks from Overfitting." JMLR, 2014. https://jmlr.org/papers/v15/srivastava14a.html
Randomly dropping units (along with their connections) from the network during training prevents units from co-adapting too much, which significantly reduces overfitting.
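The idea can be sketched in a few lines. The snippet below is a minimal NumPy illustration of "inverted" dropout, a common variant of the paper's technique: each unit is zeroed with probability `p` during training and survivors are scaled by `1/(1-p)`, so the expected activation is unchanged and no rescaling is needed at test time. The function name and signature are illustrative, not from the paper.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each element of x with
    probability p and scale the survivors by 1/(1 - p). At test time,
    return x unchanged."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p   # True where the unit is kept
    return x * mask / (1.0 - p)      # rescale so E[output] == x
```

At test time the full network is used with no dropout, which approximates averaging the predictions of the exponentially many "thinned" networks sampled during training.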