Dropout: A simple way to prevent neural networks from overfitting.

Published in JMLR, 2014

Recommended citation: Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov. "Dropout: A Simple Way to Prevent Neural Networks from Overfitting." JMLR, 2014. https://jmlr.org/papers/v15/srivastava14a.html

Randomly dropping units (along with their connections) during training prevents units from co-adapting too much, which reduces overfitting.
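The idea can be sketched as a minimal "inverted" dropout forward pass in NumPy. Note this is an illustrative variant, not the paper's exact formulation: the paper drops units with the same probability but rescales the weights at test time, whereas the inverted form below folds the rescaling into training so inference needs no change. Function and parameter names are hypothetical.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability
    p_drop and rescale survivors by 1/(1 - p_drop) so the expected
    activation matches inference, where nothing is dropped."""
    if not training or p_drop == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    # Boolean keep-mask, rescaled so E[output] == input.
    mask = (rng.random(x.shape) >= p_drop) / (1.0 - p_drop)
    return x * mask
```

At inference time the function is a no-op, which is the practical payoff of the inverted variant: the trained network can be used directly, without rescaling its weights by the keep probability.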
