Joshua Shunk

United States of America

Society for Science & the Public (ISEF)

Neuron-Specific Dropout: A Deterministic Regularization Technique to Prevent Neural Networks from Overfitting & Reduce Dependence on Large Training Samples

To develop complex relationships between their inputs and outputs, deep neural networks train and adjust a large number of parameters. Making these networks perform at high accuracy requires vast amounts of data, but sometimes the quantity of data needed is not available or obtainable for training. Neuron-specific dropout (NSDropout) is a tool to address this problem. NSDropout examines both the training pass and the validation pass of a layer in a model; a pass occurs when a single image, or group of images, goes through all the layers in the network. By comparing the average values produced by each neuron during the training and validation passes, the network can identify the neurons that are causing the model to overfit. The NSDropout layer can then predict which features, or noise, the model is relying on during training that are not present in the validation samples, and drop the corresponding neurons. Neuron-specific dropout has been shown to achieve similar, if not better, testing accuracy with far less data than traditional methods, including dropout and other regularization techniques. Neuron-specific dropout reduces the chance of a network overfitting and reduces the need for large training samples on supervised learning tasks in image recognition.
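The train/validation comparison can be illustrated with a short sketch. The Python/NumPy snippet below is a minimal, hypothetical implementation of the idea described above: it takes one layer's activations from a training batch and a validation batch, compares the per-neuron averages, and masks out neurons whose averages disagree by more than a chosen threshold. The function name ns_dropout_mask, the threshold parameter, and the exact masking rule are illustrative assumptions, not the paper's precise algorithm.

import numpy as np

def ns_dropout_mask(train_acts, val_acts, threshold=0.1):
    # train_acts, val_acts: arrays of shape (batch, neurons) holding one
    # layer's activations from a training pass and a validation pass.
    train_mean = train_acts.mean(axis=0)  # per-neuron mean, training batch
    val_mean = val_acts.mean(axis=0)      # per-neuron mean, validation batch
    # Keep neurons whose average behavior is consistent across the two
    # passes; the rest are assumed to be fitting noise and are dropped.
    keep = np.abs(train_mean - val_mean) <= threshold
    return keep.astype(train_acts.dtype)

# Usage: zero out the flagged neurons before the next layer sees them.
rng = np.random.default_rng(0)
train_acts = rng.normal(size=(64, 10))
val_acts = rng.normal(size=(32, 10))
mask = ns_dropout_mask(train_acts, val_acts)
masked = train_acts * mask  # dropped neurons output zero for every sample

Unlike standard dropout, which zeroes units at random, the mask here is deterministic: the same neurons are dropped for every sample in the batch, which matches the "deterministic regularization" framing in the title.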