Weight matrix-based regularization techniques for deep learning

Weight Matrix Modification - a regularization approach

General description:

Author: Patrik Reizinger

The project contains the source files (without the datasets) which implement WMM (Weight Matrix Modification), a weight matrix-based regularization technique for Deep Neural Networks. In the following, the proposed methods are briefly introduced, including the evaluation framework.

Weight shuffling

Weight shuffling is based on the assumption that the coefficients of a weight matrix are locally correlated. Based on this, I hypothesize that shuffling the weights within a rectangular window - which, under the aforementioned assumption, is a way of adding correlated noise to the weights - may help reduce overfitting. A sketch of the idea is shown below.
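The following minimal NumPy sketch illustrates the idea: the entries inside a rectangular window of a weight matrix are permuted in place. The function name, window parameters, and the use of NumPy are illustrative assumptions and are not taken from the project's source files.

```python
import numpy as np

def shuffle_window(weights, row, col, height, width, rng=None):
    """Shuffle the entries of `weights` inside a rectangular window.

    The window is given by its top-left corner (row, col) and its
    height/width. Illustrative sketch; names and parameters are assumptions.
    """
    if rng is None:
        rng = np.random.default_rng()
    window = weights[row:row + height, col:col + width]
    flat = window.flatten()
    rng.shuffle(flat)  # permute the (assumed correlated) neighbouring weights
    weights[row:row + height, col:col + width] = flat.reshape(window.shape)
    return weights

# Usage: shuffle a 3x3 window of a layer's weight matrix during training.
W = np.random.randn(64, 128)
W = shuffle_window(W, row=10, col=20, height=3, width=3)
```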

Weight reinitialization

Weight reinitialization aims to reduce overfitting by partially reinitializing the weight matrix; in the case of a non-representative training set, this may reduce the over- or underestimation of the significance of specific input data. A sketch follows below.
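A minimal sketch of partial reinitialization, assuming that the entries to reset are selected uniformly at random and redrawn from a Gaussian; the selection rule, fraction, and distribution are assumptions and may differ from the project's implementation.

```python
import numpy as np

def reinit_fraction(weights, fraction=0.1, scale=None, rng=None):
    """Reinitialize a random fraction of the entries of `weights`.

    Illustrative sketch; the 10% default and the Gaussian redraw are
    assumptions, not the project's actual scheme.
    """
    if rng is None:
        rng = np.random.default_rng()
    if scale is None:
        scale = weights.std()              # match the current weight scale
    mask = rng.random(weights.shape) < fraction  # entries to reset
    weights[mask] = rng.normal(0.0, scale, size=mask.sum())
    return weights

# Usage: reset roughly 10% of the entries between training epochs.
W = np.random.randn(64, 128)
W = reinit_fraction(W, fraction=0.1)
```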

License: GNU LESSER GENERAL PUBLIC LICENSE (Version 3, 29 June 2007)
