@dswah
Last active July 24, 2023 16:00
Orthonormal Weight Constraint (tensorflow / keras)
import tensorflow as tf


class Orthonormal(tf.keras.constraints.Constraint):
    """Approximate orthonormal weight constraint.

    Constrains the weights incident to each hidden unit
    to be approximately orthonormal.

    # Arguments
        beta: the strength of the constraint

    # References
        https://arxiv.org/pdf/1710.04087.pdf
    """

    def __init__(self, beta=0.01):
        self.beta = beta

    def __call__(self, w):
        # Gram matrix w @ w.T; this equals the identity exactly when
        # the rows of w are orthonormal.
        gram = tf.linalg.matmul(w, w, transpose_b=True)
        # One step of the update w <- (1 + beta) * w - beta * (w @ w.T) @ w,
        # which nudges w toward the nearest orthonormal matrix.
        return (1 + self.beta) * w - self.beta * tf.linalg.matmul(gram, w)

    def get_config(self):
        return {'beta': self.beta}
dswah commented Jun 8, 2020

Approximate orthonormal weight constraint.

During the constraint phase of each Keras update loop, this constraint update nudges the weight matrix toward orthonormality.
beta controls the strength of the orthonormality constraint; the tradeoff is that stronger constraints can reduce the accuracy of the model.

To apply to a Dense layer, for example, do:

tf.keras.layers.Dense(100, kernel_constraint=Orthonormal(), use_bias=False)

An orthonormal matrix W satisfies W.dot(W.T) == I.
Orthonormality is stricter than orthogonality, since the rows of W are also required to have unit norm.
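The update rule can be checked without TensorFlow. Below is a minimal NumPy sketch (the function name `orthonormal_step` and the matrix shape are illustrative choices, not part of the gist) showing that repeatedly applying w <- (1 + beta) * w - beta * (w @ w.T) @ w drives w @ w.T toward the identity:

```python
import numpy as np

def orthonormal_step(w, beta=0.01):
    # Same update as the Keras constraint above, in plain NumPy:
    # w <- (1 + beta) * w - beta * (w @ w.T) @ w
    return (1 + beta) * w - beta * (w @ w.T) @ w

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(5, 20))  # 5 hidden units, 20 inputs

# Frobenius distance of w @ w.T from the identity, before and after
# many applications of the constraint update.
before = np.linalg.norm(w @ w.T - np.eye(5))
for _ in range(500):
    w = orthonormal_step(w)
after = np.linalg.norm(w @ w.T - np.eye(5))

print(before, after)  # deviation from identity shrinks toward zero
```

Note that a single application only moves w slightly toward orthonormality, which is why the constraint is "approximate": in training it is applied once per update step, interleaved with gradient updates.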
