@yoshihikoueno
Created July 17, 2020 08:43
Code snippet to fix ``WARNING:tensorflow:Unresolved object in checkpoint: (root).optimizer.beta_1`` in TensorFlow 2.2
# workaround for the optimizer checkpoint-tracking bug in TensorFlow 2.2
import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(
    learning_rate=tf.Variable(0.001),
    beta_1=tf.Variable(0.9),
    beta_2=tf.Variable(0.999),
    epsilon=tf.Variable(1e-7),
)
optimizer.iterations  # this access invokes optimizer._iterations and creates the optimizer.iter attribute
optimizer.decay = tf.Variable(0.0)  # Adam.__init__ assumes ``decay`` is a float, so it must be replaced with a tf.Variable **after** __init__
///
The root problem is that Adam.__init__ initializes its hyperparameters as Python float objects, which TensorFlow does not track.
To load the optimizer's weights without actually calling the optimizer, these values must be tracked so that they appear in Adam._checkpoint_dependencies.
Converting each Python float to a tf.Variable makes it tracked, because tf.Variable is a subclass of ``trackable.Trackable``.
///
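For reference, a minimal save/restore round trip using this workaround might look like the sketch below (assumes TensorFlow 2.x; the `tf.train.Checkpoint` wrapping and the temp-directory checkpoint path are illustrative, not part of the original gist):

```python
import os
import tempfile

import tensorflow as tf

# Build the optimizer with tf.Variable hyperparameters as described above.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=tf.Variable(0.001),
    beta_1=tf.Variable(0.9),
    beta_2=tf.Variable(0.999),
    epsilon=tf.Variable(1e-7),
)
optimizer.iterations                # force creation of the iteration counter
optimizer.decay = tf.Variable(0.0)  # replace the float set in __init__

# Save the optimizer state on its own, with no model attached.
ckpt_dir = tempfile.mkdtemp()
path = tf.train.Checkpoint(optimizer=optimizer).save(os.path.join(ckpt_dir, "opt"))

# Restore into a fresh optimizer constructed the same way. Because the
# hyperparameters are tf.Variables, they resolve as checkpoint dependencies
# instead of triggering the "Unresolved object in checkpoint" warning.
restored = tf.keras.optimizers.Adam(
    learning_rate=tf.Variable(0.001),
    beta_1=tf.Variable(0.9),
    beta_2=tf.Variable(0.999),
    epsilon=tf.Variable(1e-7),
)
restored.iterations
restored.decay = tf.Variable(0.0)
status = tf.train.Checkpoint(optimizer=restored).restore(path)
```

On TF 2.2 you can additionally call `status.assert_consumed()` to verify that every object in the checkpoint was matched.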
@adv010
adv010 commented Jul 17, 2021

Hi @lixuanhng, were you able to solve this problem? I'm encountering the same issue in another project. Yours is the most detailed account of this error that I've found.

Do reply if you're ever able to solve it; that would be really helpful! Thanks!!
