In general you can expect tf.nn.sparse_softmax_cross_entropy_with_logits to be better optimized, because it works on the raw logits directly instead of applying cross-entropy to an already-softmaxed probability distribution. Fusing the two steps skips a redundant softmax and is numerically more stable. Good news: it's also easy to use from Keras when you need it.
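A minimal NumPy sketch of why the fused, logits-based loss behaves better: computing log-sum-exp with the max subtracted (the same trick the fused op uses internally) stays finite where the naive softmax-then-log version overflows. The function names here are illustrative, not TensorFlow's.

```python
import numpy as np

def sparse_xent_from_logits(logits, labels):
    # Fused loss: logsumexp(logits) - logit of the true class.
    # Subtracting the row max before exp() prevents overflow; this is
    # the stabilization the fused TF op applies internally.
    m = logits.max(axis=1, keepdims=True)
    log_sum_exp = m[:, 0] + np.log(np.exp(logits - m).sum(axis=1))
    return log_sum_exp - logits[np.arange(len(labels)), labels]

def two_step_xent(logits, labels):
    # Naive two-step version: softmax first, then -log(p[true class]).
    # exp() overflows for large logits, producing inf/nan losses.
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return -np.log(probs[np.arange(len(labels)), labels])

logits = np.array([[1000.0, 0.0],
                   [0.0, 1000.0]])
labels = np.array([0, 1])

print(sparse_xent_from_logits(logits, labels))  # [0. 0.] — stable
# two_step_xent(logits, labels) blows up here: exp(1000) is inf -> nan
```

In Keras the practical recipe is to leave the final Dense layer without a softmax activation and compute the loss from logits: in TF 2, compile with `loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)`; in TF 1.x, pass a custom loss that wraps `tf.nn.sparse_softmax_cross_entropy_with_logits`.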
Created October 14, 2018 09:20
Some collected tips in DL implementation