genho commented Apr 21, 2016

Hi Rajiv, thank you so much for your example.
May I ask why you use seq2seq.rnn_decoder() in your code? I've tried searching the web, but most of the examples are about language translation, and I can't find documentation that explains the TensorFlow seq2seq class.
Thanks a lot.

rajshah4 commented Apr 30, 2016

I found the documentation deep in the TensorFlow ops code. It explains how the decoder operates. Does this help?
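
For anyone landing here later, here's a minimal sketch of how the legacy decoder is typically called. It assumes the TensorFlow 0.x-era API (module paths changed between versions, so treat the imports and the sizes as illustrative assumptions, not the exact code from the gist):

```python
import tensorflow as tf

# Hypothetical sizes, just for illustration.
seq_len, batch_size, input_dim, hidden_dim = 5, 32, 12, 64

cell = tf.nn.rnn_cell.BasicLSTMCell(hidden_dim)

# rnn_decoder expects a Python list of [batch_size, input_dim] tensors,
# one per time step, plus an initial state for the cell.
decoder_inputs = [tf.placeholder(tf.float32, [batch_size, input_dim])
                  for _ in range(seq_len)]
initial_state = cell.zero_state(batch_size, tf.float32)

# Returns one output tensor per time step and the final cell state.
# Unlike the translation tutorials, there is no embedding or attention here.
outputs, final_state = tf.nn.seq2seq.rnn_decoder(
    decoder_inputs, initial_state, cell)
```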

genho commented May 9, 2016

Got it. Thank you so much 👍

JackMedley commented Aug 22, 2016

Hi Rajiv,
Can I ask what the purpose of the dropout layer is in a problem like this? When training for something like addition, don't we need to know all of the inputs?

rajshah4 commented Aug 27, 2016

Hmm, it's a good question. This was one of my first RNNs, and I just grabbed code from other projects. My thinking is that it works like dropout generally: it helps against overfitting, so the network gets a better sense of how addition works. If you have the time, I'd be curious whether playing around with the dropout shows that it behaves like that.
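
For what it's worth, this is a rough sketch of how dropout is usually attached to an RNN cell in that era of TensorFlow (DropoutWrapper); the keep_prob placeholder is my own illustration so dropout can be switched off at evaluation time, which answers the "don't we need all of the inputs" concern:

```python
import tensorflow as tf

hidden_dim = 64
# Feed e.g. 0.5 while training and 1.0 when evaluating, so the network
# sees the full signal at test time.
keep_prob = tf.placeholder(tf.float32)

cell = tf.nn.rnn_cell.BasicLSTMCell(hidden_dim)
# Randomly zeroes a fraction of the cell's outputs during training,
# which discourages the network from relying on any single unit.
cell = tf.nn.rnn_cell.DropoutWrapper(cell, output_keep_prob=keep_prob)
```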
