@rajshah4
Last active March 4, 2018 01:43
RNN_Addition_1stgrade
@JackMedley

Hi Rajiv,
Can I ask what the purpose of the dropout layer is in a problem like this? When training for something like addition, don't we need to know all of the inputs?
Thanks,
Jack

@rajshah4
Author

Hmm, it's a good question. This was one of my first RNNs, and I just grabbed code from other projects. My thinking is that it works like dropout generally: it helps against overfitting, so the network has to get a better sense of how addition works rather than memorizing examples. If you have the time, I'd be curious whether playing around with the dropout rate bears that out.
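One point worth noting: dropout in this kind of model is usually applied to the hidden representation between layers, not to the input digits, and it is only active during training. At inference time all units are kept, so the network still sees everything it needs to add. A minimal NumPy sketch of inverted dropout (the rescaling variant most frameworks use; the function name and shapes here are illustrative, not from the gist):

```python
import numpy as np

def dropout(h, rate, training, rng):
    """Inverted dropout: during training, zero a fraction `rate` of units
    and rescale the survivors by 1/(1-rate) so the expected activation is
    unchanged; at inference time, pass the activations through untouched."""
    if not training or rate == 0.0:
        return h
    keep = 1.0 - rate
    mask = rng.random(h.shape) < keep  # Boolean keep-mask, drawn per unit
    return h * mask / keep

rng = np.random.default_rng(0)
h = np.ones((4, 8))  # stand-in for a batch of RNN hidden states
train_out = dropout(h, rate=0.5, training=True, rng=rng)   # mix of 0s and 2s
infer_out = dropout(h, rate=0.5, training=False, rng=rng)  # identical to h
```

Because the surviving units are scaled up, each forward pass during training sees a noisy but unbiased version of the hidden state, which is what gives the regularizing effect without starving the model of input information at test time.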
