@KimMilim
Last active January 22, 2020 00:31
NN (Neural Network)

A plain NN cannot handle natural language processing, because it has no way to model sequential context. (Note to self: fill in more detail here.)

RNN(Recurrent Neural Network)

When we use RNN

  • handling sequence data (sentences, genomes, voice signals, sensor readings, etc.). In a sentence, the meaning of the current word is interpreted not from the word alone, but from its relation to the preceding words.

(Figure: RNN basic structure)

This picture shows sequential input to the RNN: the output of cell A at one step is fed back into A at the next step. This feedback is what makes the network 'recursive' or 'recurrent'.
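The recurrence described above can be sketched as a toy cell in pure Python. The function names, weight shapes, and values here are illustrative assumptions, not a real framework API; the point is only that the same cell is applied at every step, with its output carried forward as the next step's hidden state.

```python
import math

def rnn_step(x, h_prev, W_xh, W_hh, b):
    """One recurrent step: new hidden state from input x and previous state h_prev."""
    return [math.tanh(sum(w * xi for w, xi in zip(W_xh[i], x)) +
                      sum(w * hi for w, hi in zip(W_hh[i], h_prev)) +
                      b[i])
            for i in range(len(b))]

def run_rnn(sequence, W_xh, W_hh, b):
    """Feed a sequence through the same cell, carrying the hidden state along."""
    h = [0.0] * len(b)            # initial hidden state
    for x in sequence:            # output of step t is fed back in at step t+1
        h = rnn_step(x, h, W_xh, W_hh, b)
    return h

# Toy example (hypothetical sizes): 1-dimensional inputs, 2-dimensional hidden state.
W_xh = [[0.5], [-0.3]]
W_hh = [[0.1, 0.2], [0.0, 0.4]]
b = [0.0, 0.1]
h = run_rnn([[1.0], [0.5], [-1.0]], W_xh, W_hh, b)
```

Because tanh squashes each component into (-1, 1), the hidden state stays bounded no matter how long the sequence is.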

Problem of Long-Term Dependency

(Figure: RNN_1, nearby dependency)

For example, take the sentence "The clouds are in the sky". Given "The clouds are in the", there is a high probability that "sky" follows.
When the information needed for a prediction is close to the word being predicted, the prediction is easy.

(Figure: RNN_2, distant dependency)

But when the information needed for the prediction is far away, the RNN struggles to carry it across the gap. This picture shows that.

So we use an LSTM to solve the long-term dependency problem.
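How an LSTM keeps old information alive can be sketched with a toy one-dimensional cell. The parameter names in the dictionary `p` are illustrative assumptions; what matters is the standard gate structure: a forget gate decides what to erase from the cell state, an input gate decides what to write, and an output gate decides what to expose. The cell state is updated additively, which is what lets distant information survive many steps.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, p):
    """One LSTM step for a toy 1-D cell; p maps gate names to scalar weights."""
    f = sigmoid(p["wf_x"] * x + p["wf_h"] * h_prev + p["bf"])  # forget gate
    i = sigmoid(p["wi_x"] * x + p["wi_h"] * h_prev + p["bi"])  # input gate
    g = math.tanh(p["wg_x"] * x + p["wg_h"] * h_prev + p["bg"])  # candidate value
    o = sigmoid(p["wo_x"] * x + p["wo_h"] * h_prev + p["bo"])  # output gate
    c = f * c_prev + i * g        # additive cell-state update: old info can pass through
    h = o * math.tanh(c)
    return h, c

# Hypothetical weights with the forget gate saturated open (bf large) and the
# input gate shut (bi very negative): the cell state barely changes, i.e. the
# cell remembers across the step.
p = {"wf_x": 0.0, "wf_h": 0.0, "bf": 10.0,
     "wi_x": 0.0, "wi_h": 0.0, "bi": -10.0,
     "wg_x": 1.0, "wg_h": 0.0, "bg": 0.0,
     "wo_x": 0.0, "wo_h": 0.0, "bo": 0.0}
h, c = lstm_step(x=1.0, h_prev=0.0, c_prev=0.7, p=p)
```

In a plain RNN the hidden state is squashed through tanh at every step, so distant information fades; here the gated, additive cell state gives it a protected path.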

LSTM (Long Short-Term Memory)

When we use LSTM
