A plain feed-forward NN cannot handle natural language: it takes a fixed-size input and has no memory of earlier inputs.
- Handling sequence data (sentences, genomes, voice signals, sensor readings, etc.). In a sentence, the meaning of the current word is interpreted not from that word alone, but from its relation to the preceding words.
This picture shows sequential input to the RNN: the output of cell A is fed back into A itself at the next step. This feedback loop is why it is called 'recurrent'.
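The loop in the picture can be sketched in a few lines. This is a minimal vanilla-RNN step in NumPy, not any particular library's API; the dimensions (input size 3, hidden size 4) and the weight names `W_x`, `W_h` are illustrative assumptions.

```python
import numpy as np

# Illustrative sizes: 3-dim inputs, 4-dim hidden state (assumptions).
rng = np.random.default_rng(0)
W_x = rng.normal(scale=0.1, size=(4, 3))   # input-to-hidden weights
W_h = rng.normal(scale=0.1, size=(4, 4))   # hidden-to-hidden (recurrent) weights
b = np.zeros(4)

def rnn_step(x_t, h_prev):
    # The previous output h_prev is fed back in: this is the recurrence.
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

h = np.zeros(4)                      # initial hidden state
sequence = rng.normal(size=(5, 3))   # 5 time steps of 3-dim inputs
for x_t in sequence:
    h = rnn_step(x_t, h)             # the same cell A is applied at every step
```

Note that the same weights are reused at every time step; only the hidden state `h` changes as the sequence is consumed.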
For example, take the sentence "The clouds are in the sky".
There is a high probability that "sky" follows "The clouds are in the".
When the information needed to predict a word is close by, prediction is easy.
But when that information lies far back in the sequence, it is not; this picture shows that case. This is the problem of long-term dependencies.
So we use an LSTM to address the long-term dependency problem.
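To see why an LSTM helps, here is a minimal sketch of one LSTM step in NumPy. The sizes `D`, `H` and the stacked weight matrix `W` are assumptions for illustration, not a reference implementation. The key point is the cell state `c`: it is updated additively (`f * c_prev + i * g`), so information can be carried over many steps with little change.

```python
import numpy as np

D, H = 3, 4                        # illustrative input and hidden sizes
rng = np.random.default_rng(1)
W = rng.normal(scale=0.1, size=(4 * H, D + H))  # weights for all 4 gates, stacked
b = np.zeros(4 * H)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev):
    z = W @ np.concatenate([x_t, h_prev]) + b
    f = sigmoid(z[0:H])            # forget gate: how much of c_prev to keep
    i = sigmoid(z[H:2*H])          # input gate: how much new content to write
    o = sigmoid(z[2*H:3*H])        # output gate: how much of the cell to expose
    g = np.tanh(z[3*H:4*H])        # candidate cell content
    c = f * c_prev + i * g         # additive cell update carries long-range info
    h = o * np.tanh(c)
    return h, c

h, c = np.zeros(H), np.zeros(H)
for x_t in rng.normal(size=(5, D)):
    h, c = lstm_step(x_t, h, c)
```

Because the cell update is additive rather than repeatedly squashed through a tanh, gradients can flow further back in time, which is what mitigates the long-term dependency problem described above.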