https://arxiv.org/pdf/1506.06442v1.pdf
- NTram: Neural Transformation Machine for sequence-to-sequence learning
- sequence-to-sequence examples: machine translation, part-of-speech tagging, dependency parsing
Machine translation approaches:
- Encoder-Decoder (usually using RNN or CNN)
- Automatic Alignment (a bidirectional RNN produces annotation vectors that a gating/attention network weights at each output step; better than a plain Encoder-Decoder since the source is not compressed into a single fixed-length vector)
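
The alignment idea above can be sketched as follows: score each encoder annotation against the current decoder state, softmax the scores into weights, and take the weighted sum as a context vector. This is a minimal sketch using plain dot-product scores; the actual alignment model in the attention literature (e.g. Bahdanau et al.) uses a small learned network, and all names and dimensions here are illustrative assumptions.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention_context(encoder_states, decoder_state):
    """Return (context_vector, alignment_weights).

    Scores are plain dot products -- a simplification of the learned
    alignment network; encoder_states is one annotation vector per
    source position, so the representation grows with input length."""
    scores = [dot(h, decoder_state) for h in encoder_states]
    weights = softmax(scores)
    dim = len(decoder_state)
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]
    return context, weights

# Toy example: three source positions with 2-d annotation vectors
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
s = [1.0, 0.0]  # current decoder state
context, weights = attention_context(H, s)
```

The key contrast with a plain Encoder-Decoder is that `H` keeps one vector per source position, and each decoding step re-weights them, instead of squeezing the whole sentence into one fixed vector.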