Last active: July 29, 2022
Links to interesting ML/NLP techniques
Skip-gram model: http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
Skip-gram model (negative sampling): http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
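The two McCormick posts above walk through skip-gram training pairs and negative sampling. As a minimal, dependency-free sketch (toy corpus and all parameter values are illustrative, not from the tutorials): extract (center, context) pairs from a window, then draw negatives from the unigram distribution raised to the 3/4 power, as word2vec does.

```python
import random

def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in the skip-gram model."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def negative_samples(vocab_counts, k, exclude, rng):
    """Draw k negative words from the unigram distribution raised to 3/4,
    excluding the true context word (word2vec's negative sampling)."""
    words = [w for w in vocab_counts if w != exclude]
    weights = [vocab_counts[w] ** 0.75 for w in words]
    return rng.choices(words, weights=weights, k=k)

tokens = "the quick brown fox jumps over the lazy dog".split()
pairs = skipgram_pairs(tokens, window=2)
counts = {w: tokens.count(w) for w in set(tokens)}
rng = random.Random(0)
center, context = pairs[0]
negs = negative_samples(counts, k=3, exclude=context, rng=rng)
```

Each (center, context) pair plus its negatives forms one logistic-regression training example, which is what makes negative sampling so much cheaper than a full softmax over the vocabulary.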
TFGAN (framework to train & evaluate GANs): https://research.googleblog.com/2017/12/tfgan-lightweight-library-for.html
NLP formula: https://explosion.ai/blog/deep-learning-formula-nlp
https://medium.com/dair-ai/textql-colorless-green-rnns-convai2-machine-learning-yearning-meta-learning-tutorial-tinn-d85e64d3b6fb
https://github.com/outcastofmusic/quick-nlp
Bidirectional LSTM in TensorFlow: https://www.svds.com/tensorflow-rnn-tutorial/
TensorFlow Dataset tutorial: http://adventuresinmachinelearning.com/tensorflow-dataset-tutorial/
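The tutorial above uses the real `tf.data.Dataset` API; the core ideas it teaches (a bounded shuffle buffer feeding a batcher) can be sketched without TensorFlow as plain generators. This is only an illustration of the mechanics, not the tf.data implementation:

```python
import random

def shuffle_buffer(iterable, buffer_size, seed=0):
    """Mimic Dataset.shuffle: keep a fixed-size buffer and yield a
    randomly chosen element as each new item streams in."""
    rng = random.Random(seed)
    buf = []
    for item in iterable:
        buf.append(item)
        if len(buf) >= buffer_size:
            yield buf.pop(rng.randrange(len(buf)))
    while buf:  # drain the remaining buffered items
        yield buf.pop(rng.randrange(len(buf)))

def batch(iterable, batch_size):
    """Mimic Dataset.batch: group consecutive items into lists."""
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == batch_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk  # final, possibly smaller batch

data = range(10)
batches = list(batch(shuffle_buffer(data, buffer_size=4), batch_size=3))
```

The bounded buffer is why `Dataset.shuffle` only approximates a full shuffle: randomness is limited to `buffer_size` elements at a time, which matters for large datasets that don't fit in memory.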
Very good 50-line seq2seq model with attention: http://www.thushv.com/natural_language_processing/neural-machine-translator-with-50-lines-of-code-using-tensorflow-seq2seq/
Seq2seq-with-attention explanation: https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
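The mechanics that Jalammar's post visualizes (score each encoder state against the decoder state, softmax the scores, take the weighted sum) fit in a few lines. A minimal dot-product-attention sketch with made-up toy vectors:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Dot-product attention: score each encoder state (key) against the
    decoder query, softmax the scores into weights, and return the
    attention-weighted context vector over the values."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    context = [sum(w * v[d] for w, v in zip(weights, values))
               for d in range(len(values[0]))]
    return context, weights

keys = values = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # encoder states
query = [1.0, 0.0]                                     # decoder state
context, weights = attention(query, keys, values)
```

The weights form a probability distribution over source positions, which is exactly what the post's heatmap animations are showing.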
Seq2seq with TensorFlow: https://colab.research.google.com/github/tensorflow/tensorflow/blob/master/tensorflow/contrib/eager/python/examples/nmt_with_attention/nmt_with_attention.ipynb#scrollTo=yJ_B3mhW3jFk
Attention model: https://acv.aixon.co/attention_word_to_number.html
Visualization of neural models: https://github.com/lutzroeder/Netron
Siraj Raval's seq2seq model tutorial: https://www.youtube.com/watch?v=ElmBrKyMXxs
Accompanying notebook: https://github.com/llSourcell/seq2seq_model_live/blob/master/2-seq2seq-advanced.ipynb
Writing your own RNN loop (raw_rnn vs. dynamic_rnn): https://hanxiao.github.io/2017/08/16/Why-I-use-raw-rnn-Instead-of-dynamic-rnn-in-Tensorflow-So-Should-You-0/
Practical seq2seq: https://suriyadeepan.github.io/2016-12-31-practical-seq2seq/
Detailed LSTM explanation: https://blog.echen.me/2017/05/30/exploring-lstms/
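The gate equations that the Echen post explores can be sketched as a single-unit LSTM step in plain Python. The weights below are arbitrary illustrative values, not anything from the post:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One step of a single-unit LSTM cell. W maps each gate name to
    (input weight, recurrent weight, bias)."""
    gate = lambda g, act: act(W[g][0] * x + W[g][1] * h_prev + W[g][2])
    i = gate("i", sigmoid)    # input gate: how much new info to write
    f = gate("f", sigmoid)    # forget gate: how much old cell state to keep
    o = gate("o", sigmoid)    # output gate: how much cell state to expose
    g = gate("g", math.tanh)  # candidate cell value
    c = f * c_prev + i * g    # new cell state
    h = o * math.tanh(c)      # new hidden state
    return h, c

W = {k: (0.5, 0.5, 0.0) for k in "ifog"}  # toy shared weights
h, c = lstm_step(x=1.0, h_prev=0.0, c_prev=0.0, W=W)
```

The additive update `c = f * c_prev + i * g` is the key detail: gradients flow through the cell state without repeated matrix multiplication, which is what mitigates vanishing gradients.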
Gluon Tree-LSTM implementation: https://gluon.mxnet.io/chapter09_natural-language-processing/tree-lstm.html
Node2Vec: https://github.com/aditya-grover/node2vec
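Node2vec's core idea is a second-order biased random walk controlled by a return parameter p and an in-out parameter q; the resulting walks are fed to word2vec as "sentences". A minimal sketch of one such walk (toy graph, unnormalized weights as in the paper; not the repo's implementation):

```python
import random

def node2vec_walk(graph, start, length, p=1.0, q=1.0, seed=0):
    """One biased random walk as in node2vec: p > 1 discourages
    returning to the previous node, q < 1 biases toward DFS-like
    outward exploration (q > 1 toward BFS-like local moves)."""
    rng = random.Random(seed)
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = graph[cur]
        if len(walk) == 1:  # first step is a uniform choice
            walk.append(rng.choice(nbrs))
            continue
        prev = walk[-2]
        weights = []
        for x in nbrs:
            if x == prev:
                weights.append(1.0 / p)   # return to previous node
            elif x in graph[prev]:
                weights.append(1.0)       # distance 1 from prev
            else:
                weights.append(1.0 / q)   # move outward (distance 2)
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk

graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
walk = node2vec_walk(graph, start=0, length=6, p=2.0, q=0.5)
```

Running many such walks from every node and training skip-gram on them yields the node embeddings.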
Copy mechanism: https://ireneli.eu/2018/06/25/to-copy-or-not-that-is-the-question-copying-mechanism/
CopyNet: https://github.com/lspvic/CopyNet/blob/master/copynet.py
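The copy mechanism in the two links above lets a seq2seq model emit out-of-vocabulary source tokens by mixing a generation distribution with a copy distribution built from the attention weights. A toy sketch of that mixture (all tokens and probabilities invented for illustration; a real model learns p_gen per step):

```python
def copy_mixture(p_gen, vocab_probs, attn_weights, source_tokens, vocab):
    """Pointer/copy-style output distribution: blend the generator's
    vocabulary distribution with a copy distribution spread over the
    source tokens by the attention weights (as in CopyNet and
    pointer-generator networks)."""
    final = {w: p_gen * vocab_probs.get(w, 0.0) for w in vocab}
    for a, tok in zip(attn_weights, source_tokens):
        # copy mass can flow to tokens outside the vocabulary
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * a
    return final

vocab = ["the", "cat", "<unk>"]
vocab_probs = {"the": 0.6, "cat": 0.3, "<unk>": 0.1}
attn = [0.7, 0.2, 0.1]                 # attention over source positions
source = ["Hendrix", "the", "cat"]     # "Hendrix" is out-of-vocabulary
dist = copy_mixture(p_gen=0.5, vocab_probs=vocab_probs,
                    attn_weights=attn, source_tokens=source, vocab=vocab)
```

Note that the OOV source token ends up with real probability mass, which is exactly the failure mode ("to copy or not") that a plain softmax decoder cannot handle.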
Very nice TF seq2seq example: https://github.com/udacity/deep-learning/blob/master/seq2seq/sequence_to_sequence_implementation.ipynb