- https://towardsdatascience.com/attention-and-its-different-forms-7fc3674d14dc
- https://towardsdatascience.com/light-on-math-ml-attention-with-keras-dc8dbc1fad39
- https://www.tensorflow.org/tutorials/text/transformer
- https://www.tensorflow.org/api_docs/python/tf/keras/layers/Attention
- https://machinetalk.org/2019/04/29/create-the-transformer-with-tensorflow-2-0/
- https://www.kaggle.com/miljan/stock-predictions-with-multi-head-attention/notebook
- https://rubikscode.net/2019/07/29/introduction-to-transformers-architecture/
- https://rubikscode.net/2019/08/05/transformer-with-python-and-tensorflow-2-0-attention-layers/
- https://mchromiak.github.io/articles/2017/Sep/12/Transformer-Attention-is-all-you-need/#.Xi3LJY7YqEs
- https://mc.ai/transformer-architecture-attention-is-all-you-need-2/
- https://hergott.github.io/lstm-attention-bond-market-taper-tantrum/
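The links above all revolve around scaled dot-product attention, `Attention(Q, K, V) = softmax(QKᵀ/√d_k)·V`. As a quick reference alongside them, here is a minimal NumPy sketch of that formula (the shapes and variable names are illustrative, not taken from any of the linked posts):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    # Similarity of each query with each key, scaled by sqrt(d_k).
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)      # distribution over keys
    return weights @ v, weights             # weighted sum of values

rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4, 8))   # (batch, query_len, d_k)
k = rng.normal(size=(2, 6, 8))   # (batch, key_len, d_k)
v = rng.normal(size=(2, 6, 8))   # (batch, key_len, d_v)
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)                 # (2, 4, 8)
print(np.allclose(w.sum(-1), 1.0))  # True: each query's weights sum to 1
```

Multi-head attention (as in several of the links) simply runs this in parallel over several learned projections of Q, K, and V and concatenates the results.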
Last active: January 26, 2020 17:34
att.