Ermia Azarkhalili (deepgradient)
Implementing DL & ML code.
@twiecki
twiecki / GLM-hierarchical-jax.ipynb
Last active September 7, 2022 23:21
notebooks/GLM-hierarchical.ipynb
#!/usr/bin/env python3
# Author: Alexandre Défossez, 2020
# This is free and unencumbered software released into the public domain.
# For more information, please refer to <http://unlicense.org/>
"""
Merge multiple bibfiles, remove duplicates and unused references, matching bibtex entries
based on the 'title' field. Rewrite all the .tex files in the current directory
to reflect the elimination of duplicates.
Finally, this will rewrite all the arXiv references to use the @unpublished category.
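The script body is not shown in the preview. As a minimal sketch of the title-based de-duplication step only, assuming entries have already been parsed into (key, fields) pairs (the helper names here are hypothetical, not Défossez's actual code):

import re

def normalize_title(title):
    # Case- and whitespace-insensitive key, so reformatted titles still match.
    return re.sub(r'\s+', ' ', title.strip().lower())

def deduplicate(entries):
    # entries: list of (citation_key, fields_dict) pairs from the merged .bib files.
    seen = {}     # normalized title -> canonical citation key
    renames = {}  # duplicate key -> canonical key, used to rewrite the .tex files
    for key, fields in entries:
        title = normalize_title(fields.get('title', key))
        if title in seen:
            renames[key] = seen[title]  # duplicate entry: reuse the first key
        else:
            seen[title] = key
    return renames

The returned renames map is what a citation-rewriting pass over the .tex files would then consume.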
@Allgoerithm
Allgoerithm / rock-paper-scissors-in-tfp.ipynb
Last active January 28, 2020 12:43
Rock, Paper, Scissors in TFP
import tensorflow as tf

# observed data
total_rock = tf.constant(5., tf.float32)
total_paper = tf.constant(0., tf.float32)
total_scissors = tf.constant(0., tf.float32)
# define some constants
number_of_steps = 10000
burnin = 5000
# Set the chain's start state.
# We calculate the joint log probability. It answers the central question of every
# Markov chain simulation: what is the probability that this data occurs together
# with these distribution parameters?
@tf.function(input_signature=5 * (tf.TensorSpec(shape=[], dtype=tf.float32),))
def joint_log_prob(total_rock, total_paper, total_scissors, p_rock, p_paper):
    '''
    Joint log probability of the data occurring together with the given parameters.
    :param total_rock: number of rock occurrences
    :param total_paper: number of paper occurrences
    :param total_scissors: number of scissors occurrences
    :param p_rock: probability of playing rock
    :param p_paper: probability of playing paper
    '''
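The preview cuts off before the function body. As a rough sketch of what a joint log probability for this model can look like in TFP (an assumption, not the gist's actual code): uniform priors over the two free win probabilities plus a multinomial likelihood over the observed counts.

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

def joint_log_prob_sketch(total_rock, total_paper, total_scissors, p_rock, p_paper):
    # Scissors takes whatever probability mass rock and paper leave over.
    p_scissors = 1. - p_rock - p_paper
    # Uniform priors on the two free parameters; a Dirichlet over the full
    # simplex would be more robust, since this prior allows p_rock + p_paper > 1.
    prior = tfd.Uniform(low=0., high=1.)
    counts = tf.stack([total_rock, total_paper, total_scissors])
    likelihood = tfd.Multinomial(
        total_count=total_rock + total_paper + total_scissors,
        probs=tf.stack([p_rock, p_paper, p_scissors]))
    return (prior.log_prob(p_rock)
            + prior.log_prob(p_paper)
            + likelihood.log_prob(counts))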
@amberjrivera
amberjrivera / Pipeline-guide.md
Created January 26, 2018 05:02
Quick tutorial on Sklearn's Pipeline constructor for machine learning

If You've Never Used Sklearn's Pipeline Constructor...You're Doing It Wrong

How To Use sklearn Pipelines, FeatureUnions, and GridSearchCV With Your Own Transformers

By Emily Gill and Amber Rivera

What's a Pipeline and Why Use One?

The Pipeline constructor from sklearn allows you to chain transformers and estimators together into a sequence that functions as one cohesive unit. For example, if your model involves feature selection, standardization, and then regression, those three steps, each as its own class, could be encapsulated together via Pipeline, as sketched below.

Benefits: readability, reusability and easier experimentation.
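Here is a sketch of the three-step model described above (the estimator choices and the X_train/y_train/X_test variables are placeholders, not from the original tutorial):

from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge

# Each step is a (name, estimator) pair; every step except the last
# must be a transformer, and the last may be any estimator.
pipe = Pipeline([
    ('select', SelectKBest(score_func=f_regression, k=10)),
    ('scale', StandardScaler()),
    ('regress', Ridge()),
])

pipe.fit(X_train, y_train)        # fit/transform through each step, then fit Ridge
predictions = pipe.predict(X_test)

The step names double as prefixes in GridSearchCV parameter grids, e.g. {'select__k': [5, 10], 'regress__alpha': [0.1, 1.0]}.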
@tokestermw
tokestermw / self_attention.py
Last active March 3, 2025 11:36
Implementation of self-attention in the paper "Attention Is All You Need" in TensorFlow.
"""Example TensorFlow code for Self-Attention mechanism.
Refs:
Attention Is All You Need
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
https://arxiv.org/abs/1706.03762
Transformer: A Novel Neural Network Architecture for Language Understanding
https://research.googleblog.com/2017/08/transformer-novel-neural-network.html
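The gist's own code is not reproduced here; as a minimal single-head sketch of the scaled dot-product self-attention the paper describes (written against modern tf.keras, which the original gist predates):

import tensorflow as tf

class SelfAttention(tf.keras.layers.Layer):
    """Single-head scaled dot-product self-attention, per Vaswani et al. (2017)."""

    def __init__(self, d_model):
        super().__init__()
        self.wq = tf.keras.layers.Dense(d_model)  # query projection
        self.wk = tf.keras.layers.Dense(d_model)  # key projection
        self.wv = tf.keras.layers.Dense(d_model)  # value projection

    def call(self, x):
        # x: [batch, seq_len, d_model]; queries, keys, and values all derive from x.
        q, k, v = self.wq(x), self.wk(x), self.wv(x)
        d_k = tf.cast(tf.shape(k)[-1], tf.float32)
        # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
        scores = tf.matmul(q, k, transpose_b=True) / tf.sqrt(d_k)
        weights = tf.nn.softmax(scores, axis=-1)
        return tf.matmul(weights, v)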
@mbollmann
mbollmann / attention_lstm.py
Last active August 22, 2024 07:06
My attempt at creating an LSTM with attention in Keras
class AttentionLSTM(LSTM):
    """LSTM with attention mechanism.

    This is an LSTM incorporating an attention mechanism into its hidden states.
    Currently, the context vector calculated from the attended vector is fed
    into the model's internal states, closely following the model by Xu et al.
    (2016, Sec. 3.1.2), using a soft attention model following
    Bahdanau et al. (2014).

    The layer expects two inputs instead of the usual one:
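The preview ends before the input description. For reference, the soft additive attention of Bahdanau et al. (2014) that the docstring cites computes a context vector roughly as follows (a sketch in modern tf.keras, not the gist's original Keras-1 code):

import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)
        self.W2 = tf.keras.layers.Dense(units)
        self.v = tf.keras.layers.Dense(1)

    def call(self, hidden, attended):
        # hidden: [batch, units] current state; attended: [batch, steps, features]
        # score_t = v^T tanh(W1 h + W2 a_t)
        scores = self.v(tf.tanh(
            self.W1(tf.expand_dims(hidden, 1)) + self.W2(attended)))
        weights = tf.nn.softmax(scores, axis=1)              # soft weights over steps
        context = tf.reduce_sum(weights * attended, axis=1)  # weighted sum
        return context, weights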
@mbollmann
mbollmann / hidden_state_lstm.py
Created August 17, 2016 10:02
Keras LSTM that inputs/outputs its internal states, e.g. for hidden state transfer
from keras import backend as K
from keras.layers.recurrent import LSTM

class HiddenStateLSTM(LSTM):
    """LSTM with input/output capabilities for its hidden state.

    This layer behaves just like an LSTM, except that it accepts further inputs
    to be used as its initial states, and returns additional outputs,
    representing the layer's final states.
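Modern Keras exposes the same capability natively through return_state and initial_state; a rough equivalent of the hidden-state transfer this gist implements (layer sizes here are arbitrary):

import tensorflow as tf

encoder_in = tf.keras.Input(shape=(None, 16))
decoder_in = tf.keras.Input(shape=(None, 16))

# The first LSTM returns its final hidden and cell states alongside its output.
enc_out, state_h, state_c = tf.keras.layers.LSTM(32, return_state=True)(encoder_in)

# The second LSTM starts from those final states: hidden state transfer.
dec_out = tf.keras.layers.LSTM(32)(decoder_in, initial_state=[state_h, state_c])

model = tf.keras.Model([encoder_in, decoder_in], dec_out)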