@mblondel
mblondel / lda_gibbs.py
Last active October 9, 2023 11:31
Latent Dirichlet Allocation with Gibbs sampler
"""
(C) Mathieu Blondel - 2010
License: BSD 3 clause
Implementation of the collapsed Gibbs sampler for
Latent Dirichlet Allocation, as described in
Finding scientific topics (Griffiths and Steyvers)
"""
var parsetree = function(selector, options) {
    this.el = document.querySelector(selector);
    this.options = options || {};
    if (this.el == null)
        console.log("Could not find DOM object");
    return this;
};
@karpathy
karpathy / min-char-rnn.py
Last active July 24, 2024 18:36
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
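The gist goes on to build the char/index lookup tables and the training loop. Purely as an illustration of the vanilla RNN recurrence the description refers to (not the gist's own code), one forward step over a one-hot character can be sketched as:

# Illustrative sketch, not the gist's code: one forward step of a vanilla RNN
# over a one-hot character. Sizes, names, and initial values are arbitrary.
import numpy as np

hidden_size, n_chars = 100, 65
rng = np.random.default_rng(0)
Wxh = rng.standard_normal((hidden_size, n_chars)) * 0.01      # input -> hidden
Whh = rng.standard_normal((hidden_size, hidden_size)) * 0.01  # hidden -> hidden
Why = rng.standard_normal((n_chars, hidden_size)) * 0.01      # hidden -> output
bh, by = np.zeros(hidden_size), np.zeros(n_chars)

x = np.zeros(n_chars)
x[0] = 1.0                                  # one-hot encoding of the current char
h_prev = np.zeros(hidden_size)              # previous hidden state
h = np.tanh(Wxh @ x + Whh @ h_prev + bh)    # hidden state update
y = Why @ h + by                            # unnormalized scores for the next char
p = np.exp(y) / np.sum(np.exp(y))           # softmax over the next character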
@stephenturner
stephenturner / install-gcc48-linuxbrew-centos6.md
Last active March 6, 2022 02:49
Installing gcc 4.8 and Linuxbrew on CentOS 6

Installing gcc 4.8 and Linuxbrew on CentOS 6

The GCC distributed with CentOS 6 is 4.4.7, which is pretty outdated. I'd like to use gcc 4.8+. Also, when trying to install Linuxbrew you run into a dependency loop where Homebrew's gcc depends on zlib, which depends on gcc. Here's how I solved the problem.

Note: Requires sudo privileges.

Resources:

@dirko
dirko / keras_unidirectional_tagger.py
Created August 11, 2016 05:28
Keras LSTM NER tagger
# Keras==1.0.6
import numpy as np
from keras.models import Sequential
from keras.layers.recurrent import LSTM
from keras.layers.core import TimeDistributedDense, Activation
from keras.preprocessing.sequence import pad_sequences
from keras.layers.embeddings import Embedding
from sklearn.cross_validation import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score, precision_recall_fscore_support
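The imports above suggest the usual Keras 1.x sequence-tagging stack; a minimal sketch of such a model, reusing those imports, might look like the following. Sizes are hypothetical and this is not necessarily the gist's exact architecture.

# Sketch of a Keras 1.x sequence tagger, reusing the imports above.
maxlen, vocab_size, n_tags = 50, 10000, 17

model = Sequential()
model.add(Embedding(vocab_size, 128, input_length=maxlen))  # word indices -> vectors
model.add(LSTM(64, return_sequences=True))                  # one output per timestep
model.add(TimeDistributedDense(n_tags))                     # per-timestep tag scores
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
# X: padded index sequences of shape (n_samples, maxlen), e.g. from pad_sequences;
# Y: one-hot tag sequences of shape (n_samples, maxlen, n_tags).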
@dirko
dirko / keras_bidirectional_tagger.py
Created August 11, 2016 05:32
Keras bidirectional LSTM NER tagger
# Keras==1.0.6
from keras.models import Sequential
import numpy as np
from keras.layers.recurrent import LSTM
from keras.layers.core import TimeDistributedDense, Activation
from keras.preprocessing.sequence import pad_sequences
from keras.layers.embeddings import Embedding
from sklearn.cross_validation import train_test_split
from keras.layers import Merge
from keras.backend import tf
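Here the extra `Merge` (and backend `tf`) imports point at the classic two-branch bidirectional pattern in Keras 1.x: a forward LSTM plus a backward LSTM whose per-timestep outputs are concatenated. A rough sketch, reusing the imports above, with hypothetical sizes and assuming this Keras version provides `keras.backend.reverse` to re-align the backward outputs; it is not necessarily how the gist itself does it.

# Sketch of the two-branch bidirectional pattern, reusing the imports above.
from keras import backend as K          # assumes K.reverse is available here
from keras.layers.core import Lambda

maxlen, vocab_size, n_tags = 50, 10000, 17

forward = Sequential()
forward.add(Embedding(vocab_size, 128, input_length=maxlen))
forward.add(LSTM(64, return_sequences=True))

backward = Sequential()
backward.add(Embedding(vocab_size, 128, input_length=maxlen))
backward.add(LSTM(64, return_sequences=True, go_backwards=True))
# go_backwards emits timesteps in reverse order; flip them back so timestep t
# lines up with timestep t of the forward branch before concatenating.
backward.add(Lambda(lambda seq: K.reverse(seq, axes=1), output_shape=(maxlen, 64)))

model = Sequential()
model.add(Merge([forward, backward], mode='concat'))  # concat along the feature axis
model.add(TimeDistributedDense(n_tags))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
# Training then feeds the same padded input to both branches: model.fit([X, X], Y)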
@cbaziotis
cbaziotis / AttentionWithContext.py
Last active April 25, 2022 14:37
Keras Layer that implements an Attention mechanism, with a context/query vector, for temporal data. Supports Masking. Follows the work of Yang et al. [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf] "Hierarchical Attention Networks for Document Classification"
from keras import backend as K

def dot_product(x, kernel):
    """
    Wrapper for the dot product operation, in order to be compatible with both
    the Theano and TensorFlow backends.
    Args:
        x: input tensor
        kernel: weights tensor
    Returns:
        x dotted with kernel
    """
    if K.backend() == 'tensorflow':
        # TF backend: expand the kernel to 2D, multiply, then squeeze the
        # trailing dimension back out.
        return K.squeeze(K.dot(x, K.expand_dims(kernel)), axis=-1)
    else:
        return K.dot(x, kernel)
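Outside of Keras, the attention-with-context computation this layer implements (per Yang et al.) reduces to a few lines of numpy; the sketch below uses made-up shapes and is not the layer's code.

# Illustrative numpy sketch of attention with a context/query vector
# (Yang et al.), not the layer's code. Shapes are made up.
import numpy as np

timesteps, features = 7, 5
rng = np.random.default_rng(0)
h = rng.standard_normal((timesteps, features))   # per-timestep hidden states
W = rng.standard_normal((features, features))    # attention projection
b = np.zeros(features)
u_ctx = rng.standard_normal(features)            # learned context/query vector

u = np.tanh(h @ W + b)            # project each timestep
scores = u @ u_ctx                # similarity of each timestep to the context vector
a = np.exp(scores - scores.max())
a /= a.sum()                      # softmax over timesteps = attention weights
s = (a[:, None] * h).sum(axis=0)  # attention-weighted sum -> sequence representation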