RedSunCMX / readme.md
Created July 13, 2019 06:36 — forked from flyyufelix/readme.md
ResNet-152 pre-trained model in Keras

ResNet-152 in Keras

This is a Keras implementation of ResNet-152 with ImageNet pre-trained weights. I converted the weights from the Caffe model provided by the authors of the paper. The implementation supports both the Theano and TensorFlow backends. In case you are curious about how the conversion is done, you can visit my blog post for more details.

ResNet Paper:

Deep Residual Learning for Image Recognition.
Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
arXiv:1512.03385
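
For reference, a minimal usage sketch (not the gist's own loader for the converted Caffe weights): running an ImageNet-pretrained ResNet-152 through the stock keras.applications API in TensorFlow 2.x. The image path 'cat.jpg' is a placeholder.

import numpy as np
from tensorflow.keras.applications.resnet import ResNet152, preprocess_input, decode_predictions
from tensorflow.keras.preprocessing import image

model = ResNet152(weights='imagenet')  # downloads the stock ImageNet weights on first use

img = image.load_img('cat.jpg', target_size=(224, 224))           # 'cat.jpg' is a placeholder path
x = preprocess_input(np.expand_dims(image.img_to_array(img), 0))  # Caffe-style preprocessing expected by ResNet
preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])  # [(class_id, class_name, probability), ...]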
RedSunCMX / min-char-rnn.py
Created December 6, 2016 07:50 — forked from karpathy/min-char-rnn.py
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
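
The preview stops at the data I/O. As a sketch only (not the remainder of Karpathy's file), this is the kind of character/index lookup and one-hot encoding a character-level model builds on top of the chars vocabulary above:

char_to_ix = {ch: i for i, ch in enumerate(chars)}  # character -> integer index
ix_to_char = {i: ch for i, ch in enumerate(chars)}  # integer index -> character

def one_hot(ix, vocab_size):
    # column vector of zeros with a single 1 at position ix
    v = np.zeros((vocab_size, 1))
    v[ix] = 1
    return v

x0 = one_hot(char_to_ix[data[0]], vocab_size)  # encoding of the first input character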
RedSunCMX / lstm-lm.py
Last active August 29, 2015 14:11 — forked from neubig/lstm-lm.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This is a simplified implementation of the LSTM language model (by Graham Neubig)
#
# LSTM Neural Networks for Language Modeling
# Martin Sundermeyer, Ralf Schlüter, Hermann Ney
# InterSpeech 2012
#
# The structure of the model is extremely simple. At every time step we
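
The header comment breaks off here in the preview. For orientation, a minimal sketch of an LSTM language model in Keras (an assumed stand-in, not Neubig's implementation): embed each token id, run an LSTM over the sequence, and predict the next token with a softmax over the vocabulary.

from tensorflow.keras import layers, models

vocab_size, embed_dim, hidden_dim = 10000, 128, 256  # illustrative hyperparameters

lm = models.Sequential([
    layers.Embedding(vocab_size, embed_dim),         # token ids -> dense vectors
    layers.LSTM(hidden_dim, return_sequences=True),  # hidden state at every time step
    layers.Dense(vocab_size, activation='softmax'),  # distribution over the next token
])
lm.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
# Inputs: (batch, time) integer ids; targets: the same sequence shifted left by one position.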
"""
From this paper: http://acl.ldc.upenn.edu/acl2004/emnlp/pdf/Mihalcea.pdf
I used Python with nltk and pygraph to do an implementation of TextRank.
for questions: http://twitter.com/voidfiles
"""
import nltk
import itertools
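
The preview ends with the imports. A rough sketch of the sentence-level TextRank idea from the Mihalcea and Tarau paper, using networkx for PageRank instead of the pygraph library mentioned above (so this is not the gist's code):

import itertools
import networkx as nx
import nltk

def overlap(s1, s2):
    # crude similarity: number of word tokens the two sentences share
    w1 = set(nltk.word_tokenize(s1.lower()))
    w2 = set(nltk.word_tokenize(s2.lower()))
    return len(w1 & w2)

def textrank_sentences(text, top_n=3):
    sentences = nltk.sent_tokenize(text)  # needs the nltk 'punkt' tokenizer data
    graph = nx.Graph()
    graph.add_nodes_from(sentences)
    for s1, s2 in itertools.combinations(sentences, 2):
        w = overlap(s1, s2)
        if w > 0:
            graph.add_edge(s1, s2, weight=w)
    scores = nx.pagerank(graph)  # PageRank over the sentence-similarity graph
    return sorted(sentences, key=scores.get, reverse=True)[:top_n]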