@damienpontifex
damienpontifex / tf-experiment-template.py
Last active March 9, 2021 09:43
A template for a custom TensorFlow estimator and experiment, with Python 3 type hints for the desired parameter types
import argparse
import psutil
import tensorflow as tf
from typing import Dict, Any, Callable, Tuple

## Data Input Function
def data_input_fn(data_param,
                  batch_size: int = None,
                  shuffle=False) -> Callable[[], Tuple]:
    """Return the input function to get the test data."""
@panovr
panovr / finetune.py
Created March 2, 2017 23:04
Fine-tuning pre-trained models with PyTorch
import argparse
import os
import shutil
import time
import torch
import torch.nn as nn
import torch.nn.parallel
import torch.backends.cudnn as cudnn
import torch.optim
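A minimal fine-tuning sketch along the same lines (the backbone choice, class count, and hyperparameters below are assumptions, not taken from the gist): replace the classifier head of a pre-trained torchvision model and optimize only the new layer.

import torch
import torch.nn as nn
import torchvision.models as models

model = models.resnet18(pretrained=True)

# Freeze the pre-trained backbone so only the new head is updated.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with one sized for the new task.
num_classes = 10  # placeholder; depends on the target dataset
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Optimize only the parameters of the new layer.
optimizer = torch.optim.SGD(model.fc.parameters(), lr=0.001, momentum=0.9)
criterion = nn.CrossEntropyLoss()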
@farrajota
farrajota / multiple_learning_rates.lua
Last active April 10, 2018 16:47
Example code showing how to set a different learning rate per layer. Note that :parameters() returns the weights and biases of each layer as separate, consecutive tensors, so a network with N layers yields a table of 2*N tensors, where the i-th and (i+1)-th tensors belong to the same layer.
-- multiple learning rates per network. Optimizes two copies of a model network and checks if the optimization steps (2) and (3) produce the same weights/parameters.
require 'torch'
require 'nn'
require 'optim'
torch.setdefaulttensortype('torch.FloatTensor')
-- (1) Define a model for this example.
local model = nn.Sequential()
model:add(nn.Linear(10,20))
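For comparison, a PyTorch analogue of the same idea (not the gist's Torch7/Lua code): optimizer parameter groups assign a different learning rate to each layer; the layer sizes and rates below are illustrative.

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 1))

# One parameter group per layer, each with its own learning rate.
optimizer = torch.optim.SGD([
    {'params': model[0].parameters(), 'lr': 0.01},   # first linear layer
    {'params': model[2].parameters(), 'lr': 0.001},  # second linear layer
], momentum=0.9)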
@karpathy
karpathy / min-char-rnn.py
Last active July 6, 2024 15:48
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
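Continuing from the snippet above, a sketch of the usual next step in such a character-level model: build character/index lookup tables and one-hot encode inputs (the helper below is illustrative, not copied from the gist).

char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

def one_hot(index, vocab_size):
    # Encode a character index as a column vector suitable as RNN input.
    x = np.zeros((vocab_size, 1))
    x[index] = 1
    return x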