
Jaemin Cho j-min

@j-min
j-min / test_single_gpu.py
Created November 6, 2016 13:51
TensorFlow single GPU example
from __future__ import print_function
'''
Basic single GPU computation example using TensorFlow library.
Author: Aymeric Damien
Project: https://github.com/aymericdamien/TensorFlow-Examples/
'''
'''
This tutorial requires your machine to have 1 GPU
"/cpu:0": The CPU of your machine.
@j-min
j-min / test_multi_gpu.py
Last active January 15, 2022 01:57
TensorFlow multi GPU example
from __future__ import print_function
'''
Basic Multi GPU computation example using TensorFlow library.
Author: Aymeric Damien
Project: https://github.com/aymericdamien/TensorFlow-Examples/
'''
'''
This tutorial requires your machine to have 2 GPUs
"/cpu:0": The CPU of your machine.
@j-min
j-min / tmux_install.sh
Last active September 24, 2021 12:35
tmux 2.6 install script (linux)
TMUX_VERSION=2.6
cd $HOME
# Dependencies
sudo apt install libevent-dev ncurses-dev -y
# Download tmux
wget https://github.com/tmux/tmux/releases/download/$TMUX_VERSION/tmux-$TMUX_VERSION.tar.gz
tar -xvzf tmux-$TMUX_VERSION.tar.gz
# Build and install from source
cd tmux-$TMUX_VERSION
./configure
make
sudo make install
@j-min
j-min / CaffeInstallation.md
Created September 24, 2019 06:41 — forked from arundasan91/CaffeInstallation.md
Caffe Installation Tutorial for beginners

Caffe

Freshly brewed!

With the availability of huge amounts of data for research and powerful machines to run your code on, Machine Learning and Neural Networks are gaining a foothold again and impacting us more than ever in our everyday lives. With huge players like Google open-sourcing parts of their Machine Learning systems, such as the TensorFlow software library for numerical computation, someone interested in starting off with Machine Learning/Neural Nets has many options to choose from. Caffe, a deep learning framework developed by the Berkeley Vision and Learning Center (BVLC) and its contributors, comes into play with a fresh cup of coffee.

Installation Instructions (Ubuntu 14 Trusty)

The following section is divided into two parts. Caffe's documentation suggests

@j-min
j-min / korail_example.py
Last active September 15, 2019 09:43
Korean express train ticket reservation example
# pip install korail2
# https://github.com/carpedm20/korail2
from korail2 import Korail, NoResultsError, KorailError
from time import sleep
import os
# Login
EMAIL = '' # email
PW = '' # password
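The preview stops at the credentials. A minimal sketch of how such a script typically continues with korail2 (log in, then poll search_train until a bookable train appears) is below; the station names, date, and time are placeholders, and the call signatures (Korail, search_train, reserve) are taken from the korail2 README, so treat them as assumptions rather than verified API.

korail = Korail(EMAIL, PW)  # korail2 also accepts a membership or phone number as the ID

DEP, ARR = '서울', '부산'            # placeholder stations (Seoul -> Busan)
DATE, TIME = '20190915', '090000'   # placeholder yyyymmdd / hhmmss

while True:
    try:
        trains = korail.search_train(DEP, ARR, DATE, TIME)
        reservation = korail.reserve(trains[0])
        print(reservation)
        break
    except NoResultsError:
        # Nothing bookable yet; wait and poll again.
        sleep(10)
    except KorailError as e:
        print(e)
        sleep(10)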
@j-min
j-min / download_video.py
Created July 1, 2019 05:19
Youtube Video Download & Trim
# https://github.com/mps-youtube/pafy
import pafy
if __name__ == '__main__':
    video_url = 'https://www.youtube.com/watch?v=D_Ij3fAps4s'
    video = pafy.new(video_url)
    print(video.title, video.duration)
    best = video.getbest()
    best.download(quiet=False)
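The gist title mentions trimming as well, but that part is not in the preview. A common approach is to cut the downloaded file with ffmpeg; the sketch below assumes ffmpeg is on PATH, that the download above was saved to a known path (e.g. best.download(filepath='downloaded.mp4')), and uses placeholder file names and timestamps.

import subprocess

def trim(src, dst, start='00:00:10', duration='00:00:30'):
    """Copy a clip of `duration` starting at `start` from src to dst without re-encoding."""
    subprocess.check_call([
        'ffmpeg', '-ss', start, '-i', src,
        '-t', duration, '-c', 'copy', dst,
    ])

# trim('downloaded.mp4', 'clip.mp4')  # file names are placeholders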
@j-min
j-min / .vimrc
Last active April 28, 2019 19:56
vimrc plugin settings for Python / JavaScript
set nocompatible " required
" filetype off " required
syntax on
syntax enable
" Monokai color scheme
" mkdir -p ~/.vim/colors
" wget https://raw.githubusercontent.com/crusoexia/vim-monokai/master/colors/monokai.vim ~/.vim/colors
colorscheme monokai
@j-min
j-min / RNN_hunkim's_tutorial_BasicRNNCell.ipynb
Last active December 11, 2018 02:06
TensorFlow 0.9 implementation of BasicRNNCell based on hunkim's tutorial
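The notebook itself is not rendered in this listing. As a rough stand-in, here is a minimal BasicRNNCell forward pass written against the TF 1.x-era API (tf.nn.rnn_cell, tf.nn.dynamic_rnn); the gist targeted TF 0.9, whose call names differ slightly, and the shapes here are made up.

import numpy as np
import tensorflow as tf  # assumes the TF 1.x-era API

batch_size, seq_len, input_dim, hidden_dim = 2, 5, 3, 4

inputs = tf.placeholder(tf.float32, [batch_size, seq_len, input_dim])
cell = tf.nn.rnn_cell.BasicRNNCell(num_units=hidden_dim)

# Unroll the RNN over the time dimension; outputs has shape [batch, time, hidden].
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    x = np.random.rand(batch_size, seq_len, input_dim).astype(np.float32)
    out, state = sess.run([outputs, final_state], feed_dict={inputs: x})
    print(out.shape, state.shape)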
@j-min
j-min / backprop.ipynb
Created March 1, 2017 14:25
Simple backprop implementation in TensorFlow without its optimizer API
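This notebook also fails to render here. The usual way to do what the title describes, backprop without the tf.train optimizers, is to call tf.gradients yourself and apply the updates with tf.assign; the linear-regression sketch below is an illustration under that assumption, not the notebook's code.

import numpy as np
import tensorflow as tf  # assumes the TF 1.x-era API

# Toy data: y = 3x + 2 plus a little noise
x_data = np.random.rand(100, 1).astype(np.float32)
y_data = 3.0 * x_data + 2.0 + 0.01 * np.random.randn(100, 1).astype(np.float32)

X = tf.placeholder(tf.float32, [None, 1])
Y = tf.placeholder(tf.float32, [None, 1])
W = tf.Variable(tf.random_normal([1, 1]))
b = tf.Variable(tf.zeros([1]))

pred = tf.matmul(X, W) + b
loss = tf.reduce_mean(tf.square(pred - Y))

# Manual backprop: compute gradients and apply SGD updates with tf.assign
# instead of using a tf.train.*Optimizer.
lr = 0.1
grad_W, grad_b = tf.gradients(loss, [W, b])
train_op = tf.group(tf.assign(W, W - lr * grad_W),
                    tf.assign(b, b - lr * grad_b))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(200):
        _, l = sess.run([train_op, loss], feed_dict={X: x_data, Y: y_data})
    print('final loss:', l, 'W:', sess.run(W).ravel(), 'b:', sess.run(b))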
@j-min
j-min / exp_lr_scheduler.py
Created June 25, 2017 14:07
learning rate decay in pytorch
# http://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html
def exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7):
    """Decay learning rate by a factor of 0.1 every lr_decay_epoch epochs."""
    lr = init_lr * (0.1 ** (epoch // lr_decay_epoch))
    if epoch % lr_decay_epoch == 0:
        print('LR is set to {}'.format(lr))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
    return optimizer
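A minimal usage sketch: the model, loss, and data below are placeholders, the point is only that the scheduler is called once per epoch before the usual optimization step.

import torch
import torch.nn as nn

model = nn.Linear(10, 1)                                   # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)
criterion = nn.MSELoss()

for epoch in range(20):
    optimizer = exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7)
    inputs = torch.randn(32, 10)                           # placeholder batch
    targets = torch.randn(32, 1)
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()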