Stas Bekman (stas00)

stas00 / PushEvent.py
Created September 23, 2018 20:41 — forked from rubys/PushEvent.py
Allow git-multimail to run as a webhook for GitHub
#!/usr/bin/python
from __future__ import print_function
#
# A simple CGI script useful for debugging GitHub web hooks
# https://developer.github.com/webhooks/
#
import hashlib, hmac, json, os, sys, traceback
from subprocess import Popen, PIPE, STDOUT
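The preview stops after the imports. The heart of such a webhook debugging script is validating GitHub's X-Hub-Signature header against the shared secret before handing the payload on; a minimal sketch of that check, not the gist's actual code (the SECRET value is a placeholder you would configure yourself):

import hashlib, hmac, os, sys

SECRET = b"my-webhook-secret"   # placeholder: the secret configured in the GitHub webhook settings

def signature_is_valid(payload, header):
    # GitHub sends "X-Hub-Signature: sha1=<hexdigest>" computed over the raw request body
    expected = "sha1=" + hmac.new(SECRET, payload, hashlib.sha1).hexdigest()
    return hmac.compare_digest(expected, header or "")

if __name__ == "__main__":
    payload = sys.stdin.buffer.read()                    # CGI delivers the body on stdin
    header = os.environ.get("HTTP_X_HUB_SIGNATURE", "")  # and headers via environment variables
    ok = signature_is_valid(payload, header)
    print("Status: 200 OK" if ok else "Status: 403 Forbidden")
    print()   # blank line ends the CGI headers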
stas00 / wait_till_pip_ver_is_available.sh
Created October 25, 2018 02:41
poll the PyPI servers until a desired version of a specific package becomes available
#!/usr/bin/env bash
#set -x
package="fastai"
ver="1.0.14"
# is_pip_ver_available "1.0.14"
# returns 1 if yes, 0 otherwise
function is_pip_ver_available() {
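The preview cuts off inside is_pip_ver_available. The gist's idea is a poll-and-sleep loop that exits once the version shows up; here is a rough Python sketch of the same workflow using PyPI's JSON API instead of pip (the package name, version and 60-second interval are just example values):

import time
import requests

def pypi_has_version(package, version):
    # look the version up in the "releases" map of PyPI's JSON API
    r = requests.get(f"https://pypi.org/pypi/{package}/json", timeout=10)
    return r.ok and version in r.json().get("releases", {})

package, version = "fastai", "1.0.14"   # example values, as in the gist
while not pypi_has_version(package, version):
    print(f"{package}=={version} is not on PyPI yet, retrying in 60s")
    time.sleep(60)
print(f"{package}=={version} is available")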
stas00 / mnist_to_image_files.py
Created December 3, 2018 05:32
convert the MNIST digits or fashion database into a JPG image file dataset laid out like ImageNet: train/valid/test main subfolders, with class-number subfolders acting as labels
import pathlib, PIL, random, os, gzip
import numpy as np
def load_mnist(path, kind='train'):
"""Load MNIST data from `path`"""
labels_path = os.path.join(path, '%s-labels-idx1-ubyte.gz' % kind)
images_path = os.path.join(path, '%s-images-idx3-ubyte.gz' % kind)
with gzip.open(labels_path, 'rb') as lbpath:
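The preview stops inside load_mnist. A self-contained sketch of the rest of the pipeline, reading the gzipped IDX files with numpy and writing each digit out as a JPG under a per-class folder; the 28x28 reshape and header offsets come from the standard IDX format, while the output layout is my reading of the description, not the gist's exact code:

import gzip, os, pathlib
import numpy as np
from PIL import Image

def load_mnist(path, kind='train'):
    # IDX label files carry an 8-byte header, image files a 16-byte header
    with gzip.open(os.path.join(path, f'{kind}-labels-idx1-ubyte.gz'), 'rb') as lbpath:
        labels = np.frombuffer(lbpath.read(), dtype=np.uint8, offset=8)
    with gzip.open(os.path.join(path, f'{kind}-images-idx3-ubyte.gz'), 'rb') as imgpath:
        images = np.frombuffer(imgpath.read(), dtype=np.uint8, offset=16).reshape(len(labels), 28, 28)
    return images, labels

def save_as_jpgs(images, labels, out_dir='mnist/train'):
    # imagenet-style layout: one subfolder per class label
    for i, (img, label) in enumerate(zip(images, labels)):
        folder = pathlib.Path(out_dir) / str(label)
        folder.mkdir(parents=True, exist_ok=True)
        Image.fromarray(img).save(folder / f'{i}.jpg')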
stas00 / pypi_module_version_is_available.py
Last active December 14, 2018 02:44
a little function that checks whether a given module version is available on the PyPI servers
# usage:
# pypi_module_version_is_available("Pillow", "5.4.0")
import subprocess
def pypi_module_version_is_available(module, version):
"Check whether module==version is available on pypi"
# returns True/False (or None if failed to execute the check)
# using a hack that when passing "module==" w/ no version number to pip
# it "fails" and returns all the available versions in stderr
stas00 / peak_mem_metric.py
Created January 10, 2019 03:03
PeakMemMetric - a custom fastai metric that prints GPU/CPU RAM consumption and peak info per training epoch
import tracemalloc, threading, torch, time, pynvml
from fastai.utils.mem import *
from fastai.vision import *
if not torch.cuda.is_available(): raise Exception("pytorch with a CUDA-capable GPU is required")
def preload_pytorch():
torch.ones((1, 1)).cuda()
def gpu_mem_get_used_no_cache():
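The preview ends at gpu_mem_get_used_no_cache. The idea is to release PyTorch's cached-but-unused blocks first so nvml reports only what is actually in use; a sketch of how that measurement can be done with pynvml (returning MBs is my choice here, not necessarily the gist's):

import pynvml, torch

pynvml.nvmlInit()

def gpu_mem_get_used():
    # ask the driver how much memory the current GPU is using right now, in MBs
    handle = pynvml.nvmlDeviceGetHandleByIndex(torch.cuda.current_device())
    return int(pynvml.nvmlDeviceGetMemoryInfo(handle).used / 2**20)

def gpu_mem_get_used_no_cache():
    # drop PyTorch's cached, currently unused blocks so they don't inflate the reading
    torch.cuda.empty_cache()
    return gpu_mem_get_used()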
stas00 / CaptureStdout.py
Last active January 10, 2019 23:06
a stdout-capturing context manager that handles `\r` resets in the output; simpler to use than contextlib.redirect_stdout
import sys, re
from io import StringIO
# When any function contains print() calls that get overwritten, like progress bars,
# special care needs to be taken, since under pytest -s captured output (capsys
# or contextlib.redirect_stdout) contains any temporary printed strings, followed by
# \r's. This helper function ensures that the buffer will contain the same output
# with and without -s in pytest, by turning:
# foo bar\r tar mar\r final message
# into:
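The comment block is cut short, but it describes collapsing each `\r`-rewritten segment down to its final state. A minimal sketch of a context manager that does that; it is simplified relative to the gist, which also deals with pytest's capsys interaction:

import re, sys
from io import StringIO

class CaptureStdoutSketch:
    def __enter__(self):
        self.buf, self.old = StringIO(), sys.stdout
        sys.stdout = self.buf
        return self

    def __exit__(self, *exc):
        sys.stdout = self.old
        # keep only what follows the last \r on each line, i.e. the final repaint
        self.out = re.sub(r'^.*\r', '', self.buf.getvalue(), 0, re.M)

# with CaptureStdoutSketch() as cs: some_function_with_progress_bars()
# print(cs.out)   # "foo bar\r tar mar\r final message" becomes " final message"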
stas00 / pytorch-tracemalloc-get_traced_memory.py
Created January 12, 2019 18:32
a pytorch equivalent of tracemalloc.get_traced_memory (for this to work, https://github.com/pytorch/pytorch/pull/15985 needs to be merged first; it'll then probably appear in pytorch 1.0.1)
import torch
def consume_gpu_ram(n): return torch.ones((n, n)).cuda()
def consume_gpu_ram_256mb(): return consume_gpu_ram(2**13)
def b2mb(x): return int(x/2**20)
class TorchTracemalloc():
def __enter__(self):
self.begin = torch.cuda.memory_allocated()
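The preview stops inside __enter__. Presumably the class resets the peak counter on entry and reads the deltas in __exit__, mirroring tracemalloc's (current, peak) semantics; a sketch of how the rest could plausibly look, reusing b2mb from above, not the gist's actual code:

class TorchTracemallocSketch():
    "report the used and peaked GPU memory deltas of the wrapped block, in MBs"
    def __enter__(self):
        self.begin = torch.cuda.memory_allocated()
        torch.cuda.reset_max_memory_allocated()   # reset the peak counter (an assumption about the awaited API)
        return self

    def __exit__(self, *exc):
        self.end  = torch.cuda.memory_allocated()
        self.peak = torch.cuda.max_memory_allocated()
        self.used   = b2mb(self.end - self.begin)
        self.peaked = b2mb(self.peak - self.begin)

# with TorchTracemallocSketch() as tt: x = consume_gpu_ram_256mb()
# print(tt.used, tt.peaked)   # roughly 256, 256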
stas00 / (gist title and filename not captured in this listing)
import os, requests, platform, json, subprocess
import tarfile
from zipfile import ZipFile
debug = False
os_type = platform.system().lower()
machine_type = platform.machine().lower()
if debug: print(f'Your OS and machine type are {os_type} and {machine_type}')
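The gist this snippet belongs to isn't identified above, but the imports point at downloading a platform-specific archive and unpacking it with ZipFile or tarfile. A hedged sketch of that pattern (the URL and file names are made-up placeholders):

import platform, tarfile
from zipfile import ZipFile
import requests

os_type = platform.system().lower()   # e.g. 'linux', 'darwin', 'windows'
archive = "tool.zip" if os_type == "windows" else "tool.tar.gz"   # hypothetical artifact names
url = f"https://example.com/downloads/{archive}"                  # placeholder URL

with open(archive, "wb") as f:
    f.write(requests.get(url).content)

if archive.endswith(".zip"):
    ZipFile(archive).extractall(".")
else:
    tarfile.open(archive).extractall(".")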
stas00 / str2func.py
Last active February 20, 2019 06:54
Convert a string of a fully qualified function, class or module into its corresponding Python object, if such exists. See examples at the end.
import sys
def str2func(name):
"Convert a string of a fully qualified function, class or module into its python object"
if isinstance(name, str):
subpaths = name.split('.')
else:
return None
module = subpaths.pop(0)
if module in sys.modules:
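The preview ends at the sys.modules lookup. The usual technique is to import the longest importable prefix of the dotted path and then getattr through the remaining parts; a sketch of that approach under a different name, since the gist's own continuation isn't shown:

import importlib

def str2obj(name):
    "resolve a dotted path such as 'os.path.join' to the object it names, or None"
    parts = name.split('.')
    for i in range(len(parts), 0, -1):
        try:
            obj = importlib.import_module('.'.join(parts[:i]))   # longest importable prefix
        except ImportError:
            continue
        try:
            for attr in parts[i:]:
                obj = getattr(obj, attr)
            return obj
        except AttributeError:
            return None
    return None

# str2obj('os.path.join')  -> <function join>
# str2obj('no.such.thing') -> None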
stas00 / gist:0ba25525df65497cc16fdea9fbda5bc4
Last active February 23, 2019 05:05
A plea for GitHub to fix the CLA signing issue on the user side
Here is a support letter I have just sent to GitHub [2019-02-22]:
--------------------->8---------------->8-------------------->8------------------
Hi,
I contacted you some 6 months ago and it doesn't look like this is a priority for
you, but it is a huge priority for the tens of thousands of projects that now
require PR submitters to sign a CLA before their PR can be accepted.
You probably don't realize that if I looked at the PR changes and the user then