
Pinzhen Xu (pinzhenx)

  • Apple
pinzhenx / memstat.py
Created February 19, 2022 08:09
How is MemAvailable calculated in Linux? A Python version of the kernel's memory-stat calculation
pgsize = 4  # page size in KiB
total_lowwmark = 0
total_reserved = 0
with open('/proc/zoneinfo') as zoneinfo:
    for line in zoneinfo:
        if 'pages free' in line:
            next(zoneinfo)                         # skip the 'min' watermark line
            low = int(next(zoneinfo).split()[1])   # 'low' watermark, in pages
            high = int(next(zoneinfo).split()[1])  # 'high' watermark, in pages
            total_lowwmark += low
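The preview cuts off before the totals are turned into a MemAvailable figure. For reference, a rough sketch of the remainder, modelled on the kernel's si_mem_available() and reading /proc/meminfo; the helper, field choices, and variable names below are my additions, not part of the gist:

```python
# Rough sketch of the rest of the calculation, following the kernel's
# si_mem_available(); the meminfo parsing below is an assumption, not
# taken from the original gist.
def read_meminfo():
    info = {}
    with open('/proc/meminfo') as f:
        for line in f:
            key, value = line.split(':', 1)
            info[key] = int(value.split()[0])  # values are reported in kB
    return info

mi = read_meminfo()
wmark_low_kb = total_lowwmark * pgsize  # summed zone low watermarks, in kB
reserved_kb = total_reserved * pgsize   # would be filled in by the truncated part above

# Free memory minus what the kernel keeps in reserve.
available = mi['MemFree'] - reserved_kb

# Page cache is reclaimable, but only the part above half the cache
# (or the low watermark, whichever is smaller) is counted.
pagecache = mi['Active(file)'] + mi['Inactive(file)']
available += pagecache - min(pagecache // 2, wmark_low_kb)

# Reclaimable slab and other kernel-reclaimable memory get the same haircut.
reclaimable = mi.get('SReclaimable', 0) + mi.get('KReclaimable', 0)
available += reclaimable - min(reclaimable // 2, wmark_low_kb)

print('MemAvailable (approx.):', available, 'kB')
```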
pinzhenx / conv_denorm.ipynb
Last active November 5, 2021 14:01
conv_denorm.ipynb
(notebook preview unavailable)
pinzhenx / mirror.ipynb
Last active August 16, 2021 07:17
Anatomy of mirror strategy
(notebook preview unavailable)
import torch
import pandas as pd

def profile(m, x, nwarm=10, nrun=300):
    # warm-up runs so one-time costs don't skew the measurement
    for _ in range(nwarm):
        m(x)
    # profile nrun forward passes and report the top op's average CPU time
    with torch.autograd.profiler.profile(True) as prof:
        for _ in range(nrun):
            m(x)
    return getattr(prof.key_averages()[0], 'cpu_time') / 1000  # µs -> ms
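A minimal usage sketch for the helper above; the module and input shape are placeholders, not taken from the gist:

```python
# Hypothetical example: time a small conv layer on CPU.
conv = torch.nn.Conv2d(64, 64, kernel_size=3, padding=1)
x = torch.randn(1, 64, 56, 56)
print(f'average cpu time of the top op: {profile(conv, x):.3f} ms')
```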
  1. A simple example of how const prop works
import torch

def foo(x):
    a = 1 + 2
    b = a + 3
    c = b + 4
    return x + c

jit_foo = torch.jit.script(foo)
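To see what constant propagation does to this graph, one option is the internal pass below; it is a private, unstable PyTorch API, so treat this as a sketch rather than the gist's own method:

```python
# _jit_pass_constant_propagation is an internal API; shown only to
# illustrate the effect of the pass on the scripted graph.
graph = jit_foo.graph
torch._C._jit_pass_constant_propagation(graph)
print(graph)  # a = 1 + 2, b = a + 3, c = b + 4 should fold into a single constant 10
```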
<html>
<script src='../dist/webml-polyfill.js'></script>
<script src='third_party/protobuf.min.js'></script>
<script src='util/base.js'></script>
<script src='util/onnx/onnx.js'></script>
<script src='util/onnx/OnnxModelUtils.js'></script>
<script src='util/onnx/OnnxModelImporter.js'></script>
<script>
(async () => {
const res = await fetch('path/to/model.onnx');