Krishna Vishal (krishvishal) — GitHub Gists
import socketserver
import urllib.parse
import http.server
PORT = 4000
data_base = {}
"""
Before your interview, write a program that runs a server that is accessible on
http://localhost:4000/. When your server receives a request on http://localhost:4000/set?somekey=somevalue
@krishvishal
krishvishal / README.md
Created June 15, 2023 04:22 — forked from moble/README.md
Speed up execution of `@everywhere` in julia

As described in detail here, julia can take really excessive amounts of time to execute the first @everywhere statement on many processes — around 1 hour for thousands of processes — even if the actual code being executed everywhere is trivial. Basically, the Distributed functions need to be precompiled to make this happen quickly.

This gist provides a simple way to do so — at least on Slurm clusters (though the same principles should apply elsewhere). Just submit precompile.jl as a batch job (adjusting the SBATCH directives as needed), and it should create a sysimage that you can use to run future batch jobs. Check the end of the Slurm job's log to see exactly how to use the sysimage.

Note that both the original julia process and all processes created with addprocs should use the --sysimage=/path/to/sys_everywhere.so argument. Doing so reduces the time taken to execute the first @everywhere statement.

@krishvishal
krishvishal / high_low_spread_estimator.py
Created February 20, 2022 02:48 — forked from nicklatin/high_low_spread_estimator.py
Computes the high-low spread estimator, an estimate of bid-offer spreads, a measure of liquidity risk. See Corwin & Schultz (2011) for details: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1106193
# high-low spread estimator (hlse)
def hlse(ohlc_df, frequency='daily'):
"""
Computes the high-low spread estimator, an estimate of bid-offer spreads, a measure of liquidity risk.
See Corwin & Schultz (2011) for details: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1106193
Parameters
----------
ohlc_df: DataFrame
DataFrame with DatetimeIndex and Open, High, Low and Close (OHLC) prices from which to compute the high-low spread estimates.
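The core Corwin-Schultz calculation that `hlse()` applies across a DataFrame can be sketched for a single pair of consecutive days. This scalar version is my illustration of the formula, not the gist's implementation; the function name `cs_spread` is mine:

```python
import math

def cs_spread(h1, l1, h2, l2):
    """Spread estimate from two consecutive days' high/low prices."""
    # beta: sum of the squared daily log high-low ranges
    beta = math.log(h1 / l1) ** 2 + math.log(h2 / l2) ** 2
    # gamma: squared log range of the two-day high over the two-day low
    gamma = math.log(max(h1, h2) / min(l1, l2)) ** 2
    k = 3 - 2 * math.sqrt(2)
    alpha = (math.sqrt(2 * beta) - math.sqrt(beta)) / k - math.sqrt(gamma / k)
    # map alpha back to a proportional spread
    return 2 * (math.exp(alpha) - 1) / (1 + math.exp(alpha))
```

For two identical days with high 101 and low 100, alpha reduces to ln(1.01), giving a spread of roughly 0.01, i.e. about 1%.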
@krishvishal
krishvishal / turing-ad-benchmark.jl
Created August 30, 2021 15:43 — forked from torfjelde/turing-ad-benchmark.jl
Convenient code for benchmarking different AD-backends on a particular Turing.jl model.
# Use packages to ensure that we trigger Requires.jl.
using Zygote: Zygote
using ReverseDiff: ReverseDiff
using ForwardDiff: ForwardDiff
using Tracker: Tracker
using Memoization: Memoization # used for ReverseDiff.jl cache.
using Turing.Core: ForwardDiffAD, ReverseDiffAD, TrackerAD, ZygoteAD, CHUNKSIZE
const DEFAULT_ADBACKENDS = [
@krishvishal
krishvishal / rdiff_test.jl
Created August 11, 2020 11:01
This file defines a function f(x) and uses ReverseDiff to compute the gradient of that function. If ReverseDiff were really the problem, this shouldn't run successfully.
using ReverseDiff
f(x) = maximum(x; dims=1)

if abspath(PROGRAM_FILE) == @__FILE__
    x = rand(3, 3)
    println(x)
    println(ReverseDiff.gradient(sum ∘ f, x))
end
@krishvishal
krishvishal / getsize.py
Created November 14, 2019 09:40
Compute the size of a Python object, including recursively nested members
import sys

def getsize(obj_0):
    """Recursively iterate to sum size of object & members."""
    _seen_ids = set()

    def inner(obj):
        obj_id = id(obj)
        if obj_id in _seen_ids:
            return 0
        _seen_ids.add(obj_id)
        size = sys.getsizeof(obj)
        if isinstance(obj, zero_depth_bases):
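The preview cuts off at the leaf-type check. A runnable completion of the same idea might look like this; the `zero_depth_bases` definition below is my assumption (the gist's actual one is cut off), and the seen-id set is what prevents double-counting shared or cyclic references:

```python
import sys
from numbers import Number
from collections import deque

# Assumed leaf types: sized fully by sys.getsizeof, no members to recurse into.
zero_depth_bases = (str, bytes, Number, range, bytearray)

def getsize(obj_0):
    """Recursively sum the size of an object and its members."""
    seen_ids = set()

    def inner(obj):
        obj_id = id(obj)
        if obj_id in seen_ids:
            return 0          # already counted (shared or cyclic reference)
        seen_ids.add(obj_id)
        size = sys.getsizeof(obj)
        if isinstance(obj, zero_depth_bases):
            pass              # leaf type: nothing more to count
        elif isinstance(obj, (tuple, list, set, deque)):
            size += sum(inner(item) for item in obj)
        elif isinstance(obj, dict):
            size += sum(inner(k) + inner(v) for k, v in obj.items())
        if hasattr(obj, '__dict__'):
            size += inner(vars(obj))  # instance attributes
        return size

    return inner(obj_0)
```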
@krishvishal
krishvishal / plot_kernels.py
Last active October 22, 2022 18:27
Visualize convolution filter weights in PyTorch
from model import Net
from trainer import Trainer
import torch
from torch import nn
from matplotlib import pyplot as plt
model = Net()
ckpt = torch.load('path_to_checkpoint')
model.load_state_dict(ckpt['state_dict'])
filter = model.conv1.weight.data.numpy()
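The snippet ends right after extracting the conv1 weights. A sketch of how such a script typically continues is to tile the kernels in a matplotlib grid; the `plot_kernels` function and the random stand-in array below are mine, not the gist's, so the example runs without a trained checkpoint:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt

def plot_kernels(tensor, num_cols=6):
    """Plot each (C, H, W) kernel in `tensor` of shape (N, C, H, W)."""
    num_kernels = tensor.shape[0]
    num_rows = int(np.ceil(num_kernels / num_cols))
    fig = plt.figure(figsize=(num_cols, num_rows))
    for i in range(num_kernels):
        ax = fig.add_subplot(num_rows, num_cols, i + 1)
        # average over input channels so multi-channel kernels render as 2-D
        ax.imshow(tensor[i].mean(axis=0), cmap="gray")
        ax.axis("off")
    return fig

fake_weights = np.random.randn(12, 3, 5, 5)  # stand-in for conv1 weights
fig = plot_kernels(fake_weights)
fig.savefig("kernels.png")
```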