🇺🇦
Fighting for freedom

Oleksii Kachaiev (@kachayev)

kachayev / concurrency-in-go.md
Last active September 23, 2025 16:12
Channels Are Not Enough or Why Pipelining Is Not That Easy
kachayev / dijkstra.py
Last active March 4, 2025 23:42
Dijkstra shortest path algorithm based on python heapq heap implementation
from collections import defaultdict
from heapq import heappush, heappop

def dijkstra(edges, f, t):
    g = defaultdict(list)
    for l, r, c in edges:
        g[l].append((c, r))

    q, seen, mins = [(0, f, ())], set(), {f: 0}
    while q:
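The preview cuts off at the main loop. A minimal sketch of how the rest of the function might look, following the heap-based approach the gist describes (the full gist may differ in details):

```python
from collections import defaultdict
from heapq import heappush, heappop

def dijkstra(edges, f, t):
    g = defaultdict(list)
    for l, r, c in edges:
        g[l].append((c, r))

    # priority queue of (cost so far, node, path), plus best known costs
    q, seen, mins = [(0, f, ())], set(), {f: 0}
    while q:
        cost, v1, path = heappop(q)
        if v1 in seen:
            continue
        seen.add(v1)
        path = (v1, path)
        if v1 == t:
            return cost, path
        for c, v2 in g[v1]:
            if v2 in seen:
                continue
            next_cost = cost + c
            if next_cost < mins.get(v2, float("inf")):
                mins[v2] = next_cost
                heappush(q, (next_cost, v2, path))
    return float("inf"), ()
```

The path is accumulated as a nested tuple, which keeps each queue entry cheap to copy compared to rebuilding a list per entry.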
## Alexey Kachayev, 2014
## Link to slides:
## http://goo.gl/n4ylC4
## Basic:
## type Parser = String -> Tree
## Composition
## type Parser = String -> (Tree, String)
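The two type signatures above capture the key move: a composable parser must hand back the unconsumed input alongside its result. A hedged Python sketch of that idea (names are illustrative, not taken from the slides):

```python
# A parser is a function: str -> (result, remaining) on success, or None on failure.
def char(c):
    def parse(s):
        if s and s[0] == c:
            return c, s[1:]
        return None
    return parse

def seq(p1, p2):
    # Compose two parsers: run p1, then feed its leftover input into p2.
    def parse(s):
        r1 = p1(s)
        if r1 is None:
            return None
        v1, rest = r1
        r2 = p2(rest)
        if r2 is None:
            return None
        v2, rest2 = r2
        return (v1, v2), rest2
    return parse
```

With the `String -> Tree` type from the "Basic" line, `seq` could not be written at all: there would be no way to know where the first parser stopped reading.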
kachayev / topological.py
Last active December 30, 2022 10:21
Topological sort with Python (using DFS and gray/black colors)
# Simple:
# a --> b
#   --> c --> d
#   --> d
graph1 = {
    "a": ["b", "c", "d"],
    "b": [],
    "c": ["d"],
    "d": []
}
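The gist title mentions DFS with gray/black coloring: gray marks a node on the current DFS path, black marks a finished node, so hitting a gray node means a cycle. A minimal sketch of that approach (the gist itself may differ):

```python
GRAY, BLACK = 0, 1

def topological(graph):
    order, state = [], {}

    def dfs(node):
        state[node] = GRAY
        for neighbor in graph.get(node, []):
            color = state.get(neighbor)
            if color == GRAY:
                # back edge: neighbor is on the current DFS path
                raise ValueError("cycle detected")
            if color is None:
                dfs(neighbor)
        state[node] = BLACK
        order.append(node)

    for node in graph:
        if node not in state:
            dfs(node)
    # nodes finish in reverse topological order
    return order[::-1]
```

On `graph1` above this yields an ordering where "a" comes before everything and "c" comes before "d".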
kachayev / aleph-planning.md
Last active December 12, 2022 16:28
A few thoughts on Aleph development

Aleph, Async, HTTP, Clojure

I've been working with Aleph for roughly the last 5 years, actively contributing to the library for the last 2 (or so). I have also put some effort into spreading the word about it, including educational tech talks such as "Deep HTTP Dive Through Aleph & Netty". But the more I talk to people, the more confusion I find, mostly about how Aleph works and what you can expect when adding it to your stack. Clojurists Together recently announced that Aleph will receive Q1 funding, so I think it's a good time to share my priorities and thoughts on the development plans mentioned in the blog post. I hope the community finds this interesting and helpful.

Aleph describes itself as an "asynchronous communication for Clojure" library, and you should probably pay close attention to the first word: "asynchronous".

kachayev / css-parser.md
Last active November 12, 2022 04:20
Parsing CSS file with monadic parser in Clojure
kachayev / logic.py
Last active July 31, 2022 12:26
Lazy developer's approach to do exercises on Propositional Logic
# -*- coding: utf-8 -*-
from tabulate import tabulate

class Var(object):
    def __init__(self, name):
        self.name = name
        self.value = None

    def bind(self, value):
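The preview cuts off at `bind`. One way the "lazy" approach might play out is binding each variable to every combination of truth values and collecting the rows of a truth table. A self-contained sketch of that idea, without the `tabulate` dependency and with illustrative names:

```python
from itertools import product

class Var:
    def __init__(self, name):
        self.name = name
        self.value = None

    def bind(self, value):
        self.value = value

    def __call__(self):
        return self.value

def truth_table(variables, formula):
    # Evaluate `formula` (a zero-argument callable over the Vars)
    # for every assignment of True/False to the given variables.
    rows = []
    for values in product([True, False], repeat=len(variables)):
        for var, value in zip(variables, values):
            var.bind(value)
        rows.append(values + (formula(),))
    return rows
```

With `tabulate`, each row could then be printed as one line of the exercise's truth table.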
kachayev / go-channels-1-generator.go
Created July 16, 2012 19:42
Channels-driven concurrency with Go
// Channels-driven concurrency with Go
// Code examples from Rob Pike's talk on Google I/O 2012:
// http://www.youtube.com/watch?v=f6kdp27TYZs&feature=youtu.be
//
// Concurrency is the key to designing high performance network services.
// Go's concurrency primitives (goroutines and channels) provide a simple and efficient means
// of expressing concurrent execution. In this talk we see how tricky concurrency
// problems can be solved gracefully with simple Go code.
// (1) Generator: function that returns the channel
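The first pattern from the talk, a function that launches a goroutine and returns the channel it feeds, has a close Python analogue using a background thread and a queue. A sketch of the analogy (not from the gist itself):

```python
import threading
from queue import Queue

def generator(msg, n):
    # Launch a background "goroutine" and return the channel it feeds;
    # the caller only ever sees the receiving end.
    ch = Queue()

    def produce():
        for i in range(n):
            ch.put(f"{msg} {i}")

    threading.Thread(target=produce, daemon=True).start()
    return ch
```

Usage mirrors the Go version: `ch = generator("boring!", 3)` and then `ch.get()` in a loop, with `Queue.get` blocking the way a receive on an unbuffered channel does.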
kachayev / slimevolley_ppo_selfplay.py
Last active January 25, 2022 17:27
Converting RL loop (PPO, DQN, etc) into a selfplay setting by changing 2 lines
"""
Imagine we have a training loop for an agent. E.g. PPO, or DQN, or whatever.
What is the easiest way to convert this into a selfplay?
To make this happen we want to run 2 identical loop: 1 loop for each agent.
But. There are 2 sync points: `env.reset()`, and each `env.step()`.
Is there a way to avoid duplicating code for setting/loading agent model, creating/updating
buffers, running training loops etc... Yes, it is. By utilizing generators protocol: `yield` and `send`.
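The core trick can be sketched in miniature: the training loop becomes a generator that `yield`s at every env interaction, and a driver interleaves two identical copies of it over one shared environment via `send`. A hedged sketch with a fake env (names and the env protocol here are illustrative, not the gist's actual code):

```python
def agent_loop(policy, steps):
    # An ordinary training loop rewritten as a generator: every point
    # where it would touch the env becomes a `yield`, and the driver
    # `send`s the resulting observation back in.
    obs = yield "reset"
    for _ in range(steps):
        action = policy(obs)
        obs = yield action

def run_selfplay(loop_a, loop_b, reset_env, step_env):
    # Drive two identical loops against one shared environment.
    loop_a.send(None)                      # advance to the "reset" yield
    loop_b.send(None)
    obs_a, obs_b = reset_env()
    history = []
    try:
        while True:
            act_a = loop_a.send(obs_a)     # resume loop A until its next yield
            act_b = loop_b.send(obs_b)
            obs_a, obs_b = step_env(act_a, act_b)
            history.append((act_a, act_b))
    except StopIteration:
        return history
```

Neither loop knows the other exists: all model setup, buffers, and training logic live in `agent_loop` once, and only the two sync points are coordinated by the driver.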
kachayev / encoder-rnn-simplified.py
Last active May 3, 2021 21:06
Simplified Encoder implementation for "NLP From Scratch: Translation with a Sequence to Sequence Network and Attention" tutorial
#
# The following is the simplification of EncoderRNN code from
# "NLP From Scratch: Translation with a Sequence to Sequence Network and Attention"
# PyTorch tutorial (link: https://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html)
#
# In fact, the `nn.GRU` module can itself iterate over a whole input sequence when
# given one. In the tutorial, the sentence is presented as a tensor of 1-word
# sequences, hence the loop is handled "manually" in `train` and `evaluate`. Given
# that `nn.Embedding` accepts a batch of indices and `nn.GRU` runs the loop over
# input shaped (seq_len, batch_size, elem_dim), the encoder could be constructed
# as follows:
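Following that comment, the simplified encoder might look like this; the class and argument names are illustrative stand-ins for the tutorial's `EncoderRNN`:

```python
import torch
import torch.nn as nn

class EncoderRNN(nn.Module):
    # Simplified encoder: both the embedding lookup and the GRU accept the
    # whole sequence at once, so no explicit per-token Python loop is needed.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(input_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size)

    def forward(self, seq):
        # seq: (seq_len, batch_size) tensor of token indices
        embedded = self.embedding(seq)        # (seq_len, batch, hidden)
        output, hidden = self.gru(embedded)   # GRU iterates over seq_len internally
        return output, hidden
```

`output` holds the GRU state at every position, `hidden` only the final one, matching what the tutorial's manual loop accumulates step by step.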