Julik Tarkhanov julik

@julik
julik / deploy.sh
Created April 30, 2024 08:31
The Deployinator™©
#!/usr/bin/env bash
set -e
# This script will build a Docker image from the current state of the working tree.
# It will tag it with the short Git sha, and then push it to Quay.io under our company
# account. Afterwards the image can be picked up from other hosts to deploy.
export DEPLOY_HOST=<someapp>.<my-domain>
export REV="$(git log -1 --pretty=%h)"
# The build/tag/push steps described in the comment above; the image path
# placeholders follow the same <...> convention as DEPLOY_HOST
docker build -t "quay.io/<company>/<someapp>:$REV" .
docker push "quay.io/<company>/<someapp>:$REV"
class StreamingController < ApplicationController
  # Wraps a block in an IO#write-like object: forwards every chunk to the
  # block and returns the number of bytes written
  class Writer
    def initialize(&blk)
      @blk = blk
    end

    def write(stuff)
      @blk.call(stuff)
      stuff.bytesize
    end
  end
end
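Out of the controller context, the Writer's contract is small. A standalone copy (restated here only so the sketch runs on its own) with a usage example:

```ruby
# Standalone copy of the Writer above: it forwards whatever is written to a
# block and reports the number of bytes written, IO#write-style.
class Writer
  def initialize(&blk)
    @blk = blk
  end

  def write(stuff)
    @blk.call(stuff)
    stuff.bytesize
  end
end

buffer = +""
writer = Writer.new { |chunk| buffer << chunk }
writer.write("hello") # returns 5, buffer now contains "hello"
```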
@julik
julik / pecobox.rb
Created February 9, 2024 14:59
A circuit breaker using Pecorino leaky buckets
# frozen_string_literal: true
# Pecobox is a Circuitbox-like class which uses Pecorino for
# measuring error rates. It is less resilient than Circuitbox
# because your error stats are stored in the database, but guess what:
# if your database is down your product is already down, so there is
# nothing to circuit-protect. And having a shared data store for
# the circuits allows us to track state across machines and processes,
# so we do not have to have every worker hammer an already failing
# resource right after startup.
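A minimal sketch of the circuit-breaker pattern the comment describes. This is an assumption, not the actual Pecobox code: Pecorino's database-backed bucket is replaced with a tiny in-memory leaky bucket so the example is self-contained, and re-closing the circuit after a cool-down is omitted.

```ruby
# In-memory stand-in for a Pecorino leaky bucket: fillup(n) adds n tokens
# after applying the leak, and returns true once the bucket overflows.
class LeakyBucket
  def initialize(capacity:, leak_rate:, clock: -> { Process.clock_gettime(Process::CLOCK_MONOTONIC) })
    @capacity, @leak_rate, @clock = capacity, leak_rate, clock
    @level = 0.0
    @at = clock.call
  end

  def fillup(n)
    now = @clock.call
    @level = [@level - ((now - @at) * @leak_rate), 0.0].max + n
    @at = now
    @level >= @capacity
  end
end

class Pecobox
  CircuitOpen = Class.new(StandardError)

  def initialize(error_capacity:, leak_rate:)
    @bucket = LeakyBucket.new(capacity: error_capacity, leak_rate: leak_rate)
    @open = false
  end

  # Runs the block; every raised error fills the bucket, and once the bucket
  # overflows the circuit opens and further calls are refused outright.
  def run
    raise CircuitOpen, "refusing call, error budget exhausted" if @open
    yield
  rescue CircuitOpen
    raise
  rescue => e
    @open = true if @bucket.fillup(1)
    raise e
  end
end
```

With `error_capacity: 3`, three consecutive failures trip the breaker and subsequent calls raise `Pecobox::CircuitOpen` without touching the failing resource.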
@julik
julik / localities.sql
Created June 4, 2023 10:23
Versioned table with grouping by updated_at
CREATE TABLE localities (
  id uuid DEFAULT gen_random_uuid() PRIMARY KEY,
  source_updated_at TIMESTAMP,
  name TEXT
);

INSERT INTO localities (name, source_updated_at) VALUES
  ('London', '2023-04-01'),
  ('London', '2023-03-15'),
  ('Paris', '2023-02-11'),
  ('Canberra', '2022-10-05');
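A likely companion query for a versioned table like this (an assumption based on the gist's "grouping by updated_at" description): pick the most recent version of each locality using PostgreSQL's DISTINCT ON.

```sql
-- For each name, keep only the row with the latest source_updated_at
SELECT DISTINCT ON (name) id, name, source_updated_at
FROM localities
ORDER BY name, source_updated_at DESC;
```

On the sample data above this returns one row per name: the 2023-04-01 London, plus Paris and Canberra.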
# This offers just the leaky bucket implementation with fill control, but without the timed lock.
# It does not raise any exceptions; it just tracks the state of a leaky bucket in Postgres.
#
# The leak rate is specified directly in tokens per second instead of a block period.
# The bucket level is stored and returned as a Float, which allows for finer-grained measurement
# and, more importantly, makes testing from the outside easier.
#
# Note that this implementation has a peculiar property: the bucket is only "full" once it overflows.
# Due to the leak rate, just a few microseconds after that moment the bucket will no longer be full,
# as it will have leaked some tokens by then. This means that the information about whether the
# bucket is full is only true at the moment of the fillup, and has to be captured right then.
mft_gap_mm = 96
offset_outer_mm = 70 # (mft_gap_mm / 2)
# Long side needs an odd number of gaps so that the center span of the
# workbench ends up between two rows of holes and never overlaps the holes
1.step(22, 2) do |n_gaps_x|
  1.upto(10) do |n_gaps_y|
    width_mm = (offset_outer_mm * 2) + (mft_gap_mm * n_gaps_x)
    height_mm = (offset_outer_mm * 2) + (mft_gap_mm * n_gaps_y)
    # First line printed: "236x236mm with 2 holes along and 2 holes across"
    puts "#{width_mm}x#{height_mm}mm with #{n_gaps_x + 1} holes along and #{n_gaps_y + 1} holes across"
  end
end
@julik
julik / INSTALL.command
Created May 20, 2021 16:01
Python 3 compatible Logik-Matchbook install script
#!/usr/bin/env python
import contextlib as __stickytape_contextlib

@__stickytape_contextlib.contextmanager
def __stickytape_temporary_dir():
    import tempfile
    import shutil
    dir_path = tempfile.mkdtemp()
    try:
        yield dir_path
    finally:
        shutil.rmtree(dir_path)
PAYLOAD_SIZE = 15 * 1024 * 1024
CHUNK_SIZE = 65 * 1024 # Roughly one socket buffer
class StufferBody
  # Yields PAYLOAD_SIZE bytes of deterministic pseudo-random data in
  # CHUNK_SIZE pieces, suitable as a streaming response body
  def each
    rng = Random.new(123)
    whole_chunks, rem = PAYLOAD_SIZE.divmod(CHUNK_SIZE)
    whole_chunks.times do
      yield(rng.bytes(CHUNK_SIZE))
    end
    yield(rng.bytes(rem)) if rem > 0
  end
end
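A self-contained sanity check (the constants and class are restated from the fragment above so the sketch runs on its own): draining the body yields exactly PAYLOAD_SIZE bytes, identically on every run thanks to the fixed RNG seed.

```ruby
PAYLOAD_SIZE = 15 * 1024 * 1024
CHUNK_SIZE = 65 * 1024 # Roughly one socket buffer

class StufferBody
  def each
    rng = Random.new(123)
    whole_chunks, rem = PAYLOAD_SIZE.divmod(CHUNK_SIZE)
    whole_chunks.times { yield(rng.bytes(CHUNK_SIZE)) }
    yield(rng.bytes(rem)) if rem > 0
  end
end

total = 0
StufferBody.new.each { |chunk| total += chunk.bytesize }
# total == PAYLOAD_SIZE (15 MiB)
```

Since the body only needs to respond to #each, it can be handed directly to any server that consumes chunk-yielding response bodies.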
# In a couple of tables we have VARBINARY columns which either encode a packed
# integer of some kind, or a checksum (which is commonly shared as a hexadecimal).
# This module allows rapid definition of such columns and accessors for them. For example,
# imagine you want to store a CRC32 value - which fits into a uint4
#
# class RecordWithChecksum < ActiveRecord::Base
# extend PackedColumn
# packed_value_column :crc32, pattern: 'V'
# end
#
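A minimal sketch of what such a module might look like. This is an assumption, not the author's code: the macro defines reader and writer methods that pack and unpack an integer through a raw-bytes attribute using Array#pack / String#unpack1, and the `_packed` backing accessor is a hypothetical stand-in for the real VARBINARY column.

```ruby
module PackedColumn
  def packed_value_column(name, pattern:)
    backing = :"#{name}_packed" # hypothetical raw-bytes accessor; in ActiveRecord
                                # this would be the VARBINARY column itself
    define_method(name) do
      raw = public_send(backing)
      raw && raw.unpack1(pattern)
    end
    define_method(:"#{name}=") do |value|
      public_send(:"#{backing}=", [value].pack(pattern))
    end
  end
end

# Plain-Ruby stand-in for the model from the comment above
class RecordWithChecksum
  attr_accessor :crc32_packed
  extend PackedColumn
  packed_value_column :crc32, pattern: 'V' # 'V' = 32-bit unsigned little-endian
end
```

Writing `rec.crc32 = 0xDEADBEEF` stores four packed bytes in `crc32_packed`, and reading `rec.crc32` unpacks them back into the integer.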
require 'zlib'

# A fairly naive generator of payloads which, having the same size, will produce
# an identical CRC32 checksum. The function below deterministically returns
# payload number `iter` for a given seed and size, so any candidate payload can
# be regenerated later from just (bytebag_size, random_seed, iter).
def produce_bytes_at_iteration(bytebag_size, random_seed, iter)
  rng = Random.new(random_seed)
  # Advance the RNG past the first `iter` payloads
  iter.times do
    rng.bytes(bytebag_size)
  end
  rng.bytes(bytebag_size)
end