osman - オスマン (osbm)
even fake things have their value

let
  budgetIsDefined = value: (builtins.tryEval value).success;
in
import <nixpkgs/nixos> {
  configuration =
    { config, lib, ... }:
    {
      imports = [
        {
@matthen
matthen / hello_world.py
Last active November 25, 2024 21:51
Hello world in Python, using a genetic algorithm
"""Hello world, with a genetic algorithm.
https://twitter.com/matthen2/status/1769368467067621791
"""
import random
import time
from dataclasses import dataclass
from itertools import chain
from typing import Iterable, List
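
The preview stops at the imports. As a rough sketch of the general idea only, not matthen's actual implementation, a genetic-algorithm "hello world" can keep a population of random strings, score each one by how many characters match the target, and refill every generation with mutated copies of the fittest half. The helper names, population size, and mutation rate below are illustrative choices:

import random
import string

TARGET = "Hello world"
ALPHABET = string.ascii_letters + " "

def fitness(candidate: str) -> int:
    # Count positions that already match the target string.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate: str, rate: float = 0.05) -> str:
    # Replace each character with a random one with probability `rate`.
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else c
        for c in candidate
    )

def evolve(population_size: int = 200) -> str:
    population = [
        "".join(random.choice(ALPHABET) for _ in TARGET)
        for _ in range(population_size)
    ]
    while True:
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            return population[0]
        # Keep the fittest half and refill with mutated copies of survivors.
        survivors = population[: population_size // 2]
        population = survivors + [mutate(random.choice(survivors)) for _ in survivors]

if __name__ == "__main__":
    print(evolve())

Because the best individual is carried over unchanged each generation, fitness never decreases and the loop terminates once the target string is reached.
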
@0atman
0atman / configuration.nix
Last active February 17, 2025 00:56
A rebuild script that commits on a successful build
{
  config,
  pkgs,
  options,
  ...
}: let
  hostname = "oatman-pc"; # to allow per-machine config
in {
  networking.hostName = hostname;
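
The preview above only shows the per-machine hostname wiring, not the rebuild script named in the description. As a loose illustration of the idea (rebuild first, commit only when the build succeeded), and not 0atman's actual script, a sketch could look like this; the config directory and commit message are placeholders:

import subprocess
import sys

# Illustrative only: directory of the NixOS configuration, tracked in git.
CONFIG_DIR = "/etc/nixos"


def main() -> int:
    # Rebuild first; a non-zero exit status means the build failed.
    result = subprocess.run(["sudo", "nixos-rebuild", "switch"], cwd=CONFIG_DIR)
    if result.returncode != 0:
        print("rebuild failed, not committing", file=sys.stderr)
        return result.returncode
    # Only a successful build gets recorded in git history.
    subprocess.run(["git", "add", "-A"], cwd=CONFIG_DIR, check=True)
    subprocess.run(
        ["git", "commit", "-m", "nixos-rebuild: successful build"],
        cwd=CONFIG_DIR,
        check=True,
    )
    return 0


if __name__ == "__main__":
    sys.exit(main())
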
@schacon
schacon / better-git-branch.sh
Created January 13, 2024 18:41
Better Git Branch output
#!/bin/bash
# ANSI color codes used when printing branch information
RED='\033[0;31m'
GREEN='\033[0;32m'
BLUE='\033[0;34m'
YELLOW='\033[0;33m'
NO_COLOR='\033[0m'
@veekaybee
veekaybee / normcore-llm.md
Last active February 17, 2025 22:00
Normcore LLM Reads

Anti-hype LLM reading list

Goals: add links that give clear, reasonable explanations of how things work. No hype and, where possible, no vendor content. Practical first-hand accounts of models in production are especially welcome.

Foundational Concepts

Pre-Transformer Models

{ pkgs, ... }:
{
  # ...
  # Advertise a custom "expose-cuda" system feature and register a Python
  # pre-build hook that the Nix daemon runs before each build.
  nix.settings.system-features = [ "expose-cuda" ];
  nix.settings.pre-build-hook = pkgs.writers.writePython3 "nix-pre-build.py" { } (builtins.readFile ./nix-pre-build-hook.py);
  # ...
}
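
The hook script itself (./nix-pre-build-hook.py) is not shown here. A hypothetical sketch, assuming the goal is to expose the NVIDIA device nodes to builds that request the expose-cuda feature: Nix passes the derivation path (and, with sandboxing, the sandbox directory) as arguments, and a hook may print "extra-sandbox-paths" followed by paths and a terminating empty line to have them bind-mounted into the sandbox. The substring check and device glob below are assumptions, not the gist's real logic:

import glob
import sys


def main() -> None:
    # argv[1] is the .drv path handed to the hook by the Nix daemon.
    drv_path = sys.argv[1] if len(sys.argv) > 1 else ""
    try:
        drv_text = open(drv_path).read()
    except OSError:
        return
    # Crude, illustrative check: only act on derivations mentioning the feature.
    if "expose-cuda" not in drv_text:
        return
    print("extra-sandbox-paths")
    for dev in glob.glob("/dev/nvidia*"):
        print(dev)
    print()  # an empty line terminates the path list


main()
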
@osbm
osbm / turkey_earthquake.js
Created February 12, 2023 00:47
This code can be run in the Google Earth Engine web editor.
var geometry =
  /* color: #ffc82d */
  /* shown: false */
  ee.Geometry.Polygon(
    [[[36.803785364345735, 37.6022824651851],
      [36.84292415829105, 37.569634979372736],
      [36.8896160528223, 37.548134235962486],
      [36.96171383114261, 37.54377889953015],
      [36.99741939754886, 37.565280899470345],
      [36.97819332333011, 37.58759286852694],
@bashbunni
bashbunni / .zshrc
Created January 4, 2023 16:28
CLI Pomodoro for Linux
# study stream aliases
# Requires https://github.com/caarlos0/timer to be installed; spd-say should ship with your distro.
declare -A pomo_options            # session name -> length in minutes
pomo_options["work"]="45"
pomo_options["break"]="10"

pomodoro () {
  if [ -n "$1" ] && [ -n "${pomo_options["$1"]}" ]; then
    val=$1
@veekaybee
veekaybee / chatgpt.md
Last active December 24, 2024 20:23
Everything I understand about chatgpt

ChatGPT Resources

Context

ChatGPT appeared like an explosion on all my social media timelines in early December 2022. While I keep up with machine learning as an industry, I wasn't focused so much on this particular corner, and all the screenshots seemed like they came out of nowhere. What was this model? How did the chat prompting work? What was the context of OpenAI doing this work and collecting my prompts for training data?

I decided to do a quick investigation. Here's all the information I've found so far. I'm aggregating and synthesizing it as I go, so it's currently changing pretty frequently.

Model Architecture

#!/usr/bin/env python3
"""
Use Unicode flag emoji homoglyphs to re-encode UTF-8.
To encode, a string is first encoded as UTF-8, then each byte is
broken down into bits, and each bit is encoded as a flag:
The more well-known country flag in each pair represents 0, and the
less well-known one represents 1. Pairs are cycled through.
There are 8 pairs total, which is the most I could find without repeats.
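
The preview cuts off before the pair table and the encoder itself. A rough sketch of the scheme the docstring describes, using three illustrative lookalike pairs instead of the gist's actual eight, with the better-known flag of each pair standing for bit 0 and the lookalike for bit 1 (the pair table and function names here are mine, not the gist's):

# Illustrative homoglyph pairs only; the real gist uses a table of 8 pairs.
FLAG_PAIRS = [
    ("\U0001F1FA\U0001F1F8", "\U0001F1FA\U0001F1F2"),  # US / US Minor Outlying Islands
    ("\U0001F1F7\U0001F1F4", "\U0001F1F9\U0001F1E9"),  # Romania / Chad
    ("\U0001F1EE\U0001F1E9", "\U0001F1F2\U0001F1E8"),  # Indonesia / Monaco
]


def encode(text: str) -> str:
    flags = []
    bit_index = 0
    for byte in text.encode("utf-8"):
        for shift in range(7, -1, -1):  # most significant bit first
            bit = (byte >> shift) & 1
            pair = FLAG_PAIRS[bit_index % len(FLAG_PAIRS)]
            flags.append(pair[bit])
            bit_index += 1
    return "".join(flags)


def decode(flags_text: str) -> str:
    # Each flag is two regional-indicator code points, i.e. two str characters.
    chunks = [flags_text[i:i + 2] for i in range(0, len(flags_text), 2)]
    bits = [FLAG_PAIRS[i % len(FLAG_PAIRS)].index(chunk) for i, chunk in enumerate(chunks)]
    data = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        data.append(byte)
    return data.decode("utf-8")

Decoding only round-trips if both sides use the same pair table and the same cycling order as the encoder.
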