silphendio / lexer.py
Created April 24, 2024 20:45
lexer for my own programming language
from dataclasses import dataclass
import re

ObjType = int

class Primitives:
    # tag values for the primitive token types
    [STRING, CHAR, INT, FLOAT, SYMBOL] = range(1, 6)

@dataclass
class Object:
    type: ObjType
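The preview cuts off after the Object dataclass. As a rough illustration of how a regex-based lexer could use the definitions above, here is a minimal tokenizer sketch; the Token class, the patterns, and tokenize() are assumptions for illustration, not the gist's actual code.

# Hypothetical sketch (not from the gist): a minimal regex-based tokenizer that
# classifies input into the primitive types declared above.
@dataclass
class Token:
    type: ObjType
    text: str

TOKEN_PATTERNS = [          # order matters: FLOAT must be tried before INT
    (Primitives.STRING, r'"[^"]*"'),
    (Primitives.CHAR,   r"'(\\.|[^'])'"),
    (Primitives.FLOAT,  r'\d+\.\d+'),
    (Primitives.INT,    r'\d+'),
    (Primitives.SYMBOL, r'[A-Za-z_]\w*|[^\s\w]'),
]

def tokenize(source: str) -> list[Token]:
    tokens, pos = [], 0
    while pos < len(source):
        if source[pos].isspace():           # skip whitespace
            pos += 1
            continue
        for tok_type, pattern in TOKEN_PATTERNS:
            m = re.match(pattern, source[pos:])
            if m:
                tokens.append(Token(tok_type, m.group()))
                pos += m.end()
                break
        else:
            raise ValueError(f"unexpected character {source[pos]!r} at position {pos}")
    return tokens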

silphendio / math_parser.py
Created February 2, 2024 04:26
a simple math expression parser
# This is a simple math expression parser
# it evaluates expressions the way we learned in math class
# and optionally prints the intermediate steps
#
# malformed input raises either ValueError or IndexError
# supported: numbers (int, float, complex), parentheses, and the operators +, -, /, * and ^
# ^ is used for exponents, because ** would be a pain to implement
#
# licensed under Zero-Clause BSD (https://opensource.org/license/0bsd/)
# - silphendio
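The listing shows only the header comment. Below is a minimal sketch of the kind of recursive-descent evaluator the comment describes, with one function per precedence level and a right-associative ^; the function names are illustrative assumptions, and complex numbers and step-printing are omitted.

# Hypothetical sketch of the approach described above (recursive descent with
# one function per precedence level); not the gist's actual implementation.
import re

def tokenize(expr: str) -> list[str]:
    return re.findall(r'\d+\.\d+|\d+|[()+\-*/^]', expr.replace(' ', ''))

def parse_expr(tokens):          # lowest precedence: + and -
    value = parse_term(tokens)
    while tokens and tokens[0] in '+-':
        op = tokens.pop(0)
        rhs = parse_term(tokens)
        value = value + rhs if op == '+' else value - rhs
    return value

def parse_term(tokens):          # * and /
    value = parse_power(tokens)
    while tokens and tokens[0] in '*/':
        op = tokens.pop(0)
        rhs = parse_power(tokens)
        value = value * rhs if op == '*' else value / rhs
    return value

def parse_power(tokens):         # ^ is right-associative
    value = parse_atom(tokens)
    if tokens and tokens[0] == '^':
        tokens.pop(0)
        value = value ** parse_power(tokens)
    return value

def parse_atom(tokens):          # numbers and parentheses
    tok = tokens.pop(0)          # IndexError on truncated input
    if tok == '(':
        value = parse_expr(tokens)
        tokens.pop(0)            # discard ')'
        return value
    return float(tok)            # ValueError on anything else

print(parse_expr(tokenize('2 ^ 3 * (4 + 1)')))  # 40.0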

silphendio / exl_slice_test.py
Last active February 15, 2024 23:55
Create LLM slices at runtime with exllamav2
# to use this, first install Python and exllamav2 (https://github.com/turboderp/exllamav2)
# load a model, rearrange the layers as you like, set generation parameters, and run it
# duplicate layers share tensors, but still need extra memory for the cache
# thanks to @dnhkng for showing that the cache needs to be re-created
# licensed under WTFPL (http://www.wtfpl.net/about/) - Silphendio
from exllamav2 import *
from exllamav2.generator import *
import sys, torch
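The preview stops after the imports. The following continues from them as a rough sketch of the workflow the comments describe, assuming an early-2024 exllamav2 API (ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer, ExLlamaV2BaseGenerator, ExLlamaV2Sampler) and assuming model.modules lays the network out as [embedding, (attention, mlp) per layer, final norm, lm head]; the path, layer arrangement, and index arithmetic are assumptions, not verified code from the gist.

# Rough sketch, continuing from the imports above; the path, layer arrangement,
# and the model.modules layout are assumptions and may differ between versions.
model_dir = '/path/to/exl2-model'            # hypothetical local model folder

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
model.load()
tokenizer = ExLlamaV2Tokenizer(config)

# rearrange layers at runtime, e.g. repeat a block of layers (self-merge style);
# assumed layout: [embedding, (attention, mlp) per layer, final norm, lm head]
layer_arrangement = list(range(0, 16)) + list(range(8, 24))
orig_modules = model.modules
model.modules = orig_modules[:1]                        # embedding
for i in layer_arrangement:
    model.modules += orig_modules[i*2 + 1 : i*2 + 3]    # attention + mlp of layer i
model.modules += orig_modules[-2:]                      # final norm + lm head
model.head_layer_idx = len(model.modules) - 1
model.config.num_hidden_layers = len(layer_arrangement)
model.last_kv_layer_idx = len(model.modules) - 4

# the KV cache must be (re-)created after the layer list changes
cache = ExLlamaV2Cache(model)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
print(generator.generate_simple("Once upon a time,", settings, 200))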

silphendio / frankenmerge-test.py
Last active April 18, 2024 06:40
Cutting up a llama and putting it back together
# A simple script to demonstrate the slicing and recombination of models at runtime
# inspired by mergekit
# Sadly, it doesn't work with quantized models.
#
# public domain - silphendio
from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
import torch
model_path = 'gpt2' # huggingface name or local folder
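The preview ends at model_path. Continuing from the imports and model_path above, here is a minimal sketch of the runtime-slicing idea for GPT-2, whose decoder blocks live in model.transformer.h; the layer arrangement and prompt are made up for illustration, and the KV cache is disabled for the smoke test so that duplicated (shared) blocks cannot collide on per-layer cache bookkeeping.

# Minimal sketch, continuing from the imports and model_path above; the layer
# arrangement and prompt are illustrative, not taken from the gist.
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

# GPT-2 keeps its decoder blocks in model.transformer.h (an nn.ModuleList).
# Repeating blocks builds a deeper "frankenmerged" model in place; the repeated
# entries are the same objects, so their weights are shared.
blocks = list(model.transformer.h)
layer_arrangement = list(range(0, 9)) + list(range(6, 12))   # example: repeat blocks 6-8
model.transformer.h = torch.nn.ModuleList(blocks[i] for i in layer_arrangement)
model.config.n_layer = len(model.transformer.h)

# quick smoke test; use_cache=False sidesteps per-layer cache indexing for the
# duplicated blocks
streamer = TextStreamer(tokenizer)
inputs = tokenizer("The quick brown fox", return_tensors="pt")
model.generate(**inputs, max_new_tokens=40, do_sample=True,
               use_cache=False, streamer=streamer)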