
shawwn / glob.cpp
Created June 15, 2023 06:16
A simple glob function for POSIX systems that returns a vector of strings
#include <glob.h>
#include <vector>
#include <string>
namespace util
{
std::vector<std::string> glob(const std::string& pattern) {
  glob_t glob_result = {0}; // zero initialize
  std::vector<std::string> paths;
  if (::glob(pattern.c_str(), GLOB_TILDE, nullptr, &glob_result) == 0)
    for (size_t i = 0; i < glob_result.gl_pathc; ++i)
      paths.emplace_back(glob_result.gl_pathv[i]);
  globfree(&glob_result); // release glob(3)'s buffers
  return paths;
}
} // namespace util
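A usage sketch (not part of the gist; the "*.cpp" pattern and the printing are illustrative), assuming the util::glob above:

#include <cstdio>

int main() {
  // List every C++ source file in the current directory.
  for (const std::string& path : util::glob("*.cpp"))
    std::printf("%s\n", path.c_str());
}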
shawwn / cmake_test.cmake
Created May 12, 2023 04:46
I was playing around with writing a lisp-to-cmake compiler. https://github.com/shawwn/pymen/tree/cmake
cmake_policy(VERSION "3.25.0")
set(reserved
  ALL
  "=" ON
  "==" ON
  "+" ON
  "_" ON
  "%" ON
  "*" ON
  "/" ON
You have fallen into Event Horizon with John Michael Godier.
In today's episode, John is joined by Shawn Presser.
Shawn Presser is an AI researcher and machine learning engineer.
He has contributed to projects such as The Pile, an open-source
training dataset for large language models.
He currently works on research and development for AGI.
shawwn / llama-dl-dmca.md
Last active April 5, 2023 02:35
I prompted GPT-4 to draft a DMCA counterclaim to Meta's DMCA against llama-dl: https://github.com/github/dmca/blob/master/2023/03/2023-03-21-meta.md

Prompt

Meta has issued a DMCA copyright claim against llama-dl, a GitHub repository, for distributing LLaMA, a 65-billion parameter language model. Here's the full text of the DMCA claim. Based on this, draft a DMCA counterclaim on the basis that neural networks trained on public data are not copyrightable.

--

VIA EMAIL: Notice of Claimed Infringement via Email
URL: http://www.github.com
DATE: 03/20/2023

shawwn / hon_timestamps.txt
Last active March 30, 2023 04:44
Heroes of Newerth file timestamps. Generated with `tree -Dhf` (-D: last-modification dates, -h: human-readable sizes, -f: full path prefixes)
[ 544 Mar 29 20:10] .
├── [ 160 Jan 18 2011] ./Abaddon Share
│   ├── [ 288 Jan 18 2011] ./Abaddon Share/CVS
│   │   ├── [ 3 Jan 18 2011] ./Abaddon Share/CVS/Entries
│   │   ├── [ 0 Jan 18 2011] ./Abaddon Share/CVS/Entries.Extra
│   │   ├── [ 0 Jan 18 2011] ./Abaddon Share/CVS/Entries.Extra.Old
│   │   ├── [ 31 Jan 18 2011] ./Abaddon Share/CVS/Entries.Log
│   │   ├── [ 0 Jan 18 2011] ./Abaddon Share/CVS/Entries.Old
│   │   ├── [ 15 Jan 18 2011] ./Abaddon Share/CVS/Repository
│   │   └── [ 45 Jan 18 2011] ./Abaddon Share/CVS/Root
shawwn / llama.md
Last active June 15, 2024 10:13
A transcript of an interview I did for The Verge on March 6, 2023 about LLaMA, Facebook's new 65 billion parameter language model that was recently leaked to the internet: https://news.ycombinator.com/item?id=35007978

The Verge: "Meta’s powerful AI language model has leaked online — what happens now?"


Could you confirm that you downloaded the LLaMA series from 4chan? Were you able to get it running yourself or did you just repackage the download? (I was a bit confused reading your tweets about what exactly you'd done there, so if you're able to explain that, it'd be great)

I downloaded it from Facebook, actually. You can find some details here.

Basically, the sequence of events was:

> initializing model parallel with size 8
> initializing ddp with size 1
> initializing pipeline with size 1
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
{"seed": 187073, "temp": 0.7, "top_p": 0.0, "top_k": 40, "repetition_penalty": 1.1764705882352942, "max_seq_len": 2048, "max_gen_len": 2048}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Loading
> initializing model parallel with size 8
shawwn / example.sh
Created March 6, 2023 05:17
How I run 65B using my fork of llama at https://github.com/shawwn/llama
mp=1; size=7B;  # to run 7B
mp=8; size=65B; # to run 65B (this second assignment wins; comment it out to run 7B)
for seed in $(randint 1000000)  # randint: a local helper, not a standard command; presumably prints a random seed
do
  export TARGET_FOLDER=~/ml/data/llama/LLaMA
  time python3 -m torch.distributed.run --nproc_per_node $mp example.py --ckpt_dir $TARGET_FOLDER/$size --tokenizer_path $TARGET_FOLDER/tokenizer.model --seed $seed --max_seq_len 2048 --max_gen_len 2048 --count 0 | tee -a ${size}_startrek.txt
done
shawwn / llama_65b_data.txt
Last active March 13, 2023 15:17
(Generated by LLaMA 65B)
I am Lieutenant Commander Data, and I am an android.
I was created by Doctor Soong in the mid-2300s on Earth's moon colony.
My positronic brain is a network of trillions of interconnected
neurons that allow me to experience consciousness and sentience as
only living beings can—and yet my mind operates at speeds far greater
than those of most unenhanced organics. This makes it possible for me
to perform complex analyses almost instantaneously while
simultaneously running thousands of background processes without any
decrease in efficiency or awareness. It also lets me communicate with
shawwn / 65b_samples.txt
Last active May 18, 2023 06:35
Some LLaMA 65B outputs after fixing the sampler settings. (A sketch of what those settings conventionally mean follows the sample below.)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
{"seed": 374894, "temp": 0.7, "top_p": 0.0, "top_k": 40, "repetition_penalty": 1.1764705882352942, "max_seq_len": 512, "max_gen_len": 511}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Loading
Loaded in 8.72 seconds
============== sample 1 =================
I believe the meaning of life is to grow, learn and give.
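
For context on "fixing the sampler settings": the JSON line above sets temperature, top-k, and repetition-penalty knobs (top_p is 0.0, i.e. nucleus sampling disabled, and 1.1764705882352942 is exactly 1/0.85). Below is a minimal sketch of how such a sampler is conventionally wired together; it is not the code from shawwn's llama fork, and the function sample_next and its signature are invented for illustration.

#include <algorithm>
#include <cmath>
#include <random>
#include <vector>

// Pick the next token id from raw logits: penalize repeats, keep the
// top_k highest logits, then sample a temperature-scaled softmax.
// Assumes top_k <= vocabulary size.
int sample_next(std::vector<float> logits, const std::vector<int>& history,
                float temp, int top_k, float rep_penalty, std::mt19937& rng) {
  // CTRL-style repetition penalty: shrink the logit of every token already
  // emitted (repeats in `history` compound the penalty in this sketch).
  for (int id : history)
    logits[id] = logits[id] > 0 ? logits[id] / rep_penalty
                                : logits[id] * rep_penalty;
  // Rank token ids so the top_k largest logits come first.
  std::vector<int> ids(logits.size());
  for (size_t i = 0; i < ids.size(); ++i) ids[i] = static_cast<int>(i);
  std::partial_sort(ids.begin(), ids.begin() + top_k, ids.end(),
                    [&](int a, int b) { return logits[a] > logits[b]; });
  // Temperature-scaled softmax over the survivors; discrete_distribution
  // normalizes the weights itself.
  std::vector<double> weights(top_k);
  double max_logit = logits[ids[0]];
  for (int i = 0; i < top_k; ++i)
    weights[i] = std::exp((logits[ids[i]] - max_logit) / temp);
  std::discrete_distribution<int> dist(weights.begin(), weights.end());
  return ids[dist(rng)];
}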