
Break Yang breakds

@veekaybee
veekaybee / normcore-llm.md
Last active May 6, 2024 16:10
Normcore LLM Reads

Anti-hype LLM reading list

Goals: Add links that are reasonable and good explanations of how stuff works. No hype and no vendor content if possible. Practical first-hand accounts of models in prod eagerly sought.

Foundational Concepts


Pre-Transformer Models

Reinforcement Learning for Language Models

Yoav Goldberg, April 2023.

Why RL?

With the release of the ChatGPT model and follow-up large language models (LLMs), there was a lot of discussion of the importance of "RLHF training", that is, "reinforcement learning from human feedback". I was puzzled for a while as to why RL (Reinforcement Learning) is better than learning from demonstrations (a.k.a. supervised learning) for training language models. Shouldn't learning from demonstrations (or, in language model terminology, "instruction fine tuning", learning to imitate human-written answers) be sufficient? I came up with a theoretical argument that was somewhat convincing. But I came to realize there is an additional argument which not only supports the case for RL training, but also requires it, in particular for models like ChatGPT. This additional argument is spelled out in (the first half of) a talk by John Schulman from OpenAI. This post pretty much
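
As a hedged aside that is not part of the post itself: the objective usually meant by "RLHF training" for ChatGPT-style models is the KL-regularized reward maximization

```latex
\max_{\pi_\theta}\;
\mathbb{E}_{x \sim \mathcal{D},\; y \sim \pi_\theta(\cdot \mid x)}
\bigl[\, r_\phi(x, y) \,\bigr]
\;-\; \beta\, \mathrm{KL}\!\bigl(\pi_\theta(\cdot \mid x) \,\big\|\, \pi_{\mathrm{SFT}}(\cdot \mid x)\bigr)
```

where $r_\phi$ is a reward model trained on human preference comparisons, $\pi_{\mathrm{SFT}}$ is the instruction-fine-tuned (demonstration-trained) policy, and $\beta$ controls how far the RL-trained policy may drift from the demonstrations.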

@rain-1
rain-1 / LLM.md
Last active May 7, 2024 13:50
LLM Introduction: Learn Language Models

Purpose

Bootstrap knowledge of LLMs ASAP. With a bias/focus toward GPT.

Avoid being a link dump. Try to provide only valuable, well-tuned information.

Prelude

Neural network links before starting with transformers.

@yoavg
yoavg / LLMs.md
Last active February 17, 2024 18:39

Some remarks on Large Language Models

Yoav Goldberg, January 2023

Audience: I assume you have heard of chatGPT, maybe played with it a little, and were impressed by it (or tried very hard not to be). And that you have also heard that it is "a large language model". And maybe that it "solved natural language understanding". Here is a short personal perspective of my thoughts on this (and similar) models, and where we stand with respect to language understanding.

Intro

Around 2014-2017, right at the rise of neural-network-based methods for NLP, I was giving a semi-academic-semi-popsci lecture, revolving around the story that achieving perfect language modeling is equivalent to being as intelligent as a human. Somewhere around the same time I was also asked in an academic panel "what would you do if you were given infinite compute and no need to worry about labour costs", to which I cockily responded "I would train a really huge language model, just to show that it doesn't solve everything!". We

@navjack
navjack / 1gpt-j 8bit readme.md
Last active June 10, 2023 14:10
Local GPT-J 8-Bit on WSL 2


This should work on GPUs with as little as 8GB of RAM, but in practice I've seen usage go up to 9-10GB.

I have only personally tested this to be functional in WSL 2 and Windows 11's latest Dev preview build. Attempts to run natively in Windows didn't work, but I won't stop you from trying.

I have personally backed up any remote files that could possibly one day be lost to time. I can provide those if needed.

Now, why is this neat? Why is this cool?

@angerman
angerman / nixos-on-rockpi4.org
Created May 28, 2020 13:53
NixOS on RockPi4

Installing NixOS on the RockPi4 (B)

The general plan is to build an sd-image-aarch64 from nixpkgs/nixos/modules/installer/cd-dvd/sd-image-aarch64.nix, flash it to the eMMC, and have the system come up, similar to how this “just works” for Raspberry Pis.
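
A rough sketch of that plan in commands (these are assumptions based on the standard sd-image workflow, not taken from the article; building the aarch64 image on an x86_64 machine additionally needs an aarch64 builder or binfmt emulation):

```bash
# Build the generic aarch64 SD image from nixpkgs
nix-build '<nixpkgs/nixos>' \
  -A config.system.build.sdImage \
  -I nixos-config='<nixpkgs/nixos/modules/installer/cd-dvd/sd-image-aarch64.nix>'

# Flash it to the eMMC (replace /dev/sdX with the eMMC device, e.g. exposed via a USB adapter;
# decompress the image first if your nixpkgs revision produces a compressed one)
sudo dd if=result/sd-image/*.img of=/dev/sdX bs=4M status=progress conv=fsync
```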

The RockPi 4 is a Rockchip RK3399-based board, built by Radxa with the same form factor as a Raspberry Pi. One noticeable difference is that the Rock Pi’s CPU is at the bottom to better allow for the

@lisawolderiksen
lisawolderiksen / git-commit-template.md
Last active April 22, 2024 13:01
Use a Git commit message template to write better commit messages

Using Git Commit Message Templates to Write Better Commit Messages

The always enthusiastic and knowledgeable Mr. @jasaltvik shared with our team an article on writing (good) Git commit messages: How to Write a Git Commit Message. This excellent article explains why good Git commit messages are important, and what constitutes a good commit message. I wholeheartedly agree with what @cbeams writes in his article. (Have you read it yet? If not, go read it now. I'll wait.) It's sensible stuff. So I decided to start following the
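
For reference, a minimal sketch of the template setup this approach builds toward (the file name and template contents here are illustrative assumptions, not copied from the gist):

```bash
# Write a template file; git will pre-fill the commit editor with it.
cat > ~/.gitmessage <<'EOF'
# Subject line: imperative mood, capitalized, no trailing period, <= 50 chars

# Body: wrap at 72 chars; explain what changed and why, not how.
EOF

# Tell git to use the template for every `git commit` run without -m.
git config --global commit.template ~/.gitmessage
```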

@CarlosDomingues
CarlosDomingues / python-poetry-cheatsheet.md
Last active May 8, 2024 01:33
Python Poetry Cheatsheet

Create a new project

poetry new <project-name>

Add a new lib

poetry add <library>

Remove a lib
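
poetry remove <library>

As a hedged sketch of how these commands fit together in practice (project and package names are placeholders, not from the cheatsheet):

```bash
poetry new demo-project      # scaffold pyproject.toml plus a package skeleton
cd demo-project
poetry add requests          # add a dependency and update the lock file
poetry install               # install everything declared in pyproject.toml
poetry run python -c "import requests; print(requests.__version__)"   # run inside the managed env
poetry remove requests       # drop the dependency again
```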

@WesThorburn
WesThorburn / 1.Instructions.md
Last active March 14, 2024 22:11
Linux: Compile C++ to WebAssembly and JavaScript using Emscripten and CMake

Linux: Compile C++ to WebAssembly and JavaScript using Emscripten and CMake

Download and Install Emscripten

  • My preferred installation location is /home/user
  • Get the latest sdk: git clone https://github.com/emscripten-core/emsdk.git
  • Enter the cloned directory: cd emsdk
  • Checkout main: git checkout main
  • Install the latest sdk tools: ./emsdk install latest
  • Activate the latest sdk tools: ./emsdk activate latest
  • Activate path variables: source ./emsdk_env.sh
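
Once the environment is activated, a hedged smoke-test sketch (file names are illustrative, not from the gist):

```bash
# Compile a trivial C++ file to WebAssembly plus a JS loader, then run it under node.
cat > hello.cpp <<'EOF'
#include <iostream>
int main() { std::cout << "hello from wasm" << std::endl; }
EOF
em++ hello.cpp -o hello.js
node hello.js

# For a CMake project, wrap the configure step so the Emscripten toolchain is used:
#   emcmake cmake -S . -B build && cmake --build build
```
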
@danbst
danbst / iphone.nix
Created September 8, 2018 20:05
iPhone pairing for NixOS
# First add this module to your /etc/nixos/configuration.nix
# ...
# imports = [ /path/to/iphone.nix ];
# iphone.enable = true;
# iphone.user = "yourusername";
# ...
# Then rebuild system. Attach iPhone via cable, open terminal and run command `iphone`
# It will fail, but a dialog will appear on your iPhone asking to "trust this computer".
# Press OK there and run `iphone` again. If it succeeds, it will open a freshly mounted folder