Normcore LLM Reads

Anti-hype LLM reading list

Goals: Add links that are reasonable, well-grounded explanations of how this stuff works. No hype and, where possible, no vendor content. Practical first-hand accounts of running models in prod are eagerly sought.

Foundational Concepts

Pre-Transformer Models

Building Blocks

Foundational Deep Learning Papers (in semi-chronological order)

The Transformer Architecture

Attention
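
For orientation before diving into the papers here, a minimal sketch of scaled dot-product attention, the operation everything in this section builds on. This is an illustrative single-head version with made-up shapes, not code from any of the linked references.

```python
# A minimal, illustrative sketch of scaled dot-product attention
# (single head, no masking) using plain NumPy.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (seq_len, seq_len) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                # weighted sum of values

# Toy usage: 4 tokens, 8-dimensional queries/keys/values, self-attention
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```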

GPT

Significant OSS Models

LLMs in 2023

Training Data

Pre-Training

RLHF and DPO
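
For quick reference, the DPO objective that several readings in this section discuss, in the usual notation where $y_w$ and $y_l$ are the preferred and dispreferred completions and $\pi_{\text{ref}}$ is the frozen reference policy:

$$
\mathcal{L}_{\text{DPO}}(\pi_\theta; \pi_{\text{ref}}) = -\,\mathbb{E}_{(x,\, y_w,\, y_l)\sim \mathcal{D}}\left[\log \sigma\!\left(\beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\text{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\text{ref}}(y_l \mid x)}\right)\right]
$$

Minimizing this pushes the policy to put more probability on preferred completions relative to the reference, without training a separate reward model as classic RLHF does.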

Fine-Tuning and Compression
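
To make the fine-tuning side of this section concrete, a rough sketch of the LoRA idea: keep the pretrained weight frozen and learn a low-rank additive update. The dimensions, rank, and alpha scaling below are illustrative assumptions, not taken from any specific implementation.

```python
# Illustrative LoRA-style forward pass: the pretrained weight W stays frozen,
# and only the low-rank factors A (d_in x r) and B (r x d_out) are trained.
import numpy as np

d_in, d_out, r, alpha = 512, 512, 8, 16
rng = np.random.default_rng(0)

W = rng.normal(size=(d_in, d_out))       # frozen pretrained weight
A = rng.normal(size=(d_in, r)) * 0.01    # trainable low-rank factor
B = np.zeros((r, d_out))                 # zero-initialized so the update starts at 0

def lora_forward(x):
    # Base projection plus the low-rank correction, scaled by alpha / r.
    return x @ W + (x @ A @ B) * (alpha / r)

x = rng.normal(size=(2, d_in))
print(lora_forward(x).shape)  # (2, 512)
```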

Small and Local LLMs

Deployment and Production

LLM Inference and K-V Cache
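
A rough sketch of why the K-V cache matters for autoregressive decoding: each step computes only the new token's key and value, appends them to the cache, and attends over the cached history instead of recomputing it. Shapes and projection matrices here are made up for illustration.

```python
# Illustrative K-V cache for single-head autoregressive decoding with NumPy.
import numpy as np

d_model = 16
rng = np.random.default_rng(0)
W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))

k_cache, v_cache = [], []   # grows by one row per generated token

def decode_step(x_new):
    """x_new: (d_model,) embedding of the newly generated token."""
    q = x_new @ W_q
    k_cache.append(x_new @ W_k)      # only the new token's K/V are computed...
    v_cache.append(x_new @ W_v)
    K = np.stack(k_cache)            # ...and attention runs over the cached history
    V = np.stack(v_cache)
    scores = K @ q / np.sqrt(d_model)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ V               # attention output for this step

for _ in range(5):                   # pretend we generate 5 tokens
    out = decode_step(rng.normal(size=d_model))
print(out.shape, len(k_cache))       # (16,) 5
```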

Prompt Engineering and RAG
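
A toy end-to-end sketch of the RAG pattern covered in this section: embed documents, retrieve the ones most similar to the query, and splice them into the prompt. The bag-of-words "embedding" is a stand-in for a real embedding model, and the prompt template is an illustrative assumption.

```python
# Toy RAG pipeline: retrieve the closest documents and stuff them into a prompt.
# The word-count "embedding" stands in for a real embedding model.
import numpy as np
from collections import Counter

docs = [
    "The KV cache stores keys and values from previous tokens.",
    "LoRA fine-tunes a model by learning low-rank weight updates.",
    "RAG retrieves documents and adds them to the prompt as context.",
]

vocab = sorted({w.lower().strip(".,?") for d in docs for w in d.split()})

def embed(text):
    counts = Counter(w.lower().strip(".,?") for w in text.split())
    v = np.array([counts[w] for w in vocab], dtype=float)
    norm = np.linalg.norm(v)
    return v / norm if norm else v

doc_vecs = np.stack([embed(d) for d in docs])

def retrieve(query, k=1):
    sims = doc_vecs @ embed(query)               # cosine similarity (vectors are normalized)
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]

query = "How does the KV cache work?"
context = "\n".join(retrieve(query, k=2))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}\nAnswer:"
print(prompt)   # this prompt would then be sent to the LLM
```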

GPUs

Evaluation

Eval Frameworks

UX

What's Next?

Thanks to everyone who added suggestions on Twitter, Mastodon, and Bluesky.
