Christopher Bonnett (cbonnett)


Learning LLMs in 2025

So you know how the transformer works, you know basic ML/DL, and you want to learn more about LLMs. One way to go is to look into the various "algorithmic" topics (optimization algorithms, RL, DPO, etc.); there is lots of material on that. But the interesting stuff is (in my opinion, at least) not there.

This is an attempt to collect a list of academic (or academic-like) materials that explore LLMs from other directions, and focus on the non-ML-algorithmic aspects.

Courses

  • David Chiang's Theory of Neural Networks course.
      • Not primarily about LLMs, but it has a substantial section on Transformers. Formal/theory; more of a book than a course.
@veekaybee
veekaybee / normcore-llm.md
Last active October 22, 2025 08:37
Normcore LLM Reads

Anti-hype LLM reading list

Goals: Add links that are reasonable and good explanations of how stuff works. No hype and no vendor content if possible. Practical first-hand accounts of models in prod eagerly sought.

Foundational Concepts


Pre-Transformer Models

@AustinRochford
AustinRochford / pydata_dc_2016_vi_in_python.ipynb
Last active June 17, 2024 07:12
PyData DC 2016 Variational Inference in Python
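The notebook preview above failed to render. As a rough placeholder, here is a minimal sketch of variational inference in Python with PyMC3's ADVI (assuming PyMC3 >= 3.7); the toy model, variable names, and settings are illustrative and not taken from the notebook.

import numpy as np
import pymc3 as pm

# Toy data: a linear relationship with Gaussian noise.
np.random.seed(0)
x = np.random.randn(200)
y = 0.5 + 1.5 * x + 0.3 * np.random.randn(200)

with pm.Model():
    intercept = pm.Normal("intercept", mu=0.0, sigma=10.0)
    slope = pm.Normal("slope", mu=0.0, sigma=10.0)
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    pm.Normal("obs", mu=intercept + slope * x, sigma=sigma, observed=y)

    # Mean-field ADVI: fit a factorized Gaussian approximation to the
    # posterior by maximizing the ELBO, then draw samples from it.
    approx = pm.fit(n=20000, method="advi")
    trace = approx.sample(1000)

Compared with MCMC (pm.sample), ADVI trades some posterior accuracy for speed, which is the usual motivation for variational inference on larger models.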
@AustinRochford
AustinRochford / GP.iypnb
Created September 25, 2016 02:28
Bayesian GP PyMC3
{
  "cells": [
    {
      "cell_type": "code",
      "execution_count": 1,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
@fperez
fperez / SimpleNeuralNets.ipynb
Last active April 19, 2022 18:52
Notes for "Why does deep and cheap learning work so well?" (ArXiv:1608.08225v1/cond-mat.dis-nn) by Lin and Tegmark.
# Keras 1.x-era imports: the lowercase `merge` function and the optimizer
# aliases `rmsprop`/`adam` were removed in Keras 2.
import matplotlib.pyplot as plt
import numpy as np
import seaborn
from keras.layers import Input, Dense, merge, ELU, Dropout
from keras.models import Model
from keras.regularizers import l2
from keras import backend as K
from keras.optimizers import rmsprop, adam
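Using the imports above, a minimal sketch of how these pieces typically fit together in the Keras 1.x functional API (layer sizes, regularization strength, and loss are illustrative assumptions, not from the notebook):

inputs = Input(shape=(64,))
h = Dense(128, W_regularizer=l2(1e-4))(inputs)   # `kernel_regularizer=` in Keras 2
h = ELU()(h)
h = Dropout(0.5)(h)
outputs = Dense(1)(h)
model = Model(input=inputs, output=outputs)      # `inputs=`/`outputs=` in Keras 2
model.compile(optimizer=adam(lr=1e-3), loss="mse")  # `adam` is the Keras 1.x alias for Adam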
@danijar
danijar / blog_tensorflow_scope_decorator.py
Last active January 17, 2023 01:58
TensorFlow Scope Decorator
# Working example for my blog post at:
# https://danijar.github.io/structuring-your-tensorflow-models
import functools
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

def doublewrap(function):
    """
    A decorator decorator, allowing the wrapped decorator to be used
    with or without arguments; all arguments must be optional.
    """
    @functools.wraps(function)
    def decorator(*args, **kwargs):
        if len(args) == 1 and not kwargs and callable(args[0]):
            return function(args[0])  # used bare: @decorator
        return lambda wrapped: function(wrapped, *args, **kwargs)  # used as @decorator(...)
    return decorator
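The gist preview cuts off here. For context, a hedged sketch of how doublewrap is typically paired with a scope-caching property decorator in the linked blog post's pattern (the name define_scope and the details below follow that post, not the rest of this gist):

@doublewrap
def define_scope(function, scope=None, *args, **kwargs):
    # Build each model part lazily inside its own tf.variable_scope and
    # cache the result as an attribute, so the graph is constructed once.
    attribute = "_cache_" + function.__name__
    name = scope or function.__name__
    @property
    @functools.wraps(function)
    def decorator(self):
        if not hasattr(self, attribute):
            with tf.variable_scope(name, *args, **kwargs):
                setattr(self, attribute, function(self))
        return getattr(self, attribute)
    return decorator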