💭
I'm combinating Why's.

Nikolaj Kuntner (Nikolaj-K)

@Nikolaj-K
Nikolaj-K / nidlatam.txt
Created July 14, 2024 12:05
This file has been truncated.
Mappings out of V_6
input cardinality = 0
(#1) [] ↦ []
input cardinality = 1
(#2) [[]] ↦ [[[]]]
(#3) [[[]]] ↦ [[]]
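As a hedged illustration (not the gist's own code, and assuming its nested-bracket notation [] denotes hereditarily finite sets), the finite von Neumann stages V_0, V_1, ... can be built with frozensets:

```python
# Sketch: V_0 = {}, V_{n+1} = powerset(V_n), modeled with frozensets.
from itertools import combinations

def powerset(s):
    """All subsets of the frozenset s, each again a frozenset."""
    items = list(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

def V(n):
    """The n-th cumulative stage of hereditarily finite sets."""
    stage = frozenset()
    for _ in range(n):
        stage = frozenset(powerset(stage))
    return stage

print([len(V(n)) for n in range(5)])  # → [0, 1, 2, 4, 16]
```

The cardinalities iterate 2^k, so |V_6| = 2^65536, which is why the gist's enumeration of mappings is organized by input cardinality.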
Nikolaj-K / smol_diffusion.md
Created July 6, 2024 18:49
Pitching the smalldiffusion library and paper

Links to the pages shown in the video:

https://youtu.be/Q_c0n1d5x3I

########## ########## ########## ##########

  • Paper:

Interpreting and Improving Diffusion Models from an Optimization Perspective >

Nikolaj-K / requirements.txt
Last active June 29, 2024 11:57
Fairly minimal requirements.txt for the smalldiffusion package
## Install hints for
## https://github.com/yuanchenyang/smalldiffusion/
## as of July 2024. Produced via `pip freeze > requirements.txt`
## You may `pip install -r requirements.txt` this file to reproduce a setup that worked for me.
## Small discussion at
## https://github.com/yuanchenyang/smalldiffusion/issues/1
# accelerate==0.31.0
# appnope==0.1.4
# asttokens==2.4.1
Nikolaj-K / softmax_derivative.tex
Last active June 2, 2024 17:04
SoftMax: On derivations of its derivative, ∂σ/∂x
Script used in the video:
https://youtu.be/yx2xc9oHvkY
This video was a reaction to derivations such as:
re: https://community.deeplearning.ai/t/calculating-gradient-of-softmax-function/1897/3
----
For general $s\colon{\mathbb R}\to{\mathbb R}$, define the scaled vector ${\vec x}^s$: $i\mapsto \dfrac{s(x_i)}{\sum_{k=1}^n s(x_k)}$
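For the special case $s = \exp$ this is the softmax $\sigma$, whose Jacobian is $\partial\sigma_i/\partial x_j = \sigma_i(\delta_{ij} - \sigma_j)$. A hedged sketch (not the gist's own script) checking that formula against finite differences:

```python
# Softmax Jacobian ∂σ_i/∂x_j = σ_i (δ_ij − σ_j), verified numerically.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())  # shift for numerical stability
    return e / e.sum()

def softmax_jacobian(x):
    s = softmax(x)
    return np.diag(s) - np.outer(s, s)

x = np.array([0.5, -1.0, 2.0])
J = softmax_jacobian(x)

# central finite-difference check
eps = 1e-6
J_fd = np.empty((3, 3))
for j in range(3):
    d = np.zeros(3); d[j] = eps
    J_fd[:, j] = (softmax(x + d) - softmax(x - d)) / (2 * eps)
print(np.abs(J - J_fd).max() < 1e-8)  # → True
```

Note the Jacobian is symmetric, since both $\mathrm{diag}(\sigma)$ and the outer product $\sigma\sigma^T$ are.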
Nikolaj-K / diffusion_geometry.py
Last active May 29, 2024 17:22
Diffusion Geometry
Video discussing and pointing to the text 'Diffusion Geometry' (May 2024, 49 pages)
by Iolo Jones (Durham University)
https://www.youtube.com/watch?v=f2GJG7vMSZI
#### Links
* Paper:
https://arxiv.org/abs/2405.10858
https://arxiv.org/pdf/2405.10858
Nikolaj-K / Consequentia_Mirabilis.txt
Last active April 13, 2024 11:24
Peirce's law, Consequentia Mirabilis, ¬¬-Elimination and LEM without Explosion
Proofs discussed in the video:
https://youtu.be/h1Ikhh3J1vY
Legend and conventions:
$(P \to \bot) \lor P\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,$ ... $PEM$ ... Principle of excluded middle (a.k.a. $LEM$)
$((P \to \bot) \land P) \to Q\,\,\,$ ... $EXPL$ ... principle of explosion, a.k.a. ex falso
$((P \to Q) \to P) \to P$ ... $PP$ ... Peirce's principle
$((P \to \bot) \to P) \to P$ ... $CM$ ... consequentia mirabilis, a.k.a. Clavius's principle
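As a hedged illustration (not from the gist itself): $CM$ is just $PP$ with $Q$ instantiated to $\bot$, which can be checked mechanically, e.g. in Lean 4:

```lean
-- Peirce's principle, assumed for all propositions, specializes to
-- consequentia mirabilis by taking Q := False.
example (peirce : ∀ P Q : Prop, ((P → Q) → P) → P)
    (P : Prop) : ((P → False) → P) → P :=
  peirce P False
```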
Nikolaj-K / subcountability.tex
Last active April 8, 2024 00:04
ℕ surjects onto ℝ. Subsets of ℕ surject onto ℕ^ℕ. ℕ^ℕ injects into ℕ. All that.
Shownotes to the video:
https://youtu.be/q-mjO9Uxvy0
For a related and relevant video on constructive logic basics and upshots, see
https://youtu.be/-lPrjPHElik
For a related and relevant discussion of computably enumerable sets and their complements, see this four-year-old video
https://youtu.be/Ox0tD58DTG0
For some of the non-theorems in the list it helps to understand $\Pi^0_2$-complete sets.
The referenced video on how the Axiom of Choice and Regularity each imply LEM is at
https://youtu.be/2EOW23uVcRA
Nikolaj-K / manual_sampling.py
Created March 12, 2024 22:44
Manually sampling from any given 1D histogram
"""
Script discussed in the video:
https://youtu.be/ndAmT8CYGDM
Links:
* https://en.wikipedia.org/wiki/Energy-based_model
* https://en.wikipedia.org/wiki/Brownian_motion
* https://www.extropic.ai/future
"""
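A hedged sketch of the idea (the gist's own script is not shown here): to sample from a 1D histogram, pick a bin proportionally to its count, then a uniform point within that bin.

```python
# Sampling from a given 1D histogram: categorical draw over bins,
# then uniform placement inside the chosen bin.
import numpy as np

def sample_from_histogram(counts, bin_edges, n_samples, rng=None):
    rng = rng or np.random.default_rng(0)
    probs = np.asarray(counts, dtype=float)
    probs /= probs.sum()                       # normalize counts to a pmf
    bins = rng.choice(len(probs), size=n_samples, p=probs)
    left = np.asarray(bin_edges)[bins]
    width = np.diff(bin_edges)[bins]
    return left + width * rng.random(n_samples)

counts = [1, 4, 2]                 # hypothetical histogram
edges = [0.0, 1.0, 2.0, 3.0]
samples = sample_from_histogram(counts, edges, 10_000)
print(samples.min() >= 0.0 and samples.max() <= 3.0)  # → True
```

About 4/7 of the samples land in the middle bin, matching its share of the total count.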
Nikolaj-K / run_asyncio_example.py
Created January 28, 2024 22:00
Basic async io example
"""
Script discussed in the video
https://youtu.be/RltoGfNhex4
asyncio enables interleaved sequences of function executions within one thread.
This makes sense e.g. if you want results of less intensive computations earlier (for IO), or
if some awaited computation is done in some external thread (e.g. info from a server).
"""
import asyncio
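A minimal runnable sketch in the spirit of the (truncated) gist, not its actual code: two coroutines interleave within a single thread.

```python
# Two awaited sleeps run concurrently; total wall time is roughly the
# maximum delay, not the sum.
import asyncio

async def worker(name, delay):
    await asyncio.sleep(delay)  # yields control while "waiting"
    return f"{name} done after {delay}s"

async def main():
    return await asyncio.gather(worker("a", 0.02), worker("b", 0.01))

print(asyncio.run(main()))  # → ['a done after 0.02s', 'b done after 0.01s']
```

`asyncio.gather` preserves the order of its arguments in the result list, even though worker "b" finishes first.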
Nikolaj-K / ReLU_square.py
Last active January 16, 2024 05:00
Computing all polynomials via just ReLU activations (no learning needed)
"""
Script discussed in these videos:
* Video 1:
https://youtu.be/PApGm1TKFHQ
* Video 2:
https://youtu.be/1z2GRcaKSrA
==== Text for video 1 ====
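A hedged sketch of one way to do this (an assumption about the gist's approach, not its actual code): Yarotsky's ReLU-only approximation of x² on [0, 1], from which products and hence all polynomials follow via xy = ((x+y)² − x² − y²)/2.

```python
# Approximate x**2 on [0, 1] with fixed (non-learned) ReLU combinations:
# the hat function g is an exact ReLU expression, and
# x - sum_{s=1..m} g∘...∘g(x) / 4**s  →  x**2 with error ≤ 4**-(m+1).
def relu(x):
    return max(x, 0.0)

def hat(x):
    # g(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1], via three ReLUs
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_via_relu(x, m=20):
    """Approximate x**2 for x in [0, 1] using m hat compositions."""
    total, h = x, x
    for s in range(1, m + 1):
        h = hat(h)               # s-fold composition of the hat
        total -= h / 4 ** s
    return total

print(abs(square_via_relu(0.3) - 0.09) < 1e-10)  # → True
```

At dyadic points the approximation becomes exact after finitely many terms, e.g. square_via_relu(0.5, m=3) equals 0.25 exactly.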