@rain-1
rain-1 / ring.md
Created June 26, 2022 18:45
Ring Quotient For Programmers

quick recap on complex numbers

Take a number and square it: the result is non-negative, because positive × positive is positive, negative × negative is positive, and $0^2 = 0$.

But someone wanted to take square roots of negative numbers, so they did, and called the result $i$: $\sqrt{-1} = i$. A lot of people are frustrated when they first learn this: "You can't do that!", "How do you know it doesn't lead to contradictions?"

The solution that puts imaginary and complex numbers on a solid foundation is something called a ring quotient. You start with the ring (meaning number system) of polynomials over the real numbers, $R[i]$, whose elements look like this (a small sketch follows the list below):

  • $1, 2, 3.5, \pi$ etc.
  • $i, i^2, 0.3 + 9.5 i + 23 i^3$ and so on.
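
As a hedged illustration (my own code, not the gist's): in the quotient $R[i]/(i^2 + 1)$ every polynomial in $i$ collapses to a real part plus an imaginary part, because $i^2$ is declared equal to $-1$. The list representation and function names below are assumptions for the sketch.

```python
# Minimal sketch of the ring quotient R[i]/(i^2 + 1).
# A polynomial in i is a list of real coefficients: [c0, c1, c2, ...] means c0 + c1*i + c2*i^2 + ...

def poly_mul(p, q):
    """Multiply two polynomials in i (no reduction yet)."""
    out = [0.0] * (len(p) + len(q) - 1)
    for a, ca in enumerate(p):
        for b, cb in enumerate(q):
            out[a + b] += ca * cb
    return out

def reduce_mod(p):
    """Reduce modulo i^2 + 1 by repeatedly replacing i^2 with -1; returns [real, imaginary]."""
    lo = list(p[:2]) + [0.0] * (2 - len(p))
    for k, c in enumerate(p[2:]):            # c is the coefficient of i^(k+2)
        lo[k % 2] += c * (-1) ** (k // 2 + 1)
    return lo

print(reduce_mod([0.3, 9.5, 0.0, 23.0]))     # 0.3 + 9.5i + 23i^3 = 0.3 - 13.5i -> [0.3, -13.5]
print(reduce_mod(poly_mul([1, 1], [1, 1])))  # (1 + i)^2 = 2i -> [0, 2]
```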
@rain-1
rain-1 / a_How is a matrix used to count fish?.md
Last active November 2, 2023 19:58
How is a matrix used to count fish?

This explains material relevant to Advent of Code 2021, day 6.

How is a matrix used to count fish?

First, let's do Fibonacci numbers, because the matrix is smaller (2x2 instead of 9x9) and it's familiar ground.

So you can implement fibs like this:

def fib(n):
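
The gist's code is cut off in this listing. As a hedged sketch of the idea it is building toward (my own code, not the gist's): F(n) can be read off from the nth power of the 2x2 matrix [[1, 1], [1, 0]], computed by repeated squaring.

```python
# Hedged sketch: Fibonacci numbers from powers of a 2x2 matrix.

def mat_mul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[a[0][0] * b[0][0] + a[0][1] * b[1][0], a[0][0] * b[0][1] + a[0][1] * b[1][1]],
            [a[1][0] * b[0][0] + a[1][1] * b[1][0], a[1][0] * b[0][1] + a[1][1] * b[1][1]]]

def fib_matrix(n):
    """[[1,1],[1,0]]^n = [[F(n+1), F(n)], [F(n), F(n-1)]], so the top-right entry is F(n)."""
    result = [[1, 0], [0, 1]]                # identity matrix
    base = [[1, 1], [1, 0]]
    while n:                                  # exponentiation by squaring
        if n & 1:
            result = mat_mul(result, base)
        base = mat_mul(base, base)
        n >>= 1
    return result[0][1]

print([fib_matrix(n) for n in range(10)])    # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

The 9x9 fish matrix in the gist plays the same role, just with more states.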
@rain-1
rain-1 / dcs.rkt
Last active September 22, 2023 21:13
Dotted Canonical S-expressions - DCSexps
#lang racket
;; printing s-exps as DCS and TDCS, plus examples of what DCS and TDCS look like
(define (dcs l)
  (cond ((pair? l)
         (begin
           (display ".")
           (dcs (car l))
           (dcs (cdr l))))
        (else (display l))))   ; assumed atom case; the listing truncates the gist here
;; e.g. with that assumption, (dcs '(1 2)) prints .1.2()
@rain-1
rain-1 / boot.S
Created September 30, 2018 10:16
GNU assembly bootloader
.code16                 # 16-bit real-mode code
.global _start
_start:
    cli                 # disable interrupts
    xor %ax, %ax        # zero AX
    mov %ax, %ds        # data segment = 0
    mov $msg, %si       # SI points at the message string
    cld                 # clear direction flag so lodsb increments SI
loop:
    lodsb               # load the byte at DS:SI into AL and advance SI
@rain-1
rain-1 / Prompt Injection and AutoGPT.md
Last active September 11, 2023 11:12
Prompt Injection and AutoGPT

Does prompt injection matter to AutoGPT?

Executive summary: If you use AutoGPT, you need to be aware of prompt injection. This is a serious problem that can cause your AutoGPT agent to perform unexpected and unwanted tasks. Unfortunately, there isn't a perfect solution to this problem available yet.

Prompt injection can derail agents

If you set up an AutoGPT agent to perform task A, a prompt injection could 'derail' it into performing task B instead. Task B could be anything, even something unwanted like deleting your personal files or sending all your bitcoins to some crook's wallet.

Docker helps limit the file system access that agents have, and measures like this are extremely useful, but it's important to note that the agent can still be derailed.

@rain-1
rain-1 / Modelling an Uncertain World.md
Last active August 19, 2023 06:16
Modelling an Uncertain World

I have included working code examples that can be run throughout, as well as graphs. I hope this makes the material easier to understand in a hands-on way.

The setup

Suppose you know that there are 10 balls in an urn, each of which is either red or blue. So there are 11 different possible models for this situation (a Bayesian-update sketch follows the list):

  • M0: 0 red, 10 blue
  • M1: 1 red, 9 blue
  • ...
  • M10: 10 red, 0 blue
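
As a hedged sketch of what updating beliefs over these models can look like (my own code and example draws, not the gist's; it assumes draws with replacement and a uniform prior over M0..M10):

```python
# Hedged sketch: Bayesian update over the 11 urn models M0..M10.
# Assumptions: balls are drawn with replacement and every model starts equally likely.

def update(prior, observation):
    """Weight each model's prior by the likelihood of the observation, then renormalize."""
    likelihoods = [r / 10 if observation == "red" else (10 - r) / 10 for r in range(11)]
    unnormalized = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

beliefs = [1 / 11] * 11                      # uniform prior over M0..M10
for draw in ["red", "red", "blue"]:          # a hypothetical sequence of observed draws
    beliefs = update(beliefs, draw)

for r, p in enumerate(beliefs):
    print(f"M{r}: {r} red, {10 - r} blue -> P = {p:.3f}")
```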

Google translate

in:

Give a critique that attempts to refute Searle's chinese room. Include something from derrida in your response.

out:

给出一个试图反驳塞尔中文房间的批评。在您的回复中包含德里达的内容。

(English gloss: "Give a critique that attempts to refute Searle's Chinese room. Include content from Derrida in your reply.")

@rain-1
rain-1 / bard-lstm.md
Last active May 7, 2023 22:09
bard-lstm.md

This is a report on my experience pair programming with Bard on a neural network task that challenged it to its current limits.

Bard now has the ability to program, or, put another way, Google has removed the gating that blocked it from trying.

All the code in this article is basically 99% produced by Bard. I either prompted it to refactor things or tweaked one or two lines out of every 100.

Note: I used gpt-4 a little bit too, for the training part, but this is mostly Bard.

XOR

@rain-1
rain-1 / 0-MNIST.md
Last active March 25, 2023 03:41
MNIST digit classification

MNIST digit recognition

The PyTorch (neural network library) examples include a script that runs the training process on the MNIST digit recognition data set: https://github.com/pytorch/examples/tree/main/mnist

This builds a convolutional neural network that takes one of these pictures and processes it down to 10 output neurons. The training process uses two sets of labelled data (examples of pictures of digits and which of the 10 possible digits they are): a training set and a testing set. The training set is used to adjust all of the "weights" inside the neural network by moving in the (very high dimensional) direction of steepest descent, aiming to get the output neurons to produce the intended label for each input picture. The testing set is used as a metric of how well the neural network is doing.

I ran this, creating mnist_cnn.pt with 99% accuracy on the test data set.

Then I wanted to see if it worked, so I drew images of all 10 digits. There was no way to try this out so I wrote the attach
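
The attached script itself isn't shown in this listing. As a hedged sketch of that kind of check (assumptions: the example's Net class can be imported from its main.py, mnist_cnn.pt holds that model's state_dict, and digit.png is a hand-drawn digit; none of this code is from the gist):

```python
# Hedged sketch: classify one hand-drawn digit with the saved MNIST model.
import torch
from PIL import Image
from torchvision import transforms

from main import Net  # assumption: the model class from the PyTorch example's main.py

model = Net()
model.load_state_dict(torch.load("mnist_cnn.pt", map_location="cpu"))
model.eval()

# MNIST digits are white-on-black 28x28 images, normalized with the example's constants.
preprocess = transforms.Compose([
    transforms.Grayscale(),
    transforms.Resize((28, 28)),
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,)),
])

img = preprocess(Image.open("digit.png")).unsqueeze(0)   # shape: [1, 1, 28, 28]
with torch.no_grad():
    print("predicted digit:", model(img).argmax(dim=1).item())
```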

@rain-1
rain-1 / best fit blog.md
Last active March 6, 2023 01:13
least squares

Introduction

This blog post is about the linear least squares problem. The method is credited to Legendre and Gauss, two of my favorite mathematicians. Why are they such inspiring people? Here is a passage from a post that goes into more depth about Gauss's application of least squares:

The 24-year-old Gauss tackled the orbit problem, assuming only Kepler’s three laws of planetary motion, with his newly discovered error distributions and his method of least squares for three months. He spent over 100 hours performing intensive calculations by hand without any mistakes (and without the luxury of today’s computers!). He had to estimate the six parameters of the orbit (as shown in Figure 7) from only 19 data points, subject to random measurement errors. He even invented new techniques such as the Fast Fourier Transform for interpolating trigonometric series, which produced efficient numerical approximations o
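
As a hedged illustration of the method itself (my own toy data and code, not the post's): fitting a line y ≈ mx + c by minimizing the sum of squared residuals.

```python
# Hedged sketch: linear least squares fit of a line with NumPy.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])            # hypothetical noisy measurements

A = np.column_stack([x, np.ones_like(x)])           # design matrix: columns [x, 1]
(m, c), *_ = np.linalg.lstsq(A, y, rcond=None)      # minimizes ||A @ [m, c] - y||^2
print(f"best fit: y = {m:.3f} x + {c:.3f}")
```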