Last active March 14, 2023 04:09
The Y Combinator explained in a runnable Scheme file
;;; The Y Combinator explained in scheme.
;;; with credits to:
;;; Status: WIP
(define fibonacci
  (lambda (n)
    (cond ((= n 0) 0)
          ((= n 1) 1)
          (else (+ (fibonacci (- n 1)) (fibonacci (- n 2)))))))
(define almost-fibonacci
  (lambda (f)
    (lambda (n)
      (cond ((= n 0) 0)
            ((= n 1) 1)
            (else (+ (f (- n 1)) (f (- n 2))))))))
;; Recursive definition of factorial.
(define factorial
  (lambda (n)
    (if (= n 0)
        1
        (* n (factorial (- n 1))))))
;; But, imagine we cannot recur.
;; Say, scheme or another implementation language
;; doesn't support explicit recursion by having the
;; fn called from itself like the above.
;; e.g., lambda calculus.
;; What if we abstract out the recursion?
(define factorialB
  (lambda (f)
    (lambda (n)
      (if (= n 0)
          1
          (* n (f (- n 1)))))))
;; Let's call this almost-factorial because it can
;; almost do the job!
(define almost-factorial
  (lambda (f)
    (lambda (n)
      (if (= n 0)
          1
          (* n (f (- n 1)))))))
;; This is how we roll in functional programming and math.
;; If you don't know what to put somewhere, just wrap
;; that joint up in a "lol" ("let over lambda").
;; That way, you can pass it any function while you're
;; trying to figure it out, or you can save figuring it
;; out for someone else (or for the same someone,
;; you, sometime later) ;).
;; Now, what fn could we pass to almost-factorial to
;; get a reasonable answer for, say, 0?
;; How about id :: a -> a; id x = x
;; Basically, a type polymorphic identity fn. But,
;; scheme is dynamic, so it's automagically polywhatever.
;; Like so:
(define identity (lambda (x) x))
;; So, we'll use that useless looking function as the fn
;; that we pass to a new fn that can only work for
;; 0.
(define factorial0 (almost-factorial identity))
;; That works for 0! Now we've built up on our
;; abstraction that left out parts we didn't know.
;; factorial0 can compute some, but not all factorials.
;; Specifically, it can compute the factorials up to
;; and including the factorial of zero.
;; It won't work for n > 0. For instance, if n = 1
;; then we'll end up with:
;; ==> (* 1 (identity 0))
;; ==> (* 1 0)
;; ==> 0
;; which is not the correct answer.
;; Now, consider this next version of factorial0:
(define factorial1
  (almost-factorial factorial0))
;; Which is the same thing as:
(define sameasfactorial1
  (almost-factorial (almost-factorial identity)))
;; This will correctly compute the factorials of 0 and 1,
;; but it will be incorrect for any n > 1.
;; (factorial1 1) ==> 1
;; (factorial1 0) ==> 1
;; (factorial1 2) ==> 0 ; incorrect :(
;;==> ((almost-factorial factorial0) 1)
;;==> (((lambda (f)
;;==>     (lambda (n)
;;==>       (if (= n 0)
;;==>           1
;;==>           (* n (f (- n 1)))))) factorial0) 1)
;;==> ((lambda (n)
;;==>    (if (= n 0)
;;==>        1
;;==>        (* n (factorial0 (- n 1))))) 1)
;;==> (if (= 1 0)
;;==>     1
;;==>     (* 1 (factorial0 (- 1 1))))
;;==> (if #f
;;==>     1
;;==>     (* 1 (factorial0 (- 1 1))))
;;==> (* 1 (factorial0 (- 1 1)))
;;==> (* 1 (factorial0 0))
;;==> (* 1 1)
;;==> 1
;; So, factorial0 can compute factorial for n = 0.
;; factorial1 can compute factorial for n = 0 and n = 1.
;; factorial0 will be incorrect for n > 0.
;; factorial1 will be incorrect for n > 1.
;; We could actually throw away factorial0
;; and just use factorial1 because it can do both!
;; Except for the fact that factorial1 is defined
;; in terms of factorial0.
;; But!, factorial0 is just
(define factorial0' (almost-factorial identity))
;; So, factorial1 is really just:
(define factorial1'
  (almost-factorial (almost-factorial identity)))
;; We can keep going, and define functions which can
;; compute factorials up to any particular limit...
(define factorial2 (almost-factorial factorial1))
(define factorial3 (almost-factorial factorial2))
(define factorial4 (almost-factorial factorial3))
(define factorial5 (almost-factorial factorial4))
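;; This nesting can be mechanized: iterate almost-factorial
;; N+1 times over identity. (This helper is my addition; it
;; cheats by using explicit recursion, so it only illustrates
;; the tower, it doesn't solve our problem. It assumes
;; almost-factorial's base case returns 1.)
(define make-factorial-up-to
  (lambda (n)
    (if (= n 0)
        (almost-factorial identity)
        (almost-factorial (make-factorial-up-to (- n 1))))))
;; ((make-factorial-up-to 5) 5) ==> 120
;; ((make-factorial-up-to 5) 6) ==> 0 ; past its limit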
;; And, like factorial1, these just nest almost-factorial.
;; There is type theory here, too. There, we'd say
;; that almost-factorial is a data constructor. That
;; might not help so much right now, though. I'll
;; add discussion about type theory at the end for those
;; who might be interested.
;; Does this remind anyone else of inductive proofs?
;; ;)
;; Let's see a few more examples of our factorialNs
;; in "normal form" -- not defined in terms of the
;; factorialN-1 fn that came before.
(define factorial2'
  (almost-factorial
   (almost-factorial
    (almost-factorial identity))))
(define factorial3'
  (almost-factorial
   (almost-factorial
    (almost-factorial
     (almost-factorial identity)))))
(define factorial4'
  (almost-factorial
   (almost-factorial
    (almost-factorial
     (almost-factorial
      (almost-factorial identity))))))
;; Seems like every factorialN' has N+1
;; almost-factorials tucked under it, eh?
;; So, if this were more like math, you might wonder if
;; you could just define this:
;; (define factorial-infinity
;;   (almost-factorial
;;    (almost-factorial
;;     (almost-factorial
;;      ...))))
;; We can't write it out like that in scheme.
;; However, we will do the equivalent of this.
;; factorial-infinity is basically the exact fn that
;; we want for factorial. It works on all integers
;; greater than or equal to 0. In fact, this might
;; work as a generator or as a definition of a
;; factorial list builder in a lazy language such
;; as Haskell.
;; What we have shown is that we could define an
;; infinite chain of almost-factorials that would
;; give us the factorial function.
;; Another way of saying this is that the factorial
;; function is the fixpoint of almost-factorial.
;; Paraphrased from wikipedia's entry on
;; ["Fixed point (mathematics)"][fixedpoint]:
;; In math, a fixed point (or fixpoint) of a function
;; is an element of the function's domain that is
;; mapped to itself by the function.
;; So, the fixpoint for the function f(x) would be
;; c if f(c) = c. That also means that:
;; f(f(...f(c)...)) = f^n(c) = c.
;; When recursively computing f, c is an important
;; terminating consideration.
;; Basically, if f(c) is anywhere in a nested composition
;; of f(),
;; f(f(c))
;; Then, you'll get c. Makes sense, right?
;; Another example:
;; f(x) = x^2 - 3x + 4
;; f(2) = 2
;; So, c = 2 here. 2 is a fixpoint of f.
(define alg-fixpoint-example
  (lambda (x)
    (+ (- (expt x 2) (* 3 x)) 4)))
;; (alg-fixpoint-example 2)
;; ==> 2
;; Aaaand:
;; (alg-fixpoint-example (alg-fixpoint-example 2))
;; ==> 2
;; Another example is cos(x) = x. If you take a calculator
;; and start with 0 and just keep hitting the cos key,
;; you'll eventually converge on a number that won't change
;; no matter how many times you hit the cos key because
;; cos(x) = x. That's the fixpoint of the cos() function.
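;; You can replay that calculator experiment in scheme.
;; (This helper is my addition.) Iterating cos from 0
;; converges on the so-called Dottie number:
(define iterate-cos
  (lambda (x n)
    (if (= n 0)
        x
        (iterate-cos (cos x) (- n 1)))))
;; (iterate-cos 0 100) ==> ~0.739085, and applying cos
;; once more doesn't change it: that's the fixpoint of cos.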
;; Perhaps you've heard the Y combinator referred to as
;; the "fixpoint combinator" before. No? Well, that's
;; another name for the Y combinator.
;; So, if the fixpoint of a function f is
;; c = f(c), then:
;; f :: a -> a
;; The fixpoint function takes the same type. For the
;; examples above, that type was Real Numbers.
;; But, this is a universal property of all functions!
;; As long as the function that generates the fixpoint
;; can take the same type of "thing" as input as it
;; produces as output, then that "thing" can be anything.
;; And, if you're doing higher-order functions, why
;; shouldn't it be the case that fixpoints could be
;; functions themselves?
;; If you have a higher-order function like
;; almost-factorial that takes in a function as its input
;; and produces a function as its output, then it
;; should be possible to compute its fixpoint,
;; which is the same type as its input and output.
;; And, since we're at a higher order, the input and output
;; are functions! Namely, functions that take an integer
;; as input and produce an integer as output.
;; For cos, the input was just a real number; for
;; almost-factorial, the input is a function that takes
;; a single integer argument as input and produces a
;; single integer as output.
;; In fancy talk, or haskell, our higher-order
;; function signature looks like this:
;; myfp :: (Int -> Int) -> (Int -> Int)
;; and cos would look something like this:
;; cos :: Real -> Real
;; And, a useless parametrically polymorphic fn that
;; would describe this for any type would look like:
;; fromAtoA :: a -> a
;; (That happens to be the signature for identity...)
;; So, much like the fixpoint for f(x) is c where
;; c = f(c), the fixpoint for (almost-factorial f) is
;; c = (almost-factorial c) or:
;; factorial = (almost-factorial factorial)
;; Amazing.
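;; You can check the fixpoint property concretely with the
;; recursive fibonacci from the top of this file: wrapping
;; it in almost-fibonacci hands back a function that behaves
;; exactly like fibonacci. (This check is my addition.)
(define fibonacci-wrapped (almost-fibonacci fibonacci))
;; (fibonacci-wrapped 10) ==> 55, same as (fibonacci 10)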
;; To say it in even another way, but the simplest,
;; closest-to-the-language-of-the-mind-kinda-way,
;; the closest-to-math way:
;; fixpoint-function = (almost-factorial
;; fixpoint-function)
;; But, that's a recursive definition, and we can
;; understand it, but in the world of our computer language,
;; for the sake of reaching Y,
;; Recursion is not allowed!
;; So, that's understandable, but knowing that factorial
;; is the fixpoint of almost-factorial doesn't tell us
;; how to compute it.
;; But, wouldn't it be nice if there was some
;; magical higher-order function that would take as its
;; input a function like almost-factorial, and would
;; output its fixpoint function, which in this case would
;; be factorial? Wouldn't that be boss?
;; Well, that higher-order fixpoint function-finding
;; function exists, and it is called The Y Combinator!
;; Also known as the "fixpoint combinator:" it takes in
;; a function and returns the fixpoint of that function.
;; Oooo Ahhhh.
;; You should now basically understand what Y does.
;; You don't need to read the rest of this unless
;; you're interested in how to make Y.
;; Still here, huh? Alright, so, let's get Y.
;; We'll start with the lazy / recursive version
;; because it's the easiest to understand.
;; When you want to write a function, it's good
;; to just start with, "What does the function do?"
;; Here's what Y does:
;; (Y f) = fixpoint-of-f
;; Because, as we said above, we want a magic fn, Y,
;; that can tell us the fixpoint of any fn f.
;; Okay, what do we know about the fixpoint of f?
;; Well, we know that
;; (f fixpoint-of-f) = fixpoint-of-f
;; In other words, we know that applying our function,
;; f, to the fixpoint-of-f will give us back the
;; fixpoint-of-f. e.g., f(c) = c, or cos(x) = x.
;; Remember?
;; And, we know that we want (Y f) to give us the
;; fixpoint of f:
;; (Y f) = fixpoint-of-f
;; And, again, we know that the fixpoint-of-f is the same
;; as (f fixpoint-of-f), so we can say:
;; (Y f) = (f fixpoint-of-f)
;; or
;; (Y f) = fixpoint-of-f = (f fixpoint-of-f)
;; or
;; (Y f) = (f (Y f))
;; Whoa, nelly. We just defined Y in terms of itself.
;; Doesn't it look like what we were doing above with
;; ol' factorial = (almost-factorial factorial) and
;; fixpoint-function = (almost-factorial
;; fixpoint-function)?
;; Remember, almost-factorial's fixpoint function is
;; factorial!
;; In lazy Scheme, like [racket][] with #lang lazy, this
;; will actually work!
;; (define Y
;;   (lambda (f)
;;     (f (Y f))))
;; (define almost-factorial
;;   (lambda (f)
;;     (lambda (n)
;;       (if (= n 0)
;;           1
;;           (* n (f (- n 1)))))))
;; (define factorial (Y almost-factorial))
;; [racket]:
;; 💥 amirite?
;; NOTE: I have commented out the lazy defs because
;; they will crash your non-lazy scheme ;).
;; But, this isn't what we're really after.
;; * This will only work in a lazy language,
;; * it is (explicitly) recursive,
;; * and it's not a combinator.
;; Y in the body is a free variable.
;; It isn't bound to anything. So, it's not really
;; "The Y Combinator" Psych! sry. At least now your
;; factorial function isn't recursive, though!
;; "Refactoring", right? Move the problem stuff
;; out to somewhere else and let some other jerk
;; deal with it, even if that other jerk happens to
;; be future you.
;; (define future-you (Y almost-you))
;; just kidding. We're not going to do that as an
;; example.
;; Seriously, though, if you were building a language,
;; you would only need to make one function explicitly
;; recursive, and that would be your Y. It's just not
;; technically a combinator.
;; "A combinator is a lambda expression with no free
;; variables."
;; So,
(define not-comby
  (lambda (x)
    y))
;; not-comby is not a combinator because y isn't bound to anything.
;; It's free. Like if you weren't bound to a 9-to-5.
;; This is sometimes called lexical scoping.
;; Scheme will let you define that because it doesn't
;; try to tell you what to do all the time like your boss
;; does. And, for all scheme knows, you could plop that
;; definition into a lexicon that had "y" defined before,
;; like if you did:
(define y "daylight")
;; Then, (not-comby 1) would evaluate to "daylight"
;; Don't worry. We're not done.
;; This isn't a cop-out where we're lying and just moving
;; the recursion around. It's just easier to present the
;; recursive definition, or the definition with the
;; free Y first. It's a process.
;; The reason that the recursive definition won't work,
;; even in scheme which does support recursion is due to
;; the order of evaluation. We call lazy racket and
;; Haskell lazy because they don't evaluate anything
;; until and unless they need to. Most languages,
;; including scheme, use a "strict" evaluation model.
;; That means that all of the arguments to a function
;; are evaluated before they're passed to the function.
;; So,
(not-comby (expt 2 2))
;; will run (expt 2 2) first, then apply not-comby to
;; 4 (which would throw it away and just return "daylight"
;; since we used it to show lexical scoping earlier).
;; If you load this file, you can run it to see for
;; yourself.
;; Our lazy definition of Y will crash some scheme
;; interpreters, like the BiwaScheme javascript
;; one that I'm writing this in, because
;; it will try to evaluate (Y f) like this:
;; (Y f)
;; = (f (Y f))
;; = (f (f (Y f)))
;; = (f (f (f (Y f))))
;; forever (and ever (and ever( and ever)))
;; The evaluation of (Y f) will never terminate, so
;; it's not a useful function with strict evaluation.
;; But wait!
;; When you define a lambda, it doesn't run it, right?
;; So, tricky lambda calculus and tricky scheme must
;; have some way to delay evaluation. Hmmm.
;; Lambda the Ultimate to the rescue!
;; Any language with anonymous functions delays the
;; evaluation of the function. That's one reason I've
;; been defining functions in long-hand, like:
(define verbose-definition
  (lambda (x)
    "dude, you are long-winded."))
;; Instead of using the scheme shortcut
(define (short-define-version x)
"it's still too much.")
;; I wanted it to be explicit that "define" is
;; really just naming a lambda. That lambda isn't
;; evaluated until you call it.
(verbose-definition 1)
;; ==> "dude, you are long-winded."
;; You can define a lambda and call it at the same time:
((lambda (x) "Get to the point!") 1)
;; Putting a function at the head of a list is
;; part of scheme's evaluation model. Whenever it sees
;; a list with (f ...) it will try to evaluate it,
;; and since it's strict it will apply it/evaluate it/
;; collapse it before evaluating the outer fn.
(verbose-definition ((lambda (x) 1) 1))
;; ==> "dude, you are long-winded."
(verbose-definition (short-define-version 1))
;; ==> "dude, you are long-winded."
;; So, how might we delay the evaluation of (Y f)?
;; Lambda the Ultimate to the rescue!
;; (Y f) is going to become a function of one argument.
;; So this is a legal equality:
;; (Y f) = (lambda (x) ((Y f) x))
;; Whatever one-argument function (Y f) is,
;; (lambda (x) ((Y f) x)) has to be the same function, but
;; it's delayed. It's only after you call it at the head
;; of a list and pass an argument to it when it will be
;; evaluated. So, when you call it, it will bind your
;; argument to 'x', and then call ((Y f) x), almost as
;; if you called ((Y f) x) by itself. Not almost, exactly
;; as if you called ((Y f) x) by itself, so you're
;; delaying the application until you give it another x.
;; In a similar way, this would be true:
;; cos = (lambda (x) (cos x))
(define mycos
  (lambda (x)
    (cos x)))
;; Now you have a (cos x) that you can run later with
;; (mycos 1), and scheme will pass the 1 to mycos and then
;; evaluate (cos 1).
;; TODO: Should I show nested lambdas to show that
;; you can nest the delay? Like return a lambda that
;; waits for another argument before finally running
;; the inner function?
;; So, here is Y defined in standard scheme:
(define Y
  (lambda (f)
    (f (lambda (x) ((Y f) x)))))
;; And, here I didn't comment it out because it shouldn't
;; infinitely regress and break your interpreter session.
;; And you could define factorial now!
(define factorial-y
  (Y almost-factorial))
;; Expanding out the call to Y, we have:
(define factorial-expanded-y1
  ((lambda (f)
     (f (lambda (x)
          ((Y f) x))))
   almost-factorial))
(define factorial-expanded-y2
  (lambda (x)
    ((Y almost-factorial) x)))
(define factorial-expanded-y3
  (lambda (n)
    (if (= n 0)
        1
        (* n ((lambda (x)
                ((Y almost-factorial) x))
              (- n 1))))))
;; Here, I'm doing the work of the interpreter in
;; several steps. I'm showing what the defines, etc.
;; will be replaced with. I named them differently to
;; show that they're the same and each one *should*
;; work the same way. So, you can run:
;; (factorial-expanded-y3 0)
;; ==> 1
;; (factorial-expanded-y1 4)
;; ==> 24
;; etc.
;; Now, with the power of lambda, this version of Y
;; will work even in a strict language. And, the version
;; of factorial will work in a strict langauge. Hell,
;; we can even define that ol' fibonacci with our new
;; Y!
(define fibonacci-y
  (Y almost-fibonacci))
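;; And Y works for any "almost" function, not just these two.
;; For instance (my addition), a sum-from-1-to-n built the
;; same way:
(define almost-sum-to
  (lambda (f)
    (lambda (n)
      (if (= n 0)
          0
          (+ n (f (- n 1)))))))
(define sum-to-y (Y almost-sum-to))
;; (sum-to-y 10) ==> 55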
;; Just like the original tutorial says, this part was
;; non-trivial, so please don't be discouraged if you
;; don't get it right away. Just sleep on it, play with
;; it in your mind, and now you can play with it in
;; an interpreter easily since I went to the trouble
;; of making this file!
;; But, maybe you are a little disappointed. There's
;; still a little fib in this definition.
;; This is a working Y, but something is weird...
;; Ya got me! It's still not a combinator!
;; Since, we really just hacked the lazy definition of
;; Y by using lambda to delay its evaluation,
;; our Y is still defined in terms of Y.
;; Y refers to itself, so our definition is
;; explicitly recursive. By definition, it is
;; not a combinator. A combinator isn't allowed to
;; be explicitly recursive; it has to be a lambda
;; expression with no free variables.
;; (define Y
;;   (lambda (f)
;;     (f (Y f))))
;; That just ain't gonna cut it.
;; Another way to think about this is that you should
;; be able to replace the name of a combinator with
;; its definition everywhere that it's found and have
;; everything still work. If you tried to do that, say,
;; with string replacement, with our definition of Y,
;; you would hit an infinite loop again!
;; Whatever our final definition of the Y combinator will
;; be, it will not be explicitly recursive. From this
;; non-recursive function, we will then be able to define
;; whatever recursive functions we want without actually
;; implementing recursion. That's the ticket we were after,
;; right?
;; We want to define a version of factorial without
;; explicit recursion. One way to do that is to pass
;; some version of the factorial function itself as an
;; extra argument when you call the function.
;; This won't work yet:
(define not-quite-part-factorial
  (lambda (self n)
    (if (= n 0)
        1
        (* n (self (- n 1))))))
;; Note that this not-quite-part-factorial is not the
;; same as the almost-factorial I described previously.
;; We'd have to call this in a different way to compute
;; factorials:
;; (not-quite-part-factorial
;; not-quite-part-factorial 5)
;; This is not explicitly recursive because we are sending
;; in a copy of the not-quite-part-factorial function as
;; the self argument. However, it won't work unless the
;; point of "recursion" calls the function in the exact
;; same way:
(define almost-part-factorial
  (lambda (self n)
    (if (= n 0)
        1
        (* n (self self (- n 1)))))) ;; note the extra self
;; Now, this will work:
;; (almost-part-factorial almost-part-factorial 5)
;; ==> 120
;; But, that's a weird way to call it. Yuck.
;; Lambda the Ultimate to the rescue!
;; Let's abstract our self into a nested lambda that
;; returns a function that takes 'n' so that it's
;; more like calling our original factorial function.
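;; A sketch of that next step (this completion is my
;; addition; the gist leaves off here): self moves into an
;; outer lambda, and the self-application happens inside,
;; so the inner function takes just 'n' like the original.
(define part-factorial
  (lambda (self)
    (lambda (n)
      (if (= n 0)
          1
          (* n ((self self) (- n 1)))))))
;; ((part-factorial part-factorial) 5) ==> 120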