@MostAwesomeDude
Last active April 7, 2023 06:05
Colored Functions and Monadic Effects

I wanted to make a brief note about colored functions and monadic effects.

Mothers of Monads

A mother of monads is a codensity monad. Category theorists have a slogan: "whenever you meet a functor, ask what its codensity monad is."

Codensity monads correspond to types of continuations. One example is given by the basic double-negation translation; instead of values of type X, we have values of type (X → F(R)) → F(R) for some family of types R and some endofunctor F carrying a monadic effect. Given a callback which finalizes X into a chosen R, we may treat the callback as a continuation and invoke it. This is the original mother of all monads, since double-negation translation is almost always available. But there are other types of continuations. If we use single-shot delimited continuations, then the same monadic effects produce different results. Whenever we meet a monadic effect carried by an endofunctor, ask what its continuation is.
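The double-negation translation above can be sketched in a few lines. This is a minimal continuation-monad sketch in Python, with F taken to be the identity for simplicity; the names `unit`, `bind`, and `prog` are illustrative, not standard.

```python
# A "computation producing X" is a function taking a callback (X -> R)
# and returning R, i.e. a value of type (X -> R) -> R.

def unit(x):
    """Wrap a plain value: invoke the continuation immediately."""
    return lambda k: k(x)

def bind(m, f):
    """Sequence: run m, feed its value to f, run the resulting computation."""
    return lambda k: m(lambda x: f(x)(k))

# Running a computation means supplying a final continuation choosing R.
prog = bind(unit(2), lambda x: unit(x * 3))
result = prog(lambda x: x)  # identity continuation finalizes X into R = X
# result == 6
```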

Asynchronous Monads

Suppose that we asynchronously perform some action and then invoke a callback. The callback acts as a continuation of the action. This generates a codensity monad which can host effects. If it is possible to wait or block for an asynchronous action, then there is a lowering operation from the asynchronous codensity monad to the double-negation codensity monad. The effects are still performed with asynchronous continuations, though. This is the sense in which futures, deferreds, promises, and other asynchronous values are elements of a monad; they are available through callbacks, just like values in any other codensity monad.

But if it is not possible to wait for asynchronous actions, then we should probably prefer embedding the plain double-negation monad into the asynchronous monad. In practice, it seems that the effects hosted in the plain codensity monad can be performed asynchronously; it's not clear to me whether there's an underlying natural transformation which makes this possible.

The Color Analogy

We are now ready to deconstruct the analogy of colored functions. The original observation was that asynchronous actions (in ECMAScript) are not "in" the same codensity monad as plain functions; they must be awaited or otherwise lowered out of the asynchronous context. To each codensity monad, a color is assigned; the original article assigns blue to the plain calling convention, and red to the asynchronous convention.

However, the analogy is incomplete: it confuses monadic effects with their underlying codensity monads, and it fails to privilege whichever calling convention is used at the top level of the programming interface. This means that, when writing in ECMAScript or other languages with built-in concurrency, asynchronous actions are not colored red but colorless. To restore monadic effects to the analogy, we should tint the colors; let the double-negation monad be blue, and then perhaps a synchronous non-determinism effect is dark blue, while a synchronous error-handling effect is light blue.
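The two "tints" can be made concrete by varying F in the double-negation shape (X → F(R)) → F(R). In this illustrative Python sketch, taking F(R) to be a list gives non-determinism (dark blue), while taking F(R) to be a tagged value-or-error gives error handling (light blue); all names are made up.

```python
def choose(xs):
    """Non-determinism: try every alternative, concatenating results.
    Continuations return lists, i.e. F(R) = list."""
    return lambda k: [r for x in xs for r in k(x)]

def throw(err):
    """Error handling: discard the continuation and return the error.
    Continuations return tagged pairs, i.e. F(R) = ok-or-error."""
    return lambda k: ("error", err)

def ok(x):
    return lambda k: k(x)

# Dark blue: each continuation returns a list of results.
nondet = choose([1, 2, 3])(lambda x: [x * 10])   # [10, 20, 30]

# Light blue: continuations return ("ok", r); throw short-circuits.
failing = throw("boom")(lambda x: ("ok", x))     # ("error", "boom")
succeeding = ok(5)(lambda x: ("ok", x))          # ("ok", 5)
```

Both effects share the same continuation-passing shape; only the choice of F differs, which is exactly what the tinting metaphor captures.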

Portable Monadic Effects

When viewed through the reconstructed analogy, monad transformers are recharacterized as effects which are portable to different codensity monads; they are tints without a base color. An important lesson for language designers is to take a desired composite effect and break it apart so that its codensity monad can be cleanly separated from other effects.
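Portability can be sketched by writing an effect once against an abstract base monad, so the same layer works over the plain continuation monad, an asynchronous one, or anything else. This is a rough Python analogue of an `ExceptT`-style transformer; `except_layer` and its parameters are illustrative names.

```python
def except_layer(base_unit, base_bind):
    """Build error-handling operations on top of any base monad,
    given only that monad's unit and bind."""
    unit = lambda x: base_unit(("ok", x))
    def bind(m, f):
        def step(r):
            tag, payload = r
            return f(payload) if tag == "ok" else base_unit(r)
        return base_bind(m, step)
    throw = lambda e: base_unit(("err", e))
    return unit, bind, throw

# Instantiate over the identity monad as the simplest base.
id_unit = lambda x: x
id_bind = lambda m, f: f(m)
unit, bind, throw = except_layer(id_unit, id_bind)

ok_run = bind(unit(1), lambda x: unit(x + 1))     # ("ok", 2)
err_run = bind(throw("boom"), lambda x: unit(x))  # ("err", "boom")
```

The error layer never inspects the base monad, so the same code could be instantiated over a callback-based asynchronous base instead of the identity base.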

This can be justified somewhat by noting that codensity monads are pushouts of identity monads along some functor. Take an identity monad and push it along some effect-carrying functor, and the effects get rolled in, creating a codensity monad.

@chrisdone

Some concrete examples might ground this, as at the moment I don't understand what point you're trying to make. It reads like Markov-generated prose, at least to me.

@MostAwesomeDude
Author

Sure. To compute a codensity monad for a functor, first we show that the functor is right adjoint to some other functor; then we compose them, obtaining a monad. As shown on nLab, this is the recipe for codensity monads.

For example, let's imagine promises as a functor. Given some synchronous/near value, the promise functor creates an asynchronous context which allows callbacks to transform the value but doesn't ever directly produce it. To find its left adjoint, ponder the target category; where the source language eagerly computes with values, the target language instead has an event subsystem which dispatches callbacks as values become available. In short, promises are a functor from classical machines to event loops. Now, if this is right adjoint, then it should "forget" something, and its left adjoint should "freely" equip values with that forgotten thing. Since the left adjoint sends event loops to classical machines, it should probably transform event-driven chains of promises to machines which evaluate those chains and eventually (after being fed enough events) produce values. We "freely" get to evaluate an event-driven machine, and then "forget" that the events are discrete and asynchronous.

Now let's compose these functors. We have a composite functor from event loops to event loops. It takes chains of promises, evaluates those chains to produce values, then wraps those values into promises again. Simplifying, it takes chains of promises and returns promises for the final value in the chain, wrapping any non-promise values with promises. This is our codensity monad, our primal effect, and our source of continuations.
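The "chains of promises collapse to a promise for the final value" step is the monad's join. Here is a minimal callback-style sketch in Python, where a "promise" is just a function that takes a callback; `resolved` and `flatten` are illustrative names.

```python
def resolved(x):
    """A promise that immediately delivers x to its callback."""
    return lambda cb: cb(x)

def flatten(promise_of_promise):
    """Collapse a promise yielding a promise into a single promise
    for the inner value (the monadic join)."""
    return lambda cb: promise_of_promise(lambda inner: inner(cb))

chain = resolved(resolved(7))   # a promise for a promise for 7
results = []
flatten(chain)(results.append)  # delivers the final value 7
# results == [7]
```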

Finally, let's describe a common mistake, so that we don't make it. If we do the composition in the wrong way, then we get the density comonad. In our example, this would be the comonad that takes near eagerly-computed values to themselves, calling an event loop as a subroutine when needed and waiting for it to finish before returning results. This is the experience of a human writing code at an interactive prompt! We put ourselves on the wrong side of the screen. If we want to think like the code, then we need to dualize the construction, going back to the codensity monad instead. (To double-check that we really did get a comonad, note that if we have a value, then we can wait for that value; and also that if we wait for a value, then we can instead first wait and then wait some more for that value.)

Note that if you don't feel that the first section of the note is meaningful, then you might not grok codensity monads. Don't feel bad about this; it has apparently taken me a decade to process it. The rest of the note might well be Markov-generated; it certainly does not feel cogent to me.
