(See discussion on Hacker News)
You can now differentiate (almost[1]) any differentiable hyperbolic, polynomial, exponential, and/or trigonometric function.
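The REPL session below uses a `Dual` type (constructor `D`), a `diff` function, and a `Float'` synonym from the gist's source. Here is a minimal self-contained sketch of what that machinery can look like; note that the gist's actual code follows Elliott's `VectorSpace`-based formulation, so its definitions differ in detail:

```haskell
-- A dual number D x x' pairs a value with its derivative (the "tangent").
data Dual a = D a a deriving Show

type Float' = Float -- as in the gist, Float' is a synonym for Float

instance Num a => Num (Dual a) where
  D x x' + D y y' = D (x + y) (x' + y')
  D x x' - D y y' = D (x - y) (x' - y')
  D x x' * D y y' = D (x * y) (x' * y + x * y') -- product rule
  negate (D x x') = D (negate x) (negate x')
  fromInteger n   = D (fromInteger n) 0         -- constants have derivative 0
  abs    (D x x') = D (abs x) (x' * signum x)
  signum (D x _)  = D (signum x) 0

instance Fractional a => Fractional (Dual a) where
  D x x' / D y y' = D (x / y) ((x' * y - x * y') / (y * y)) -- quotient rule
  fromRational r  = D (fromRational r) 0

instance Floating a => Floating (Dual a) where
  pi            = D pi 0
  exp  (D x x') = D (exp x) (x' * exp x)
  log  (D x x') = D (log x) (x' / x)
  sin  (D x x') = D (sin x) (x' * cos x)
  cos  (D x x') = D (cos x) (negate x' * sin x)
  sinh (D x x') = D (sinh x) (x' * cosh x)
  cosh (D x x') = D (cosh x) (x' * sinh x)
  tanh (D x x') = D (tanh x) (x' * (1 - tanh x ^ 2))
  asin (D x x') = D (asin x) (x' / sqrt (1 - x * x))
  acos (D x x') = D (acos x) (negate x' / sqrt (1 - x * x))
  atan (D x x') = D (atan x) (x' / (1 + x * x))
  asinh = error "not implemented" -- the inverse hyperbolic
  acosh = error "not implemented" -- functions are missing,
  atanh = error "not implemented" -- see footnote 1

-- df/dx at x: seed the tangent with 1 and read it off the result.
diff :: (Dual Float' -> Dual Float') -> Float' -> Float'
diff f x = let D _ x' = f (D x 1) in x'
```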
Let's use the polynomial:

```haskell
λ> f x = 2 * x^3 + 3 * x^2 + 4 * x + 2 -- our polynomial
λ> f 10
2342
λ> diff f 10 -- evaluate df/dx at x = 10
664.0
λ> 2*3 * 10^2 + 3*2 * 10 + 4 -- verify the derivative at 10
664
```
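Under the hood, `diff f 10` simply evaluates `f` at the dual number `D 10 1`: the value rides along in the first slot while the derivative accumulates in the second. Assuming the sketch above, this should print something like:

```haskell
λ> f (D 10 1) :: Dual Float'
D 2342.0 664.0
```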
We can also compose functions:
```haskell
λ> f x = 2 * x^2 + 3 * x + 5
λ> f2 = tanh . exp . sin . f
λ> f2 0.25
0.5865376368439258
λ> diff f2 0.25
1.6192873
```
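The result checks out against the chain rule applied by hand (a quick sketch using the definitions of `f` and `f2` above):

```haskell
-- f2 = tanh . exp . sin . f, so by the chain rule:
-- f2'(x) = f'(x) * cos (f x) * exp (sin (f x)) * (1 - tanh (exp (sin (f x)))^2)
chainRuleCheck :: Float
chainRuleCheck =
  let x  = 0.25
      fx = 2 * x ^ 2 + 3 * x + 5 -- f x
      f' = 4 * x + 3             -- derivative of f at x
      u  = exp (sin fx)
  in f' * cos fx * u * (1 - tanh u ^ 2) -- ≈ 1.6192873, matching diff f2 0.25
```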
We can also differentiate higher-dimensional functions by calling them with dual numbers directly, which is exactly what `diff` does internally:
```haskell
λ> f x y z = 2 * x^2 + 3 * y + sin z -- f: R^3 -> R
λ> f (D 3 1) (D 4 1) (D 5 1) :: Dual Float' -- call `f` with dual numbers, derivatives set to 1
D 29.041077 15.283662

λ> f x y z = (2 * x^2, 3 * y + sin z) -- f: R^3 -> R^2
λ> f (D 3 1) (D 4 1) (D 5 1) :: (Dual Float', Dual Float')
(D 18.0 12.0,D 11.041076 3.2836623)
```
Or get partial derivatives by passing dual numbers only for the inputs whose sensitivities we want:
```haskell
λ> f x y z = 2 * x^2 + 3 * y + sin z -- f: R^3 -> R
λ> f (D 3 1) 4 5 :: Dual Float'
D 29.041077 12.0
```
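As an aside, seeding every tangent with 1 (as in the R^3 -> R example earlier) computes a directional derivative along (1, 1, 1); note that 15.283662 = 12 + 3 + cos 5. Seeding one input at a time, as just shown, isolates a single partial, and doing it once per input recovers the whole gradient (or, for vector-valued functions, the Jacobian column by column). A sketch, assuming the `Dual` definitions from earlier:

```haskell
-- The full gradient of f at (3,4,5): one forward pass per seeded input.
gradient :: (Float', Float', Float')
gradient =
  let f x y z = 2 * x ^ 2 + 3 * y + sin z
      tangent (D _ t) = t
  in ( tangent (f (D 3 1) 4 5)   -- df/dx = 4x    = 12.0
     , tangent (f 3 (D 4 1) 5)   -- df/dy = 3     = 3.0
     , tangent (f 3 4 (D 5 1)) ) -- df/dz = cos z ≈ 0.28366
```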
If you want to learn more about how this works, read the paper by Conal M. Elliott[2], watch the talk titled "Provably correct, asymptotically efficient, higher-order reverse-mode automatic differentiation" by Simon Peyton Jones himself[3], or read the paper[4] of the same name.
There's also a package named [ad](https://hackage.haskell.org/package/ad) which implements this in a usable way; this gist exists merely to make the most basic form of it understandable. Additionally, there's Andrej Karpathy's [micrograd](https://github.com/karpathy/micrograd), written in Python.
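For comparison, here's roughly what the opening example looks like with the `ad` package (a sketch; assumes `ad` is installed from Hackage, whose `Numeric.AD` module exports `diff` and `grad`):

```haskell
import Numeric.AD (diff, grad)

-- the polynomial from the top of this gist
poly :: Num a => a -> a
poly x = 2 * x ^ 3 + 3 * x ^ 2 + 4 * x + 2

main :: IO ()
main = do
  print (diff poly 10) -- 664
  print (grad (\[x, y, z] -> 2 * x ^ 2 + 3 * y + sin z) [3, 4, 5 :: Double])
  -- ≈ [12.0, 3.0, 0.2837] (the last entry is cos 5)
```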
Footnotes
1. Only the inverse hyperbolic functions aren't yet implemented in the `Floating` instance.
2. <http://conal.net/papers/beautiful-differentiation/beautiful-differentiation-long.pdf>
A reader's comment on the gist (the line numbers refer to the gist's source file):

> On line 22, it seems like you should be writing `(negate u)` rather than `scale (-1) u` [in the same way that, on line 20, you write `(u + v)` rather than `(add u v)`]. The code works correctly as written, because `Float'` happens to be a type synonym for `Float`, and thus the `VectorSpace` methods automatically apply to `u` as well. But the very fact that you bothered making a different name for `Float'` suggests you want to preserve some distinction which this is ignoring.
>
> This is no big deal, of course. I'm just mentioning it in passing. Overall, this is great!