You can now differentiate (almost[1]) any differentiable hyperbolic, polynomial, exponential, or trigonometric function that takes only one input (for now).
Let's use the polynomial f(x) = 2x³ + 3x² + 4x + 2:

```haskell
λ> f x = 2 * x^3 + 3 * x^2 + 4 * x + 2 -- our polynomial
λ> f 10
2342
λ> diff f 10 -- evaluate df/dx at x = 10
664.0
λ> 2*3 * 10^2 + 3*2 * 10 + 4 -- verify by hand: f'(x) = 6x² + 6x + 4
664
```
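The `diff` used above comes from the gist's dual-number machinery, which isn't repeated in this snippet. A minimal sketch of how forward-mode AD can be built (the `D` type and the definitions below are illustrative assumptions, not necessarily the gist's exact code) might look like:

```haskell
-- Sketch only: a dual number pairs a value with its derivative.
data D a = D a a

-- Arithmetic on dual numbers applies the usual derivative rules,
-- so the derivative is carried along with every computation.
instance Num a => Num (D a) where
  D x x' + D y y' = D (x + y) (x' + y')
  D x x' - D y y' = D (x - y) (x' - y')
  D x x' * D y y' = D (x * y) (x' * y + x * y') -- product rule
  negate (D x x') = D (negate x) (negate x')
  fromInteger n   = D (fromInteger n) 0          -- constants have derivative 0
  abs    (D x x') = D (abs x) (x' * signum x)
  signum (D x _)  = D (signum x) 0

-- Seed the derivative with 1, evaluate f, and read off df/dx.
diff :: Num a => (D a -> D a) -> a -> a
diff f x = x' where D _ x' = f (D x 1)
```

With this sketch, `diff f 10` evaluates `f` on `D 10 1` and returns the derivative component, 664.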
We can also compose functions:

```haskell
λ> f x = 2 * x^2 + 3 * x + 5
λ> f2 = tanh . exp . sin . f
λ> f2 0.25
0.5865376368439258
λ> diff f2 0.25
1.6192873
```
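Composition works because each primitive applies the chain rule locally. Continuing the sketch above (again an assumption about how the `Fractional` and `Floating` instances could look, not the gist's verbatim code):

```haskell
instance Fractional a => Fractional (D a) where
  recip (D x x') = D (recip x) (negate x' / (x * x)) -- d(1/x) = -dx/x²
  fromRational r = D (fromRational r) 0

instance Floating a => Floating (D a) where
  pi             = D pi 0
  exp  (D x x')  = D (exp x)  (x' * exp x)           -- chain rule throughout
  log  (D x x')  = D (log x)  (x' / x)
  sin  (D x x')  = D (sin x)  (x' * cos x)
  cos  (D x x')  = D (cos x)  (negate x' * sin x)
  asin (D x x')  = D (asin x) (x' / sqrt (1 - x * x))
  acos (D x x')  = D (acos x) (negate x' / sqrt (1 - x * x))
  atan (D x x')  = D (atan x) (x' / (1 + x * x))
  sinh (D x x')  = D (sinh x) (x' * cosh x)
  cosh (D x x')  = D (cosh x) (x' * sinh x)
  -- tanh falls back to the default sinh x / cosh x; the inverse
  -- hyperbolic functions are omitted here, matching footnote 1.
```

Evaluating `diff f2 0.25` then threads `D 0.25 1` through `f`, `sin`, `exp`, and `tanh`, multiplying the local derivatives together along the way.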
If you want to learn more about how this works, read the paper by Conal M. Elliott[2], watch the talk "Provably correct, asymptotically efficient, higher-order reverse-mode automatic differentiation" by Simon Peyton Jones himself[3], or read the paper[4] of the same name.
There's also a package named `ad`, which implements this in a usable way; this gist exists merely to explain the most basic form of the technique. Additionally, there's Andrej Karpathy's micrograd, written in Python.
Footnotes

1. Only the inverse hyperbolic functions aren't yet implemented in the `Floating` instance.
2. http://conal.net/papers/beautiful-differentiation/beautiful-differentiation-long.pdf