Differentiable Programming - Yann LeCun
From [here](https://www.facebook.com/yann.lecun/posts/10155003011462143):
**Yann LeCun**
*5 January*
OK, Deep Learning has outlived its usefulness as a buzz-phrase. Deep Learning is dead. Long live Differentiable Programming!
Yeah, Differentiable Programming is little more than a rebranding of the modern collection of Deep Learning techniques, the same way Deep Learning was a rebranding of the modern incarnations of neural nets with more than two layers.
But the important point is that people are now building a new kind of software by assembling networks of parameterized functional blocks and by training them from examples using some form of gradient-based optimization.
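As a concrete illustration of that idea (not from the post itself), here is a minimal sketch in PyTorch: two learnable blocks assembled into a network and trained by gradient descent. The architecture and the toy task are invented for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Two parameterized functional blocks assembled into one network.
model = nn.Sequential(
    nn.Linear(4, 8),  # block 1: learnable affine map
    nn.Tanh(),        # fixed nonlinearity between the blocks
    nn.Linear(8, 1),  # block 2: learnable affine map
)

opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(32, 4)          # toy training examples
y = x.sum(dim=1, keepdim=True)  # toy targets: sum of the features

for step in range(200):
    loss = F.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()  # automatic differentiation through all blocks
    opt.step()       # one gradient-based optimization step
```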
An increasingly large number of people are defining the networks procedurally in a data-dependent way (with loops and conditionals), allowing them to change dynamically as a function of the input data fed to them. It's really very much like a regular program, except it's parameterized, automatically differentiated, and trainable/optimizable. Dynamic networks have become increasingly popular (particularly for NLP), thanks to deep learning frameworks that can handle them such as PyTorch and Chainer (note: our old deep learning framework Lush could handle a particular kind of dynamic net called a Graph Transformer Network, back in 1994. It was needed for text recognition).
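A minimal sketch of such a dynamic network, assuming PyTorch: the compute graph is built by an ordinary Python loop and conditional, so its shape depends on the input's sequence length. The `DynamicNet` class and the toy loss are hypothetical, for illustration only.

```python
import torch
import torch.nn as nn

class DynamicNet(nn.Module):
    """Network whose graph depends on the input: the recurrent block
    is applied once per element of a variable-length sequence."""
    def __init__(self, dim):
        super().__init__()
        self.cell = nn.Linear(2 * dim, dim)
        self.out = nn.Linear(dim, 1)

    def forward(self, seq):  # seq: (length, dim), length varies per call
        h = torch.zeros(seq.shape[1])
        for x in seq:                          # ordinary Python loop
            h = torch.tanh(self.cell(torch.cat([x, h])))
            if h.norm() > 10:                  # ordinary Python conditional
                h = h / h.norm()
        return self.out(h)

net = DynamicNet(dim=16)
loss = net(torch.randn(5, 16)).pow(2).sum()  # a 5-step graph on this call
loss.backward()                              # gradients flow through the loop
```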
People are now actively working on compilers for imperative differentiable programming languages. This is a very exciting avenue for the development of learning-based AI.
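One later example of this direction (JAX was released after this post was written) is tracing an imperative Python function, differentiating it, and compiling the result. A minimal sketch; the function and data here are invented:

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Imperative-looking Python; the tracer turns it into a
    # differentiable, compilable computation.
    pred = jnp.tanh(x @ w)
    return jnp.mean((pred - y) ** 2)

grad_loss = jax.jit(jax.grad(loss))  # compiled gradient function

w = jnp.zeros(3)
x = jnp.ones((8, 3))
y = jnp.ones(8)
print(grad_loss(w, x, y))            # gradient of the loss w.r.t. w
```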
Important note: this won't be sufficient to take us to "true" AI. Other concepts will be needed for that, such as what I used to call predictive learning and now decided to call Imputative Learning. More on this later....