SERVING THE QUANTITATIVE FINANCE COMMUNITY

 
Cuchulainn
Topic Author
Posts: 59215
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam

Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thing in

June 20th, 2019, 10:56 am

https://en.wikipedia.org/wiki/Different ... rogramming



Yeah, Differentiable Programming is little more than a rebranding of the modern collection of Deep Learning techniques, the same way Deep Learning was a rebranding of the modern incarnations of neural nets with more than two layers. The important point is that people are now building a new kind of software by assembling networks of parameterized functional blocks and by training them from examples using some form of gradient-based optimization… It’s really very much like a regular program, except it’s parameterized, automatically differentiated, and trainable/optimizable. An increasingly large number of people are defining the networks procedurally in a data-dependent way (with loops and conditionals), allowing them to change dynamically as a function of the input data fed to them. Dynamic networks have become increasingly popular (particularly for NLP), thanks to deep learning frameworks that can handle them such as PyTorch and Chainer (note: our old deep learning framework Lush could handle a particular kind of dynamic nets called Graph Transformer Networks, back in 1994. It was needed for text recognition). People are now actively working on compilers for imperative differentiable programming languages. This is a very exciting avenue for the development of learning-based AI. Important note: this won’t be sufficient to take us to “true” AI. Other concepts will be needed for that, such as what I used to call predictive learning and now decided to call Imputative Learning.
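LeCun's point — an ordinary program with loops and conditionals that is nonetheless differentiable — can be shown with a toy forward-mode AD sketch using dual numbers. This is purely illustrative (real frameworks like PyTorch use reverse mode over a recorded graph); all names here are made up for the example:

```python
# Toy forward-mode automatic differentiation via dual numbers.
# A "differentiable program": regular Python control flow, yet the
# derivative is propagated alongside the value.

class Dual:
    """Carries a value and its derivative w.r.t. one chosen input."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule.
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def f(x):
    # A data-dependent program: the loop count depends on the input,
    # exactly the kind of dynamic network LeCun describes.
    y = x
    n = 3 if x.val > 0 else 1
    for _ in range(n):
        y = y * x + 1.0
    return y

x = Dual(2.0, 1.0)   # seed dx/dx = 1
y = f(x)
print(y.val, y.dot)  # -> 23.0 37.0  (f(2) and f'(2) for f(x)=x^4+x^2+x+1)
```

The program is "parameterized, automatically differentiated, and trainable/optimizable" in miniature: feed it a different input and the unrolled computation (and hence the derivative) changes shape with the data.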

Hard to keep up with all them buzzwords (shelf-life not long?), what?
 
multivariatecopula
Posts: 8
Joined: December 24th, 2018, 4:36 am

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

June 20th, 2019, 5:54 pm

Could be useful in areas like computer vision or text translation. Don't see it being useful in finance where the cat's face changes while you train the cat face model.
 
Cuchulainn
Topic Author
Posts: 59215
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

June 20th, 2019, 6:36 pm

Could be useful in areas like computer vision or text translation. Don't see it being useful in finance where the cat's face changes while you train the cat face model.
I always wanted to say something like good that. Is that like saying a cat's face is not an ergodic Markov chain?
 
mtsm
Posts: 350
Joined: July 28th, 2010, 1:40 pm

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

June 21st, 2019, 11:42 am

Not sure why you'd say it's not relevant in finance. It's very relevant.

There was a huge body of work on adjoint differentiation in quant finance a fairly long time ago. Many major IBs have such features in their systems.

For once, it feels like quant finance was significantly front running the ML community on this one.

Of course, I believe that this stuff probably originated in the academic dynamical systems community a long time ago, before it was picked up elsewhere. I think so at least; I did not check.
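The adjoint (reverse-mode) idea the banks use can be sketched in a few lines. This is a hypothetical toy, not any bank's library or a real framework's API: one backward sweep yields sensitivities to *all* inputs at roughly the cost of one forward evaluation, which is why adjoint methods suit Greeks on trades with many parameters.

```python
# Toy tape-based reverse-mode ("adjoint") differentiation.
# Each Var records its parents and the local partial derivatives.

class Var:
    def __init__(self, val, parents=()):
        self.val = val
        self.parents = parents   # list of (parent_var, local_gradient)
        self.grad = 0.0

    def __add__(self, o):
        return Var(self.val + o.val, [(self, 1.0), (o, 1.0)])

    def __mul__(self, o):
        return Var(self.val * o.val, [(self, o.val), (o, self.val)])

def backward(out):
    # Topologically order the graph, then sweep in reverse so each
    # node's adjoint is fully accumulated before being propagated.
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            order.append(v)
    visit(out)
    out.grad = 1.0
    for v in reversed(order):
        for p, local in v.parents:
            p.grad += v.grad * local

# y = a*b + a  =>  dy/da = b + 1,  dy/db = a
a, b = Var(3.0), Var(4.0)
y = a * b + a
backward(y)
print(a.grad, b.grad)   # -> 5.0 3.0
```

With n inputs, forward mode needs n passes; the adjoint sweep above gets every sensitivity in one — the same economics that made "smoking adjoints" attractive for Monte Carlo Greeks.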
 
Cuchulainn
Topic Author
Posts: 59215
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

June 21st, 2019, 12:22 pm

AD is a useful trick to compute derivatives as part of some algorithm that needs them.
Leibniz (1664) was the first, and AD had been used in fluid-flow engineering for years before finance. Some recent articles on ANNs (e.g. McGhee) discard AD in favour of cubic splines, which is a bit weird. AD is in its infancy: doing it manually is error-prone and it scales badly, as I've read somewhere (Pearlmutter).

AFAIR Mike Giles introduced it to finance in 2006.

The first paper on AD was Wengert (1964):
https://dl.acm.org/citation.cfm?id=364791

And then there's the Complex Step Method which quants don't use?
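For the curious, the complex-step trick is a one-liner: for an analytic f, f'(x) ≈ Im(f(x + ih))/h. Unlike finite differences there is no subtraction of nearly equal numbers, so h can be made absurdly small without cancellation error. A minimal sketch (the test function is just an illustrative smooth example):

```python
import cmath

def f(x):
    # Sample smooth function, evaluable for real or complex arguments.
    return cmath.exp(x) / (cmath.cos(x)**3 + cmath.sin(x)**3)

def complex_step(f, x, h=1e-200):
    # No subtractive cancellation, so h can be tiny and the
    # derivative estimate is accurate to machine precision.
    return f(complex(x, h)).imag / h

def central_diff(f, x, h=1e-8):
    # Standard central difference: accuracy limited by cancellation,
    # so h cannot be pushed much below ~1e-8 in double precision.
    return (f(x + h).real - f(x - h).real) / (2 * h)

x = 1.5
print(complex_step(f, x))
print(central_diff(f, x))
```

The catch is that the whole evaluation path must accept complex arguments and be analytic — non-smooth payoffs (max, abs) need care — which may be part of why it never caught on among quants.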