Serving the Quantitative Finance Community

 
Cuchulainn
Topic Author
Posts: 20252
Joined: July 16th, 2004, 7:38 am
Location: 20, 000

Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thing in

June 20th, 2019, 10:56 am

https://en.wikipedia.org/wiki/Different ... rogramming



Yeah, Differentiable Programming is little more than a rebranding of the modern collection of Deep Learning techniques, the same way Deep Learning was a rebranding of the modern incarnations of neural nets with more than two layers. The important point is that people are now building a new kind of software by assembling networks of parameterized functional blocks and by training them from examples using some form of gradient-based optimization. It’s really very much like a regular program, except it’s parameterized, automatically differentiated, and trainable/optimizable.

An increasingly large number of people are defining the networks procedurally in a data-dependent way (with loops and conditionals), allowing them to change dynamically as a function of the input data fed to them. Dynamic networks have become increasingly popular (particularly for NLP), thanks to deep learning frameworks that can handle them such as PyTorch and Chainer (note: our old deep learning framework Lush could handle a particular kind of dynamic nets called Graph Transformer Networks, back in 1994. It was needed for text recognition).

People are now actively working on compilers for imperative differentiable programming languages. This is a very exciting avenue for the development of learning-based AI. Important note: this won’t be sufficient to take us to “true” AI. Other concepts will be needed for that, such as what I used to call predictive learning and now decided to call Imputative Learning.
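LeCun's point that a differentiable program is "a regular program, except it's parameterized and automatically differentiated" can be sketched with forward-mode AD via dual numbers. The following is my own toy illustration (not anything from LeCun's post): a plain Python function with a data-dependent loop and branch is differentiated simply by overloading the arithmetic operators.

```python
# Minimal forward-mode AD with dual numbers: each value carries its
# derivative with respect to the input alongside it.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot  # value and derivative w.r.t. the input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def f(x):
    # a "regular program": a loop with a data-dependent branch
    y = x
    for _ in range(3):
        y = y * y if y.val < 10.0 else y + x
    return y

x = Dual(1.5, 1.0)   # seed with dx/dx = 1
y = f(x)             # for this input the branches reduce f to x**8,
                     # so y.dot is 8 * 1.5**7
```

For a different seed value the branch takes the other path and a different derivative comes out, which is exactly the "dynamic network" behaviour described above; reverse mode (backpropagation) does the same bookkeeping in the opposite direction.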

Hard to keep up with all them buzzwords (shelf-life not long?), what?
 
multivariatecopula
Posts: 10
Joined: December 24th, 2018, 4:36 am

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

June 20th, 2019, 5:54 pm

Could be useful in areas like computer vision or text translation. Don't see it being useful in finance where the cat's face changes while you train the cat face model.
 
Cuchulainn
Topic Author
Posts: 20252
Joined: July 16th, 2004, 7:38 am
Location: 20, 000

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

June 20th, 2019, 6:36 pm

Could be useful in areas like computer vision or text translation. Don't see it being useful in finance where the cat's face changes while you train the cat face model.
I always wanted to say something like that. Is that like saying a cat's face is not an ergodic Markov chain?
 
mtsm
Posts: 78
Joined: July 28th, 2010, 1:40 pm

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

June 21st, 2019, 11:42 am

Not sure why you'd say it's not relevant in finance. It's very relevant. 

There was a huge body of work done on adjoint differentiation in quant finance a fairly long time ago. Many major IBs have such features in their systems. 

For once, it feels like quant finance was significantly front-running the ML community on this one.

Of course, I believe that this stuff probably originated in the academic dynamical-systems community a long time ago, before it was picked up elsewhere. I think so at least; I did not check.
 
Cuchulainn
Topic Author
Posts: 20252
Joined: July 16th, 2004, 7:38 am
Location: 20, 000

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

June 21st, 2019, 12:22 pm

AD is a useful trick to compute derivatives as part of some algorithm that needs to compute them. 
Leibniz (1664) was the first, and AD had been used in fluid-flow engineering years before finance. Some recent articles on ANNs (e.g. McGhee) discard AD in favour of cubic splines, which is a bit weird. AD is in its infancy; doing it manually is error-prone and it scales badly, as I've read somewhere (Pearlmutter).

AFAIR Mike Giles introduced it to finance in 2006.

The first paper on AD was Wengert 1964:
https://dl.acm.org/citation.cfm?id=364791

And then there's the Complex Step Method, which quants don't use?
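For the curious, a minimal sketch of the Complex Step Method (my own illustration, not from the thread): for a real-analytic f, f'(x) = Im f(x + ih)/h + O(h²), and because no subtraction of nearly equal values occurs, h can be taken absurdly small (e.g. 1e-20) with no cancellation error. The test function below is the classic one from Squire & Trapp.

```python
import cmath

def complex_step(f, x, h=1e-20):
    # first derivative of f at real x from a single complex evaluation:
    # f'(x) ~ Im f(x + i*h) / h, accurate to machine precision for tiny h
    return f(x + 1j * h).imag / h

def f(z):
    # Squire & Trapp's classic test function
    return cmath.exp(z) / cmath.sqrt(cmath.sin(z)**3 + cmath.cos(z)**3)

d = complex_step(f, 1.5)   # f'(1.5) to near machine precision
```

By contrast, a forward difference (f(x+h) - f(x))/h in double precision loses accuracy once h drops below about 1e-8, because the subtraction cancels most of the significant digits.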
 
Cuchulainn
Topic Author
Posts: 20252
Joined: July 16th, 2004, 7:38 am
Location: 20, 000

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

September 12th, 2019, 9:42 am

AD is not necessary?
 
katastrofa
Posts: 7440
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

September 12th, 2019, 11:33 am

In what sense does AD scale badly? I use the Sacado library; it's never failed me.

BTW, you forgot that the Complex Step Method is called the Higham method these days :-)
I just can't remember who developed it: computer science or machine learning dudes...
 
Cuchulainn
Topic Author
Posts: 20252
Joined: July 16th, 2004, 7:38 am
Location: 20, 000

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

September 12th, 2019, 11:47 am

In what sense does AD scale badly? I use the Sacado library; it's never failed me.

BTW, you forgot that the Complex Step Method is called the Higham method these days :-)
I just can't remember who developed it: computer science or machine learning dudes...
Not true, neither:

1. Squire and Trapp (1998?). IMO Higham has extended it to matrices.
2. Not really: CSM was developed in aeronautical engineering etc., as Monsieur Farid has told the forum. In fairness, CS 'dudes' (as you call them) are not trained in developing new numerical schemes. An exception is 

https://en.wikipedia.org/wiki/John_G._F._Francis
 
katastrofa
Posts: 7440
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

September 12th, 2019, 12:36 pm

It's not in good style to quote oneself, but it's important to go back to the source (of the method):

katastrofa:
I don't know who "discovered" complex derivatives, but their applications in different disciplines have been popular for a while. I find it strange that any of these is attributed to some 56-year-old sapling.
"Numerical algorithms based on the theory of complex variable", Lyness, 1967 (paywall)
"Numerical Differentiation of Analytic Functions", Lyness & Moler, 1967 (PDF)
 
FaridMoussaoui
Posts: 327
Joined: June 20th, 2008, 10:05 am
Location: Genève, Genf, Ginevra, Geneva

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

September 12th, 2019, 12:56 pm

Well, Agnieszka is right. What I said is that CSM was used in large-scale computing frameworks in CFD (that was in the late 90s/early 2000s).
 
Cuchulainn
Topic Author
Posts: 20252
Joined: July 16th, 2004, 7:38 am
Location: 20, 000

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

September 12th, 2019, 1:38 pm

I find CSM is nothing like the Lyness/Moler application of the Cauchy integral formula, except that they both use complex arithmetic. Maybe I missed something.
 
FaridMoussaoui
Posts: 327
Joined: June 20th, 2008, 10:05 am
Location: Genève, Genf, Ginevra, Geneva

Re: Do you agree with Yann LeCun about Deep Learning being dead and "Differentiable programming" being the next hot thin

September 12th, 2019, 1:53 pm

Well, Lyness & Moler (1967) were the first to use complex variables to develop estimates for derivatives.
This pioneering work led Squire & Trapp (1998) to develop the complex step method for estimating the first derivative.
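To make the connection concrete: the Lyness–Moler approach evaluates f around a full circle in the complex plane and applies the trapezoidal rule to the Cauchy integral formula f⁽ᵏ⁾(a) = k!/(2πi) ∮ f(z)/(z−a)^(k+1) dz, which yields derivatives of any order, whereas CSM needs only one complex evaluation but gives only the first derivative. A rough sketch of the Cauchy-integral route (my own, assuming f is analytic in the disc of radius r):

```python
import cmath, math

def cauchy_derivative(f, a, k, r=0.5, n=64):
    # k-th derivative of analytic f at a: trapezoidal rule on the circle
    # |z - a| = r applied to the Cauchy integral formula. The trapezoidal
    # rule converges spectrally fast for periodic integrands like this one.
    s = sum(f(a + r * cmath.exp(2j * math.pi * j / n))
            * cmath.exp(-2j * math.pi * j * k / n)
            for j in range(n))
    return math.factorial(k) * s / (n * r**k)

d3 = cauchy_derivative(cmath.exp, 0.0, 3)   # third derivative of exp at 0
```

Both methods exploit analyticity, which is presumably why they get conflated; the radius r trades truncation error against roundoff, while in CSM the step h can simply be made tiny.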