SERVING THE QUANTITATIVE FINANCE COMMUNITY

ISayMoo
Posts: 2273
Joined: September 30th, 2015, 8:30 pm

### Re: DL and PDEs

Well, this paper has been peer-reviewed, but it's even worse (methodologically).
Imho, ML computing can develop its full potential only if it parts with the reductionist approach. The paper sounds like a first step in the right direction.
BTW, why don't computer science bros try to apply Galois connections in ML? They should be familiar with formal concept analysis. I'll ask this question on LinkedIn, where everybody (except me) is an ML expert these days.
They introduce reductionism via the back door in the form of the sparseness-inducing penalty (IMHO); I gathered this from a quick skim - nobody reads papers in detail these days except the few remaining pros like you.

ISayMoo
Posts: 2273
Joined: September 30th, 2015, 8:30 pm

### Re: DL and PDEs

Can’t we do the ML version of the heat equation here and now?
I'm in. Who else is coming on board? And who does what?
IMO, if we had the heat equation solved all the way through, it would clear things up.
I’m in. What’s the procedure? Naive training on known solutions? Or representing differentiation using DLs?
My recommendation would be to try to minimise the "training on known solutions" bit, because in general I think it won't generalise too well beyond the training set.

Cuchulainn
Posts: 61108
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

Some initial/brainstorming ideas on specifying and scoping the problem:
One possible avenue is to examine the heat equation (later, any convection-diffusion-reaction PDE..). I feel the method is ideally suited to inverse problems, e.g. computing unknown thermal conductivity, initial condition, boundary conditions. This is a competitor to traditional (numerical) regularization approaches and will hopefully eliminate well-known stability issues due to the ill-posed nature of the problem.
An excellent article IMO (paging Mr. @ISayMoo) is
https://pdfs.semanticscholar.org/4fc5/c ... 72a128.pdf

It is at the very least a good baseline example and no confusion is possible, hopefully.
There's something in that paper for everyone, as many steps can be implemented in different ways. Personally, I like Appendix A, as it draws the analogy between the Method of Lines (MOL) and the Hopfield continuous-time neural network (flashback: Kirchhoff networks). It means you can potentially use ODE solvers to solve them.
I would like to try Differential Evolution (DE); maybe someone would like to try a gradient-based method.
I suggest using this approach, but instead of finding the unknown BC from temperature samples, we estimate the thermal conductivity parameter from temperature samples.
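As a sketch of that MOL idea (my own minimal example, not from the paper; the grid size and solver choice are arbitrary): semi-discretise $u_t = a^2 u_{xx}$ in space and hand the resulting stiff ODE system to a stock solver.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Method of Lines for u_t = a^2 u_xx on [0, 1], u(0,t) = u(1,t) = 0.
# Space is discretised with second-order central differences; the
# resulting stiff ODE system goes to a standard implicit solver.
a2 = 1.0                       # diffusion coefficient (taken as known here)
n = 49                         # number of interior grid points
x = np.linspace(0.0, 1.0, n + 2)
h = x[1] - x[0]

def rhs(t, u):
    d2 = np.empty_like(u)
    d2[0] = u[1] - 2.0 * u[0]              # left neighbour is the zero BC
    d2[-1] = u[-2] - 2.0 * u[-1]           # right neighbour is the zero BC
    d2[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]
    return a2 * d2 / h**2

u0 = np.sin(np.pi * x[1:-1])               # initial profile f(x) = sin(pi x)
sol = solve_ivp(rhs, (0.0, 0.1), u0, method="BDF", rtol=1e-8, atol=1e-10)

# This single Fourier mode decays exactly like exp(-a^2 pi^2 t).
exact = np.exp(-a2 * np.pi**2 * 0.1) * np.sin(np.pi * x[1:-1])
err = np.max(np.abs(sol.y[:, -1] - exact))
print(f"max error at t = 0.1: {err:.2e}")
```

The same semi-discretised system is where a learned surrogate or an inverse solver would plug in.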
Once an unambiguous spec is drawn up and we are reading from the same page, we can start with a detailed design.
Q: do we use supervised or unsupervised learning? Is an NN better than traditional approaches?
Any more on board besides Paul and myself?
Feedback welcome!
http://www.datasimfinancial.com
http://www.datasim.nl

Every Time We Teach a Child Something, We Keep Him from Inventing It Himself
Jean Piaget

Cuchulainn
Posts: 61108
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

Is the following a good one (for a 'running the heat equation backward' thread)?

I have a finance problem which boils down to the following math question.
Suppose a function $u(x,t)$ solves the heat equation:
$u_{xx} = u_t$, subject to $u(x,0) = f(x)$,
where $f(x)$ is the initial temperature profile.
We know that this equation can be propagated forward in t for as large a t as you like.
Now suppose you wish to propagate backwards in t starting from t = 0. The problem is to find the domain for which the solution is well defined. I anticipate that this domain depends on f. Any references on running the heat equation backward?
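One way to make the difficulty concrete (my own illustration, not a reference): expand $f$ in Fourier modes. For $f(x) = \sin(kx)$ the solution is $u(x,t) = e^{-k^2 t}\sin(kx)$, so stepping backwards by $|t|$ multiplies mode $k$ by $e^{k^2|t|}$:

```python
import numpy as np

# For f(x) = sin(kx), the forward solution of u_t = u_xx is
# u(x,t) = exp(-k^2 t) sin(kx); running backwards by |t| multiplies
# mode k by exp(k^2 |t|), so high frequencies blow up fastest.
t_back = 0.1
for k in (1, 5, 10, 20):
    print(f"k = {k:2d}: backward amplification over t = {t_back} is "
          f"{np.exp(k**2 * t_back):.3e}")

# A perturbation of size 1e-8 in mode k = 20 of the data at t = 0
# grows to roughly exp(400 * 0.1) * 1e-8 after stepping back.
noise_out = 1e-8 * np.exp(20**2 * t_back)
print(f"1e-8 noise in mode 20 becomes {noise_out:.2e}")
```

In this picture the backward solution exists only as long as the Fourier coefficients of $f$ decay faster than $e^{-k^2|t|}$, which is exactly how the admissible domain ends up depending on $f$.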

katastrofa
Posts: 8746
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

### Re: DL and PDEs

> nobody reads papers in detail these days except the few remaining pros like you.

I cook for this man, in case you wondered why all the fawning.

I simply think that a problem like this can be disentangled using PDEs only when you have a strong prior. And when you do, you're already there.

Cuchulainn
Posts: 61108
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

Whatever.
My proposal is unambiguous and consistent with the thread title. Noise just makes people mad. Write up or shut up, por favor!

katastrofa
Posts: 8746
Joined: August 16th, 2007, 5:36 am
Location: Alpha Centauri

### Re: DL and PDEs

You're asking others for references on that? Isn't that how finance people calculate PV? But OK, I'm not interrupting your tête-à-tête with ISayMoo.

Cuchulainn
Posts: 61108
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

A good first case: estimate the diffusion coefficient $a^2$ in

$\frac{\partial T}{\partial t} = a^2\left(\frac{\partial^2 T}{\partial x^2} + \frac{\partial^2 T}{\partial y^2}\right)$

and we can reduce the scope even more to

$\frac {\partial T}{\partial t} = a^2\frac{\partial^2 T}{\partial x^2}$

It's also a challenge to all the papers on DL-PDE.
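A minimal version of that first case, assuming synthetic data generated from a known $a^2$ (a sketch, not a serious study): with $u(x,0)=\sin(\pi x)$ and zero Dirichlet BCs the exact solution is $e^{-a^2\pi^2 t}\sin(\pi x)$, so we can sample it, add noise, and let SciPy's `differential_evolution` recover $a^2$.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(42)

# Synthetic experiment: u_t = a^2 u_xx on [0,1], u(x,0) = sin(pi x),
# zero Dirichlet BCs  =>  exact solution u = exp(-a^2 pi^2 t) sin(pi x).
a2_true = 0.75
xs = np.linspace(0.1, 0.9, 9)          # sensor positions
ts = np.linspace(0.02, 0.2, 10)        # sampling times
X, T = np.meshgrid(xs, ts)

def forward(a2):
    return np.exp(-a2 * np.pi**2 * T) * np.sin(np.pi * X)

# "Measured" temperatures: exact solution plus small Gaussian noise.
samples = forward(a2_true) + 1e-3 * rng.standard_normal(X.shape)

def sse(params):
    return np.sum((forward(params[0]) - samples) ** 2)

result = differential_evolution(sse, bounds=[(0.01, 5.0)], seed=0, tol=1e-10)
a2_est = result.x[0]
print(f"true a^2 = {a2_true}, estimated a^2 = {a2_est:.4f}")
```

For the real challenge, the closed-form `forward` would be replaced by a numerical (or learned) solver, which is where the DL-PDE papers come in.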

Cuchulainn
Posts: 61108
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

You've all gone awfully quiet.

Paul
Posts: 10046
Joined: July 20th, 2001, 3:28 pm

### Re: DL and PDEs

Why look at inverse problems first?

Are forward problems being done properly?

What about simple 1-D differentiation? What is the best way to set that up? How to train it?

Cuchulainn
Posts: 61108
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

No particular reason, just trying to get a handle on it. We can do forward training too.

I suppose you mean as part of backpropagation? If yes, then I know:

1. Analytic
2. Classic divided differences
3. Complex-step method

Not a problem.
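A quick comparison of items 2 and 3 on a toy function (my own example): the complex-step formula $f'(x) \approx \mathrm{Im}\, f(x+ih)/h$ has no subtractive cancellation, so $h$ can be taken absurdly small.

```python
import numpy as np

def f(x):
    return np.exp(x) * np.sin(x)      # toy function; f'(x) = e^x (sin x + cos x)

x0 = 1.0
exact = np.exp(x0) * (np.sin(x0) + np.cos(x0))

h = 1e-8
central = (f(x0 + h) - f(x0 - h)) / (2.0 * h)    # classic divided difference
cstep = np.imag(f(x0 + 1j * 1e-200)) / 1e-200    # complex-step, no cancellation

print(f"central diff error: {abs(central - exact):.3e}")
print(f"complex-step error: {abs(cstep - exact):.3e}")
```

The central difference is stuck at the cancellation floor (around $10^{-8}$ here), while the complex step hits machine precision.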

BTW, why is differentiation needed? Can you give a model example to try out, please?

Paul
Posts: 10046
Joined: July 20th, 2001, 3:28 pm

### Re: DL and PDEs

I meant input thousands of functions with the outputs being the derivatives of those functions. And then train.

Will it learn to differentiate?

Will it make mistakes?

Can it be "tricked"?

What is the best architecture?
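One observation that makes these questions tractable (my own construction, not a deep net): on a fixed grid, differentiation of sampled functions is a *linear* map, so the simplest possible "architecture" is a single linear layer fitted by least squares. It learns to differentiate perfectly on the family it was trained on, and it can indeed be tricked:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

# Training set: random sine polynomials and their exact derivatives.
def batch(n_funcs):
    F, dF = [], []
    for _ in range(n_funcs):
        ks = rng.integers(1, 6, size=3)           # frequencies 1..5
        cs = rng.standard_normal(3)               # random coefficients
        F.append(sum(c * np.sin(k * x) for c, k in zip(cs, ks)))
        dF.append(sum(c * k * np.cos(k * x) for c, k in zip(cs, ks)))
    return np.array(F), np.array(dF)

F, dF = batch(2000)

# "Train": fit one linear map W with least squares, so that dF ~ F @ W.
W, *_ = np.linalg.lstsq(F, dF, rcond=None)

# Test on a function not in the training set (but in the trained span).
f_test = np.sin(2 * x) + 0.5 * np.sin(4 * x)
df_true = 2 * np.cos(2 * x) + 2.0 * np.cos(4 * x)
err = np.max(np.abs(f_test @ W - df_true))
print(f"max error on held-out sine combination: {err:.2e}")

# "Tricked": cosines never appeared in training, so the map fails on them.
trick_err = np.max(np.abs(np.cos(x) @ W + np.sin(x)))   # true derivative: -sin
print(f"error on a cosine input: {trick_err:.2e}")
```

The held-out sine combination is differentiated to near machine precision, while the cosine input fails completely: a toy version of the generalisation worry raised earlier in the thread.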

Cuchulainn
Posts: 61108
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

> thousands of functions

For example? $x^2 \to 2x$?

And then arbitrarily deep nested expressions of arbitrary functions?

e.g. $\arcsin(\sin(x) + \exp(x))$?

Paul
Posts: 10046
Joined: July 20th, 2001, 3:28 pm

### Re: DL and PDEs

Take every function in Abramowitz and Stegun as input, across a variety of parameter values...

Simple differentiation shouldn't require anything too deep. But how do we know that a priori? And indeed what functions to use? (That's one of the many reasons I hate numerics.)

Cuchulainn
Posts: 61108
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

We've drifted from the heat equation? It's on the back-burner.

I won't be able to do all of A&S tonight, but what about 26.2.26 (derivatives of the normal pdf)?
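Assuming the intended identity is $Z'(x) = -x\,Z(x)$ for the standard normal density $Z$, that case gives an exact target to train and test against; a quick numerical sanity check:

```python
import numpy as np

def Z(x):
    # Standard normal pdf
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

xs = np.linspace(-3.0, 3.0, 13)
h = 1e-6
central = (Z(xs + h) - Z(xs - h)) / (2.0 * h)    # numerical derivative
identity = -xs * Z(xs)                            # Z'(x) = -x Z(x)
err250 = np.max(np.abs(central - identity))
print(f"max |central - (-x Z)| = {err250:.2e}")
```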