
### Re: DL and PDEs

Posted: August 27th, 2018, 11:20 am
Well, this paper has been peer-reviewed, but it's even worse (methodologically).
Imho, ML computing can develop its full potential only if it parts with the reductionist approach. The paper sounds like a first step in the right direction.
BTW, why don't computer science bros try to apply Galois connections in ML? They should be familiar with formal concept analysis. I'll ask this question on LinkedIn, where everybody (except me) is an ML expert these days.
They introduce reductionism via the back door in the form of the sparseness-inducing penalty (IMHO); I gathered this from a quick skim (nobody reads papers in detail these days except the few remaining pros like you).

### Re: DL and PDEs

Posted: August 27th, 2018, 11:24 am
Can’t we do the ML version of the heat equation here and now?
I'm in. Who else is coming on board? And who does what?
IMO if we solved the heat equation all the way through, it would clear things up.
I'm in. What's the procedure? Naive training on known solutions? Or representing differentiation using DL?
My recommendation would be to try to minimise the "training on known solutions" bit, because in general I think it won't generalise well beyond the training set.

### Re: DL and PDEs

Posted: August 27th, 2018, 12:02 pm
Some initial/brainstorming ideas on specifying and scoping the problem
One possible avenue is to examine the heat equation (later, any convection-diffusion-reaction PDE). I feel the method is ideally suited to inverse problems, e.g. computing an unknown thermal conductivity, initial condition, or boundary conditions. This is a competitor to traditional (numerical) regularization approaches and will hopefully eliminate the well-known stability issues due to the ill-posed nature of the problem.
An excellent article IMO (paging Mr. @ISayMoo) is
https://pdfs.semanticscholar.org/4fc5/c ... 72a128.pdf

It is at the very least a good baseline example, and hopefully no confusion is possible.
There's something in that paper for everyone, as many steps can be implemented in different ways. Personally, I like Appendix A, as it draws the analogy between the Method of Lines (MOL) and the Hopfield continuous-time neural network (flashback: Kirchhoff network..). It means that you can potentially use ODE solvers to solve them.
I would like to try Differential Evolution (DE); maybe someone would like to try a gradient-based method.
I suggest using this approach but instead of finding the unknown BC based on temperature samples we estimate the thermal conductivity parameter based on temperature samples.
Once an unambiguous spec is drawn up and we are all on the same page, we can start on a detailed design.
Q: do we use supervised or unsupervised learning? Is NN better than traditional approaches?
Any more on board besides Paul and myself?
Feedback welcome!
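The MOL analogy mentioned above can be made concrete in a few lines: semi-discretise in space, and the heat equation becomes a system of ODEs that any time-stepper can handle. A minimal sketch in plain NumPy (explicit Euler; the grid sizes, end time, and initial profile are illustrative choices, not taken from the paper):

```python
import numpy as np

# Heat equation u_t = a^2 u_xx on [0,1], with u = 0 at both ends.
# Method of Lines: discretise x, leaving a system of ODEs du/dt = a^2 * L u.
a2 = 1.0                     # diffusion coefficient (illustrative)
nx = 51
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]

u = np.sin(np.pi * x)        # initial temperature profile

dt = 0.4 * dx**2 / a2        # explicit-Euler stability limit is 0.5*dx^2/a^2
t_end = 0.05
steps = int(t_end / dt)

for _ in range(steps):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2
    u = u + dt * a2 * lap    # one Euler step of the ODE system

# For this initial condition the exact solution is exp(-pi^2 t) sin(pi x).
exact = np.exp(-np.pi**2 * steps * dt) * np.sin(np.pi * x)
err = np.max(np.abs(u - exact))
```

Swapping the Euler loop for a proper (possibly stiff) ODE solver is exactly the MOL/ODE-solver route Appendix A suggests, and the known closed-form solution makes this a convenient sanity check before letting any learning machinery near it.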

### Re: DL and PDEs

Posted: August 27th, 2018, 12:29 pm
Is the following a good one? (From the 'running the heat equation backward' thread.)

I have a finance problem which boils down to the following math question.
Suppose a function [$]u(x,t)[$] solves the heat equation
[$]u_{xx} = u_t[$] subject to [$]u(x,0) = f(x)[$],
where [$]f(x)[$] is the initial temperature profile.
We know that this equation can be propagated forward in [$]t[$] for as large a [$]t[$] as you like.
Now suppose you wish to propagate backwards in [$]t[$], starting from [$]t = 0[$]. The problem is to find the domain on which the solution is well defined. I anticipate that this domain depends on [$]f[$]. Any references on running the heat equation backward?
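For intuition on why backward is hard: a single mode [$]u(x,t) = e^{-k^2 t}\sin(kx)[$] propagated backward by [$]\tau[$] is multiplied by [$]e^{k^2 \tau}[$], so the backward solution exists only as long as the amplified Fourier coefficients of [$]f[$] still sum to something finite; the admissible domain does indeed depend on how fast those coefficients decay. A toy NumPy demonstration of the blow-up (grid and noise level are made-up illustration values):

```python
import numpy as np

# Backward heat: step u_t = u_xx with a NEGATIVE time step and watch a
# tiny high-frequency perturbation of the data get amplified.
nx = 101
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = -0.25 * dx**2           # backward in time

u = np.sin(np.pi * x) + 1e-12 * np.sin(50 * np.pi * x)  # smooth + tiny noise

norms = [np.max(np.abs(u))]
for _ in range(200):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[:-2] - 2 * u[1:-1] + u[2:]) / dx**2
    u = u + dt * lap
    norms.append(np.max(np.abs(u)))

# The smooth k=1 mode grows gently, but the k=50 noise mode is amplified
# by a large factor per step and quickly dominates the solution.
```

The same mechanism is why any numerical (or learned) backward solver needs regularization: round-off alone supplies the high-frequency seed.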

### Re: DL and PDEs

Posted: August 27th, 2018, 12:35 pm
> They introduce reductionism via the back door in the form of the sparseness-inducing penalty (IMHO); I gathered this from a quick skim (nobody reads papers in detail these days except the few remaining pros like you).

I cook for this man, in case you wondered why all the fawning.

I simply think that a problem like this can be disentangled using PDEs only when you have a strong prior. And when you do, you're already there.

### Re: DL and PDEs

Posted: August 27th, 2018, 1:55 pm
Whatever.
My proposal is unambiguous and consistent with the thread title. Noise just makes people mad. Write up or shut up, please!

### Re: DL and PDEs

Posted: August 27th, 2018, 5:29 pm
You're asking others for references on that? Isn't that how finance people calculate PV? But OK, I'm not interrupting your tête-à-tête with ISayMoo.

### Re: DL and PDEs

Posted: August 28th, 2018, 9:23 am
A good first case: estimate the diffusion coefficient [$]a^2[$] in

[$]\frac{\partial T}{\partial t} = a^2\left(\frac{\partial^2 T}{\partial x^2} + \frac{\partial^2 T}{\partial y^2}\right)[$]

and we can reduce the scope even more, to

[$]\frac{\partial T}{\partial t} = a^2\frac{\partial^2 T}{\partial x^2}[$]

It's also a challenge to all the papers on DL-PDE.
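A hedged sketch of how this first case could go in practice, before any DL enters: manufacture temperature samples with a known [$]a^2[$], then recover it by minimising the misfit of a finite-difference forward model (plain NumPy, brute-force search; the "true" value, grid, and noise level are invented for illustration):

```python
import numpy as np

def solve_heat(a2, u0, dx, dt, steps):
    """Explicit FD solve of u_t = a^2 u_xx with zero Dirichlet BCs."""
    u = u0.copy()
    for _ in range(steps):
        lap = np.zeros_like(u)
        lap[1:-1] = (u[:-2] - 2 * u[1:-1] + u[2:]) / dx**2
        u = u + dt * a2 * lap
    return u

nx = 41
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
u0 = np.sin(np.pi * x)

a2_true = 0.7                      # the "unknown" conductivity (illustrative)
dt = 0.2 * dx**2                   # stable for all a^2 scanned below
steps = 400

rng = np.random.default_rng(0)
samples = solve_heat(a2_true, u0, dx, dt, steps)
samples = samples + 1e-4 * rng.standard_normal(nx)   # noisy measurements

# Inverse problem: pick the a^2 whose forward solve best matches the samples.
candidates = np.linspace(0.1, 2.0, 191)
misfit = [np.sum((solve_heat(c, u0, dx, dt, steps) - samples) ** 2)
          for c in candidates]
a2_hat = candidates[int(np.argmin(misfit))]
```

The brute-force scan is exactly where DE or a gradient-based method would slot in; the forward solver stays the same.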

### Re: DL and PDEs

Posted: August 29th, 2018, 10:20 am
You've all gone awful quiet.

### Re: DL and PDEs

Posted: August 29th, 2018, 11:28 am
Why look at inverse problems first?

Are forward problems being done properly?

What about simple 1-D differentiation? What is the best way to set that up? How to train it?

### Re: DL and PDEs

Posted: August 29th, 2018, 12:49 pm
No particular reason, just trying to get a handle on it. We can do the forward training?

I suppose you mean as part of backpropagation? If yes, then I know of

1. Analytic differentiation
2. Classic divided differences
3. The complex-step method

Not a problem.

BTW why is differentiation needed? Can you give a model example to try out please?
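The complex-step method mentioned above is worth a few lines, since it sidesteps the subtractive cancellation that limits divided differences. A minimal NumPy sketch (the test function is an arbitrary choice):

```python
import numpy as np

def complex_step(f, x, h=1e-20):
    """f'(x) ~ Im(f(x + i*h)) / h -- no subtraction, so no cancellation."""
    return np.imag(f(x + 1j * h)) / h

def central_diff(f, x, h=1e-6):
    """Classic divided difference: O(h^2) truncation, but cancellation-limited."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: np.exp(x) * np.sin(x)                   # analytic test function
fprime = lambda x: np.exp(x) * (np.sin(x) + np.cos(x))

x0 = 0.7
err_cs = abs(complex_step(f, x0) - fprime(x0))
err_fd = abs(central_diff(f, x0) - fprime(x0))
```

The catch is that [$]f[$] must be analytic and implemented so it accepts complex arguments; when it is, the step can be taken absurdly small with no loss of accuracy.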

### Re: DL and PDEs

Posted: August 29th, 2018, 12:58 pm
I meant input thousands of functions with the outputs being the derivative of those functions. And then train.

Will it learn to differentiate?

Will it make mistakes?

Can it be "tricked"?

What is the best architecture?
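One way to make the "train on thousands of functions" experiment concrete without any deep-learning machinery: on a fixed grid, differentiation is a linear operator, so even ordinary least squares can "learn to differentiate" from input/output pairs. A hedged baseline sketch (NumPy; the function family and grid are made-up choices):

```python
import numpy as np

rng = np.random.default_rng(1)
nx = 64
x = np.linspace(-1.0, 1.0, nx)

def random_poly_pair():
    """A random polynomial of degree <= 5 sampled on the grid, and its derivative."""
    p = np.polynomial.Polynomial(rng.standard_normal(6))
    return p(x), p.deriv()(x)

# "Training set": thousands of (function, derivative) pairs of grid samples.
pairs = [random_poly_pair() for _ in range(2000)]
X = np.array([f for f, _ in pairs])
Y = np.array([df for _, df in pairs])

# Fit one linear map W with X @ W ~ Y; then D = W.T acts on sample vectors.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
D = W.T

# Unseen function, but inside the trained family: near-exact.
f_in = x**5 - x
err_in = np.max(np.abs(D @ f_in - (5 * x**4 - 1)))

# Outside the family: the map only "sees" the polynomial part of sin(3x).
g_out = np.sin(3 * x)
err_out = np.max(np.abs(D @ g_out - 3 * np.cos(3 * x)))
```

The out-of-family test already answers "can it be tricked?": with the minimum-norm least-squares solution, everything orthogonal to the polynomials it saw is annihilated, so [$]\sin(3x)[$] only gets its polynomial shadow differentiated. A nonlinear network would fail in subtler ways, but fail it will.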

### Re: DL and PDEs

Posted: August 29th, 2018, 1:11 pm
> thousands of functions

For example, [$]x^2[$] -> [$]2x[$]?

And then arbitrarily deeply nested expressions of arbitrary functions?

e.g. [$]\arcsin(\sin(x) + \exp(x))[$]

??

### Re: DL and PDEs

Posted: August 29th, 2018, 2:01 pm
Take every function in Abramowitz and Stegun as input, for a variety of parameter values...

Simple differentiation shouldn't require anything too deep. But how do we know that a priori? And indeed what functions to use? (That's one of the many reasons I hate numerics.)

### Re: DL and PDEs

Posted: August 29th, 2018, 3:26 pm
We've drifted from the heat equation? It's on the back-burner.

I won't be able to do all of A&S tonight, but what about 26.2.26 (derivatives of the normal pdf)?
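For the derivatives of the normal pdf, training pairs can be generated to machine precision from the recursion [$]\varphi'(x) = -x\,\varphi(x)[$], since each further derivative is just a polynomial times [$]\varphi[$]. A small NumPy sketch (the grid is an illustrative choice):

```python
import numpy as np

x = np.linspace(-4.0, 4.0, 401)
phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # standard normal pdf

# First few derivatives via the recursion phi' = -x * phi:
d1 = -x * phi
d2 = (x**2 - 1) * phi
d3 = -(x**3 - 3 * x) * phi

# Sanity-check d1 against a central difference on the same grid.
h = x[1] - x[0]
fd = (phi[2:] - phi[:-2]) / (2 * h)
err = np.max(np.abs(fd - d1[1:-1]))
```

That gives exact labelled (function, derivative) pairs for the training experiment discussed earlier in the thread, with no numerical differentiation error in the targets.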