SERVING THE QUANTITATIVE FINANCE COMMUNITY


Paul
Posts: 8727
Joined: July 20th, 2001, 3:28 pm

### Re: DL and PDEs

If you want a NN to solve the diffusion equation then it's going to have to learn differentiation. So why not train it to differentiate first? Any problems (lack of rigour, pathologies, strange behaviour,...) might become apparent in this simpler problem. And we can be mathematically brutal in trying to find issues, as Devil's Advocates.

Doing something with a normal distribution is a bit irrelevant. You need to train on as many functions as possible.
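What "train it to differentiate first" could look like is sketched below. This is my own toy construction, not anything from the thread: training functions are random combinations of sines sampled on a grid, the network input is the vector of function values, the target is the vector of derivative values, and a one-hidden-layer net is fitted by plain gradient descent. All names and sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Grid on which every training function is sampled.
n = 32
x = np.linspace(0.0, 2.0 * np.pi, n)

def make_batch(m):
    """Random trig combinations f(x) = sum_k a_k sin(k x); target is f'(x)."""
    K = 4
    a = rng.normal(size=(m, K))
    k = np.arange(1, K + 1)
    f  = (a[:, None, :] * np.sin(k * x[:, None])).sum(axis=2)       # (m, n)
    df = (a[:, None, :] * k * np.cos(k * x[:, None])).sum(axis=2)   # (m, n)
    return f, df

# One hidden layer: function values in, derivative values out.
width = 64
W1 = rng.normal(scale=0.1, size=(n, width)); b1 = np.zeros(width)
W2 = rng.normal(scale=0.1, size=(width, n)); b2 = np.zeros(n)

def forward(f):
    z = np.tanh(f @ W1 + b1)
    return z, z @ W2 + b2

def mse(pred, target):
    return np.mean((pred - target) ** 2)

# Fixed evaluation batch, to measure progress on functions not trained on.
f0, df0 = make_batch(256)
_, p0 = forward(f0)
loss0 = mse(p0, df0)

lr = 1e-2
for _ in range(1000):
    f, df = make_batch(64)
    z, pred = forward(f)
    g = 2.0 * (pred - df) / pred.size      # dL/dpred for the MSE loss
    gW2 = z.T @ g; gb2 = g.sum(axis=0)
    gz = (g @ W2.T) * (1.0 - z ** 2)       # backprop through tanh
    gW1 = f.T @ gz; gb1 = gz.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, p1 = forward(f0)
loss1 = mse(p1, df0)                       # should be below loss0
```

Even this toy exposes the Devil's Advocate questions: how the result degrades off the training family of functions, and what happens near the ends of the grid.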

Cuchulainn
Posts: 57288
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

> If you want a NN to solve the diffusion equation then it's going to have to learn differentiation. So why not train it to differentiate first?
OK, let's take this up. I'm having difficulty.
$\frac {\partial T}{\partial t} = a^2\frac{\partial^2 T}{\partial x^2}$
How to solve? What is meant by 'solve'?
What kind of 'differentiation'? In x?

Paul
Posts: 8727
Joined: July 20th, 2001, 3:28 pm

### Re: DL and PDEs

Solve means specify an initial condition and boundary conditions with a=1. But you need to train on thousands of conditions and their corresponding solutions.

But easier to start with just getting a NN to figure out d/dx.
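For reference, here is one (conditions, solution) pair of the kind such a training set would be built from: a standard explicit finite-difference (FTCS) solve of the a=1 problem with IC $T(x,0)=\sin(\pi x)$ and zero Dirichlet BCs, checked against the exact solution $e^{-\pi^2 t}\sin(\pi x)$. The grid sizes are my arbitrary choices.

```python
import numpy as np

nx = 51
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.4 * dx ** 2            # below the FTCS stability limit dt <= dx^2 / 2
nt = int(round(0.1 / dt))
t_end = nt * dt

T = np.sin(np.pi * x)         # initial condition
for _ in range(nt):
    # Explicit update of the interior; the untouched endpoints keep T = 0.
    T[1:-1] += dt / dx ** 2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

exact = np.exp(-np.pi ** 2 * t_end) * np.sin(np.pi * x)
err = np.max(np.abs(T - exact))
```

Repeating this over thousands of ICs/BCs gives the (conditions, solution) pairs; the NN is then asked to reproduce the map.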

Cuchulainn
Posts: 57288
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam
Contact:

### Re: DL and PDEs

> Solve means specify an initial condition and boundary conditions with a=1. But you need to train on thousands of conditions and their corresponding solutions.
>
> But easier to start with just getting a NN to figure out d/dx.
Thinking out loud ..
There are many solutions to a PDE, so we can define a canonical solution using a combination of elementary functions and parameters. We could then use a Hidden Markov Model to determine the nature of the input signal given the output?
For example, we can write the general solution of a system of ODEs in terms of eigen{values, vectors}, a particular integral and arbitrary constants. We use the HMM to compute the latter.
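The eigen{values, vectors} construction for a homogeneous linear system $x' = Ax$ can be sketched in numpy (an illustrative example of my own, not from the thread): diagonalize A, fit the arbitrary constants to x(0), and evolve each mode by $e^{\lambda t}$.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])             # x1' = x2, x2' = -x1: a rotation
x0 = np.array([1.0, 0.0])
t = 0.7

vals, vecs = np.linalg.eig(A)           # complex eigenpairs, lambda = +/- i
c = np.linalg.solve(vecs, x0)           # arbitrary constants fixed by x(0)
x_t = (vecs @ (c * np.exp(vals * t))).real

# Exact solution for this A and x0: x(t) = (cos t, -sin t).
expected = np.array([np.cos(t), -np.sin(t)])
```

Whether an HMM is the right machinery for inferring those constants from observed output is exactly the open question here.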
An unfounded remark is that HMM is intuitively more appealing than NN backpropagation for this class of problems. And more robust and mathematically grounded.
What do you think, Paul?